Dear Reader, this book project brings to you a unique study tool for ESD
protection solutions used in analog integrated circuit (IC) design. Quick-start
learning is combined with in-depth understanding across the whole spectrum of cross-
disciplinary knowledge required to excel in the ESD field. The chapters cover
technical material from elementary semiconductor structure and device levels up
to complex analog circuit design examples and case studies.
District energy (DE) systems use central heating and/or cooling facilities to provide
heating and/or cooling services for communities. The advantages of district energy
over conventional heating and cooling include improved efficiency, reliability and
safety, reduced environmental impact, and, in many situations, better economics.
DE systems can be particularly beneficial when integrated with cogeneration plants
for electricity and heat, i.e., with combined heat and power (CHP) plants. One of
the main impediments to increased use of cogeneration-based district energy is a
lack of understanding of the behavior of integrated forms of such systems. This
book is aimed at providing information on district energy and cogeneration technologies, as well as systems that combine them.
Since the original publication of Manual 74 in 1991, and the preceding
“Guidelines for Transmission Line Structural Loading” in 1984, the
understanding of structural loadings on transmission line structures has
broadened significantly. However, improvements in computational capability have enabled the transmission line engineer to more easily determine structural loadings without properly understanding the parameters
that affect these loads. Many seasoned professionals have expressed
concern about the apparent lack of recent information on the topic of structural loadings as new engineers enter this industry. The Committee on
Electrical Transmission Structures is charged with the responsibility to
report, evaluate, and provide loading requirements of transmission structures. This task committee was therefore formed to update and revise the
1991 manual.
If one examines the current literature on GPS receiver design, most of it is quite a
bit above the level of the novice. It takes for granted that the reader is already at a fairly high level of understanding and proceeds from there. This text is an attempt to take the reader through the concepts and circuits needed to understand how a GPS receiver works, from the antenna to the solution of user position.
My association with the theory of controls in continuous time started during my studies at
the Indian Institute of Technology, Kharagpur, India, in 1974 as an undergraduate student
in the Controls and Power program. The initial introduction by Professors Kesavamurthy,
Y. P. Singh, and Rajagopalan laid the foundation for a good basic understanding of the
subject matter. This pursuit and further advanced study in the field of digital controls
continued during my days as a graduate student in the Electrical and Systems Engineering
Department at the University of Connecticut in Storrs, from 1983 to 1988.
The chief objective of Electric Machinery continues to be to build a strong
foundation in the basic principles of electromechanics and electric machinery.
Through all of its editions, the emphasis of Electric Machinery has been
on both physical insight and analytical techniques. Mastery of the material covered
will provide both the basis for understanding many real-world electric-machinery applications and the foundation for proceeding to more advanced courses in
electric machinery design and control.
Signals convey information. Systems transform signals. This book introduces the mathematical models used to design and understand both. It is intended for students interested
in developing a deep understanding of how to digitally create and manipulate signals to
measure and control the physical world and to enhance human experience and communication.
Design for manufacturability and statistical design encompass a number
of activities and areas of study spanning the integrated circuit design and
manufacturing worlds. In the early days of the planar integrated circuit, it was
typical for a handful of practitioners working on a particular design to have
a fairly complete understanding of the manufacturing process, the resulting
semiconductor active and passive devices, as well as the resulting circuit, often composed of as few as tens of devices. With the success of semiconductor scaling, predicted, and to a certain extent even driven, by Moore's law, and the vastly increased complexity of modern nanometer-scale processes and the
billion-device circuits they allow, there came a necessary separation between
the various disciplines.
Artificial Intelligence (AI) has undoubtedly been one of the most important buzzwords of recent years. The goal in AI is to design algorithms that transform computers into "intelligent" agents. By intelligence we do not necessarily mean an extraordinary, superhuman level of smartness; it often involves very basic problems that humans solve frequently in their day-to-day lives. This can
be as simple as recognizing faces in an image, driving a car, playing a board game, or
reading (and understanding) an article in a newspaper. The intelligent behaviour exhibited by humans when "reading" is one of the main goals of a subfield of AI called Natural Language Processing (NLP). Natural language is one of the most complex
tools used by humans for a wide range of reasons, for instance to communicate with
others, to express thoughts, feelings, and ideas, to ask questions, or to give instructions. Therefore, it is crucial for computers to possess the ability to use the same tool
in order to effectively interact with humans.
Computer science as an academic discipline began in the 1960s. Emphasis was on
programming languages, compilers, operating systems, and the mathematical theory that
supported these areas. Courses in theoretical computer science covered finite automata,
regular expressions, context-free languages, and computability. In the 1970s, the study
of algorithms was added as an important component of theory. The emphasis was on
making computers useful. Today, a fundamental change is taking place and the focus is
more on a wealth of applications. There are many reasons for this change. The merging
of computing and communications has played an important role. The enhanced ability
to observe, collect, and store data in the natural sciences, in commerce, and in other
fields calls for a change in our understanding of data and how to handle it in the modern
setting. The emergence of the web and social networks as central aspects of daily life
presents both opportunities and challenges for theory.