Entropy Demystified: The Second Law Reduced to Plain Common Sense Paperback – Jun 18 2008
This book makes very good reading for all students of thermodynamics, as well as for more-advanced people who do (or do not) feel comfortable with the fascinating concept of entropy. -- CERN Courier
Top Customer Reviews
In a nutshell, this is very much a book for laymen. If you want an intuitive grasp of what entropy's about in the context of everyday physics without getting bogged down in math, then this may be a great book for you. The book uses as little math as possible in its explanations, and effectively assumes you're unfamiliar with or have forgotten high-school-level math operations such as factorials and logarithms. It manages to pound its point home reasonably well using lots and lots of fairly simple thought experiments that only differ from each other by little incremental steps.
On the other hand, if you already know anything at all about the information-theoretic formulation of entropy, already have an appreciation for the Law of Large Numbers, and have heard the words "macrostates" and "microstates" before, then there's nothing in this book you aren't likely to understand already. If you've taken a course on statistical mechanics and finished it without being horrendously confused, but maybe were hoping for a useful refresher on how different formulations of entropy are related, you should pass on this book. If you were hoping for illumination about the aspects of entropy that are actually at all "interesting" to modern physicists, such as black hole entropy (or the bizarre theories it's spawned such as the holographic principle), this is definitely not the book you're looking for.
Also, the book has no index. This is less annoying than it would be in a book that had more meat to it, but still, any 200+ page nonfiction book with no index should be taken out and shot as a matter of principle.
Ben-Naim has taught thermodynamics and statistical mechanics for many years and is well aware that students learn the second law but do not understand it, simply because it cannot be explained within the framework of classical thermodynamics, in which it was first formulated by Lord Kelvin (i.e. William Thomson, 1824-1907) and Rudolf Julius Emanuel Clausius (1822-1888). Hence, this law and the connected concept of entropy are usually surrounded by a mysterious halo: there is something (the entropy), defined as the ratio between heat and temperature, that is always increasing. Students not only do not understand _why_ it is always increasing (it is left as a principle in classical thermodynamics), but also ask themselves what the _source_ of this ever-increasing quantity is.
We feel comfortable with the first law, that is the principle of energy conservation, because our experience always
suggests that if we use some resource (the energy) to perform any work,
then we are left with less available energy for further tasks. The
first law simply tells us that the heat is
another form of energy so that nothing is actually lost, something which
we can accept without pain. In addition, the second law says that,
though the total energy is constant, we cannot always recycle 100% of
it because there is a limit on the efficiency of conversion of heat into
work (the highest efficiency being given by the Carnot cycle, named
after Nicolas Léonard Sadi Carnot, 1796-1832). Again, we can accept it
quite easily, because it sounds natural, i.e. in accordance with our
common sense: we know of no perpetual-motion machine. But our daily
experience is not sufficient to make us understand what entropy is, and
why it must always increase.
The author shows that, if we identify the entropy with the concept of
"missing information" of the system at equilibrium, following the work
done by Claude Elwood Shannon (1916-2001) in 1948, we obtain a well-defined and (at least in principle) measurable quantity. This quantity,
apart from a multiplicative constant, has the same behavior as the
entropy: for every spontaneous process of an isolated system, it must
increase until the equilibrium state is reached. Missing information, rather than disorder (which is not a well-defined quantity), is the key concept for understanding the second law.
I should say here that the identity of entropy and missing
information is not a widespread idea among physicists, so that many
people may not appreciate this point. However, the arguments of this book are quite convincing, and differing opinions are also taken into account and commented upon.
In addition, Ben-Naim thinks that entropy should be taught as a dimensionless quantity, being defined as the ratio between heat, which is energy, and temperature, which is a measure of the average kinetic energy of the atoms and molecules. The only difference from missing information, again dimensionless, is the scale: because missing information can be defined as the number of binary questions (with answer "yes" or "no" only) necessary to identify the microscopic state of the system, this number turns out to be incredibly large for ordinary physical systems, which involve a number of constituents of the order of Avogadro's number. This numerical difference makes me think of the difference between mass and energy, connected by Einstein's most famous equation E = mc^2: they could be measured using the same units (as is actually done in high-energy physics), the sole difference being that even a very small mass amounts to a huge quantity of energy.
The mystery of the ever-increasing entropy can be explained once (and only if) we realize that matter is not continuous, but discrete. The author basically follows the work of Josiah Willard Gibbs (1839-1903), who developed the statistical mechanical theory of matter based on a purely probabilistic approach. First, one has to accept the fact that macroscopic measurements are not sensitive enough to distinguish microscopic configurations when they differ by thousands or even millions of atoms, simply because the total number of particles is usually very large (of the order of 10^23 at least). Then, under the commonly accepted hypothesis that each microscopic state is equally probable, i.e. that the system will spend almost the same time in each micro-state, one can group indistinguishable micro-states into macro-states. The latter are the only things we can monitor with macroscopic measurements. Since all microscopic configurations are equally probable, macro-states composed of larger numbers of micro-states will be more probable, i.e. the system will spend more time in such macro-states.
As a naive example, one could start with a system prepared in such a way that all its constituents are in the same microscopic configuration. One could think of a sample of N dice, all of them showing the same face, say the first one. The questions could be: (1) "Are all dice showing the same face?"--Yes--; (2) "Is the face value greater than or equal to 3?"--No--; (3) "Is the face value greater than or equal to 2?"--No--; at this point we know that the value is 1. In general, the number of binary questions is proportional to the logarithm in base 2 of the number C of possible configurations, that is O(log_2 C). Now imagine randomly mixing the dice by throwing all of them. The answer to the first question would be "No", so that a completely different series of questions has to be asked to find the microscopic configuration. First, one may proceed by finding how many dice show the value 1, for example, asking O(log_2 N) questions. Suppose that the answer is M < N: then one should find exactly which dice are showing this face, by asking O(N) questions. The next step is to find how many dice show the value 2, among the N-M remaining ones, and so on. When N is very large, the number of questions increases rapidly. So far, we have been speaking about "microscopic" configurations, describing the exact state of all dice. Now, we can imagine being interested only in the "macroscopic" configuration defined by the sum of all values. It is very easy to see that the "microscopic" configurations corresponding to sums around 3.5N (the expected value for a uniform distribution of faces) will be many more than those with sums near N or 6N (which require all dice showing 1 or 6, respectively). If we repeatedly shake the box or throw all the dice, most of the time we will obtain a sum near 3.5N, and larger deviations will be rarer. Hence, such a system will soon approach the "equilibrium" state in which the sum is very near 3.5N.
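The dice experiment is easy to simulate. A minimal sketch (my own code; N, the number of trials, and the tolerance are arbitrary choices): repeatedly throw N dice and record the sum of the faces. The sums pile up near 3.5N (the expected value, since each face averages 3.5), while the extreme macro-states near N or 6N never show up in practice.

```python
import random

random.seed(0)
N = 100            # number of dice
trials = 10_000    # number of throws of the whole set

sums = [sum(random.randint(1, 6) for _ in range(N)) for _ in range(trials)]

mean = sum(sums) / trials
print(f"mean sum: {mean:.1f}  (3.5*N = {3.5 * N})")

# How often does the sum stray far from 3.5*N?
near = sum(1 for s in sums if abs(s - 3.5 * N) <= 15)
print(f"within +/-15 of 3.5*N: {100 * near / trials:.1f}% of throws")

# The extreme macro-states (all 1s, sum=N; all 6s, sum=6N) never occur:
print(f"min observed: {min(sums)}, max observed: {max(sums)}")
```

Even with only 100 dice, every observed sum stays within a narrow band around 350; the all-ones macro-state (sum 100) contains exactly one micro-state out of 6^100 and is never seen.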
As a matter of fact, when the number of possible microscopic configurations increases, the probability distribution of macro-states becomes narrower and narrower, so that for ordinary systems the probability of a fluctuation large enough to be measured is incredibly small. Actually, as Ben-Naim clearly emphasizes, the probabilistic formulation of the second law of thermodynamics allows us to quantify its validity, in terms of the time one should wait to observe a fluctuation large enough to be measured. It turns out that, for ordinary systems, the probability of any measurable fluctuation away from the equilibrium state is so low that the age of the universe is practically negligible compared to the time we should wait to observe such a fluctuation. From this point of view, the second law is far more "absolute" than the other laws of physics, for which at best we can state that they have held since the beginning of the universe's life.
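That narrowing can be demonstrated directly with the same dice model (again my own sketch, not from the book): the relative size of the typical fluctuation in the sum falls off like 1/sqrt(N), so extrapolated to N of order 10^23 any measurable deviation from equilibrium becomes hopelessly improbable.

```python
import random
import statistics

random.seed(1)

def relative_fluctuation(n_dice, trials=2000):
    """Standard deviation of the sum of n_dice dice, divided by
    the mean sum: the relative size of a typical fluctuation."""
    sums = [sum(random.randint(1, 6) for _ in range(n_dice))
            for _ in range(trials)]
    return statistics.pstdev(sums) / statistics.fmean(sums)

for n in (10, 100, 1000):
    print(n, f"{relative_fluctuation(n):.4f}")
# Each 10x increase in N shrinks the relative fluctuation by
# about sqrt(10); extrapolated to N ~ 10^23 it is ~10^-12,
# far below anything a macroscopic measurement can resolve.
```

The theoretical value is sqrt(35/12)/(3.5*sqrt(N)), about 0.49/sqrt(N), which the simulation reproduces to within sampling error.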
The book is very good reading for all students who are approaching thermodynamics, and also for more advanced people who do (or do not) feel comfortable with the fascinating concept of entropy. Ben-Naim is also the author of a more technical book ("Statistical Thermodynamics Based on Information: A Farewell to Entropy", World Scientific), in which these guidelines are the basis for a more detailed treatment of statistical mechanics. Because we usually learn things much better when following a cyclical approach, I encourage readers to start with "Entropy Demystified" and then seriously consider going deeper into the details of statistical mechanics with the more technical book by Ben-Naim, of which I was delighted to read the draft.
You can't "avoid" entropy. Entropy is something very real: in broadband transmission, for example, the cost (e.g. chip size, power dissipation, heat generation) of managing entropy is almost proportional to the amount of entropy to be managed. Climate change, too, can be explained by the entropy accounting (generation, import, export) of the biosphere and the clogging of the biosphere's interfaces, which are required to get rid of the entropy generated within it.
Therefore we need comprehensible explanations of entropy. My personal interest is not so much in entropy itself, but in how teachers and authors manage to explain it. Arieh Ben-Naim manages to get rid of all the fuss that comes with so many publications related to entropy. He really manages to demystify it. I think there are two paths one could take to explain entropy: one within information theory, the other through statistical physics. Ben-Naim chose the second, and thus not only demystified entropy but also demystified statistical physics: from my point of view, you just need a high school education to be able to comprehend his book. You may even be lucky enough to have a teacher who uses this book in the final year of high school.
Economists and social scientists could also get some help from the book in understanding what entropy really means. Indicators like the inequality measures of Theil and Kolm are entropy measures. And Nicholas Georgescu-Roegen will be easier to understand. (The book would have been helpful to him too.)
Besides its content, I also like the making of this little book by Arieh Ben-Naim. It has very nice illustrations, and they are not just nice, they are also helpful. Here scientific thinking comes together with a simple love of making things beautiful. It seems that good science also leads to good aesthetics.
Related to this book, I also recommend the publications of M.V. Volkenstein (such as "Physics and Biology"), although they are mostly out of print.
Anyone who learns entropy in terms of thermodynamics (that is, heat cycles) is done a horrible disservice. The microscopic view of entropy makes intuitive sense, and most people do get it. I, however, was stuck for the longest time: I could not see, from purely macroscopic reasoning, why the thermodynamic definitions of entropy should be intuitively obvious, or why they had to exist at all. Ben-Naim clarifies, as most professors of this subject do not (or probably do not know), that it is not possible to justify such equations on purely macroscopic reasoning.
Additionally, Ben-Naim describes entropy itself in terms of information theory. This is invaluable; it is far more rigorous than the naive "disorder" analogy. Anyone who has done more than just basic qualitative problems recognizes that the notion of "ordered" vs. "disordered" is inherently fuzzy in examples such as solvation. Using information theory to then discuss tempergy, or temperature in units of energy, is intuitively valuable.
Ben-Naim also discusses entropy as a generalized property from several different common views, which are equivalent. Showing that a certain quantity, seemingly different because of the dependent variable, is actually logically the same is an argument familiar to physicists; it helps put the macroscopic notion of entropy on firm footing, rather than on the "where did this come from?" basis of saying it is the contour integral of the change in heat per temperature.
If you are a scientist, you will fly through this book, and reap quick rewards. Chemists/Physicists will be already quite used to much of the material in the book, but the analysis of certain chapters (for me, 2, 6, and his epilogue 8) are invaluable, clearly spoken insights.
He even invites you to send him an email if you are confused!
This is what science should be: writing done without pretension (in contrast to Atkins).
Look for similar items by category
- Books > Professional & Technical > Engineering > Chemical > Thermodynamics
- Books > Professional & Technical > Engineering > Materials Science > Thermodynamics
- Books > Professional & Technical > Professional Science > Chemistry > Physical & Theoretical
- Books > Professional & Technical > Professional Science > Physics
- Books > Science & Math > Chemistry > Physical & Theoretical > Physical Chemistry
- Books > Science & Math > History & Philosophy > History of Science
- Books > Science & Math > Physics > Dynamics > Thermodynamics
- Books > Science & Math > Physics > Entropy
- Books > Textbooks > Sciences > Chemistry
- Books > Textbooks > Sciences > Mechanics
- Books > Textbooks > Sciences > Physics