Entropy Demystified: The Second Law Reduced to Plain Common Sense Paperback – Jun 18 2008

5.0 out of 5 stars 1 customer review


Product Details

  • Paperback: 250 pages
  • Publisher: World Scientific Publishing; Expanded edition (June 18 2008)
  • Language: English
  • ISBN-10: 9812832254
  • ISBN-13: 978-9812832252
  • Product Dimensions: 15.2 x 1.6 x 22.9 cm
  • Shipping Weight: 458 g
  • Average Customer Review: 5.0 out of 5 stars 1 customer review
  • Amazon Bestsellers Rank: #613,066 in Books (See Top 100 in Books)


Product Description


This book makes very good reading for all students of thermodynamics, as well as for more-advanced people who do (or do not) feel comfortable with the fascinating concept of entropy. -- CERN Courier (This text refers to an alternate paperback edition.)

Customer Reviews


Top Customer Reviews

By George Poirier (Top 50 Reviewer, Vine Voice) on Feb. 11 2010
Format: Paperback Verified Purchase
In this easy-to-read book, the author explains the nature of entropy using very basic probabilistic arguments. The author assumes that the reader knows no mathematics and has no knowledge of physics, but can use ordinary common sense in reasoning things out. Most of the arguments make use of dice and coins until near the end, where real systems, e.g., gases, are discussed. The writing style is very clear, authoritative, highly accessible and friendly. Some concepts and conclusions are deliberately repeated; this can be quite useful to readers who are new to the subject. The book's level is very basic, and it could be easily understood by any interested general reader or high school student.

Most Helpful Customer Reviews on Amazon.com (beta)

Amazon.com: 40 reviews
94 of 108 people found the following review helpful
By Scott Davies - Published on Amazon.com
Format: Paperback Verified Purchase
After seeing nothing but five-star reviews for this book, I figured I'd pick it up despite having little feel for what its target audience was since none of it was actually viewable on Amazon.

In a nutshell, this is very much a book for laymen. If you want an intuitive grasp of what entropy's about in the context of everyday physics without getting bogged down in math, then this may be a great book for you. The book uses as little math as possible in its explanations, and effectively assumes you're unfamiliar with or have forgotten high-school-level math operations such as factorials and logarithms. It manages to pound its point home reasonably well using lots and lots of fairly simple thought experiments that only differ from each other by little incremental steps.

On the other hand, if you already know anything at all about the information-theoretic formulation of entropy, already have an appreciation for the Law of Large Numbers, and have heard the words "macrostates" and "microstates" before, then there's nothing in this book you aren't likely to understand already. If you've taken a course on statistical mechanics and finished it without being horrendously confused, but maybe were hoping for a useful refresher on how different formulations of entropy are related, you should pass on this book. If you were hoping for illumination about the aspects of entropy that are actually at all "interesting" to modern physicists, such as black hole entropy (or the bizarre theories it's spawned such as the holographic principle), this is definitely not the book you're looking for.

Also, the book has no index. This is less annoying than it would be in a book that had more meat to it, but still, any 200+ page nonfiction book with no index should be taken out and shot as a matter of principle.
46 of 52 people found the following review helpful
Another way to enjoy fundamental physics! Oct. 14 2007
By Diego Casadei - Published on Amazon.com
Format: Paperback
Arieh Ben-Naim, professor at the Hebrew University of Jerusalem, taught thermodynamics and statistical mechanics for many years and is well aware that students learn the second law but do not understand it, simply because it cannot be explained within the framework of classical thermodynamics, in which it was first formulated by Lord Kelvin (i.e. William Thomson, 1824-1907) and Rudolf Julius Emanuel Clausius (1822-1888). Hence, this law and the connected concept of entropy are usually surrounded by a mysterious halo: there is something (the entropy), defined as the ratio between heat and temperature, that is always increasing. The students not only do not understand _why_ it is always increasing (it is left as a principle in classical thermodynamics), but also ask themselves what is the _source_ of such an ever-increasing quantity.

We feel comfortable with the first law, that is, the principle of energy conservation, because our experience always suggests that if we use some resource (the energy) to perform any work, then we are left with less available energy for further tasks. The first law simply tells us that heat is another form of energy, so that nothing is actually lost, something we can accept without pain. In addition, the second law says that, though the total energy is constant, we cannot always recycle 100% of it, because there is a limit on the efficiency of conversion of heat into work (the highest efficiency being given by the Carnot cycle, named after Nicolas Léonard Sadi Carnot, 1796-1832). Again, we can accept this quite easily, because it sounds natural, i.e. in accordance with our common sense: no perpetual-motion machine is known. But our daily experience is not sufficient to make us understand what entropy is, and why it must always increase.

The author shows that, if we identify entropy with the "missing information" about the system at equilibrium, following the work done by Claude Elwood Shannon (1916-2001) in 1948, we obtain a well-defined and (at least in principle) measurable quantity. This quantity, apart from a multiplicative constant, has the same behavior as the entropy: for every spontaneous process in an isolated system, it must increase until the equilibrium state is reached. The missing information, rather than disorder (which is not a well-defined quantity), is the key concept for understanding the second law.
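The identification of entropy with missing information can be made concrete in a few lines. As a rough sketch (my illustration, not taken from the book; the function name is mine), Shannon's formula gives the average number of yes/no questions needed to identify the actual state:

```python
import math

def missing_information(probs):
    """Shannon's missing information H = -sum(p * log2(p)), in bits.

    H is the average number of yes/no questions needed to identify
    the actual state, given the probability of each state."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A single fair die: 6 equally likely faces -> log2(6) ~ 2.585 bits.
h_die = missing_information([1/6] * 6)

# A loaded die that almost always shows 6 carries less missing information.
h_loaded = missing_information([0.01] * 5 + [0.95])

print(f"fair die:   {h_die:.3f} bits")
print(f"loaded die: {h_loaded:.3f} bits")
```

The loaded die illustrates the general point: the more predictable the state, the less information is missing, and the lower the (dimensionless) entropy.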

I should say here that the identification of entropy with missing information is not a widespread idea among physicists, so many people may not appreciate this point. However, the arguments of this book are quite convincing, and different opinions are also taken into account and commented on.

In addition, Ben-Naim thinks that entropy should be taught as a dimensionless quantity, being defined as the ratio between heat, that is, energy, and temperature, that is, a measure of the average kinetic energy of the atoms and molecules. The only difference from the missing information, again dimensionless, is the scale: because the missing information can be defined as the number of binary questions (with answer "yes" or "no" only) necessary to identify the microscopic state of the system, this number comes out to be incredibly large for ordinary physical systems, which involve a number of constituents of the order of Avogadro's number. This numerical difference makes me think of the difference between mass and energy, connected by Einstein's most famous equation E = m c^2: they could be measured in the same units (as is actually done in high-energy physics), the sole difference being that even a very small mass amounts to a huge quantity of energy.
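To get a feel for the scale mentioned above, here is a back-of-the-envelope calculation (my own sketch, not from the book; the function name is mine): pinning down the exact microstate of N six-sided dice takes about N * log2(6) yes/no questions, since there are 6**N configurations.

```python
import math

def questions_needed(n_dice):
    """Yes/no questions needed to identify one of 6**n_dice microstates:
    log2(6**n_dice) = n_dice * log2(6)."""
    return n_dice * math.log2(6)

# From a handful of dice up to an Avogadro-sized system.
for n in (10, 1000, 6.022e23):
    print(f"N = {n:.3g}: about {questions_needed(n):.3g} yes/no questions")
```

For Avogadro-sized N the count is of order 10^24 questions, which is the "incredibly large" number the reviewer refers to.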

The mystery of the ever-increasing entropy can be explained once (and only if) we realize that matter is not continuous, but discrete. The author basically follows the work of Josiah Willard Gibbs (1839-1903), who developed the statistical mechanical theory of matter based on a purely probabilistic approach. First, one has to accept the fact that macroscopic measurements are not sensitive enough to distinguish microscopic configurations when they differ by thousands or even millions of atoms, just because the total number of particles is usually very large (of the order of 10^23 at least). Then, under the commonly accepted hypothesis that each microscopic state is equally probable, i.e. that the system will spend almost the same time in each micro-state, one can group indistinguishable micro-states into macro-states. The latter are the only thing we can monitor with macroscopic measurements. Macro-states composed of larger numbers of micro-states will then be more probable, i.e. the system will spend more time in such macro-states.
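The grouping of micro-states into macro-states can be counted exactly for a handful of dice. The following sketch (my illustration, not from the book) labels each macro-state by the sum of the faces and counts how many micro-states it contains:

```python
from collections import Counter
from itertools import product

# Enumerate all 6**4 = 1296 microstates of 4 dice and group them into
# macrostates labelled by the sum of the faces.
n_dice = 4
multiplicity = Counter(sum(faces) for faces in product(range(1, 7), repeat=n_dice))

total = 6 ** n_dice
for s in sorted(multiplicity):
    print(f"sum {s:2d}: {multiplicity[s]:4d} of {total} microstates")

# The central sum (14 for 4 dice) has the largest multiplicity: if all
# microstates are equally probable, the system spends most time there.
```

The extreme macro-states (sum 4 and sum 24) each contain exactly one micro-state, while the central one contains over a hundred; this asymmetry grows explosively with the number of dice.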

As a naive example, one could start with a system prepared in such a way that all its constituents are in the same microscopic configuration. One could think of a sample of N dice, all of them showing the same face, say the first one. The questions could be: (1) "Are all dice showing the same face?" -- Yes; (2) "Is the face value greater than or equal to 3?" -- No; (3) "Is the face value greater than or equal to 2?" -- No; at this point we know that the value is 1. In general, the number of binary questions is proportional to the logarithm in base 2 of the number C of possible configurations, that is, O(log_2 C). Now imagine randomly mixing the dice by throwing all of them. The answer to the first question would be "No", so that a completely different series of questions has to be asked to find the microscopic configuration. First, one may proceed by finding how many dice show the value 1, for example, asking O(log_2 N) questions. Suppose that the answer is M < N: then one should find exactly which dice are showing this face, by asking O(N) questions. The next step is to find how many dice show the value 2, among the N-M remaining ones, and so on. When N is very large, the number of questions increases rapidly. So far, we have been speaking about "microscopic" configurations, describing the exact state of all dice. Now, we can imagine being interested only in the "macroscopic" configuration defined by the sum of all values. It is very easy to imagine that the "microscopic" configurations corresponding to sum values around 3.5N (the mean face value is 3.5) will be many more than those with sum near N or 6N (which require all dice to show 1 or 6, respectively). If we repeatedly shake the box or throw all the dice, most of the time we will obtain a sum near 3.5N, and larger deviations will be rarer. Hence, such a system will soon approach the "equilibrium" state in which the sum is very near 3.5N.
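The approach to "equilibrium" described above is easy to simulate. As an illustrative sketch (mine, not the book's; N and the number of throws are arbitrary choices), repeatedly throwing N = 1000 dice gives sums that cluster tightly around 3.5 * N, while the extreme sums N and 6N never appear in practice:

```python
import random

random.seed(0)  # fixed seed for reproducibility

def throw(n_dice):
    """Sum of one throw of n_dice fair six-sided dice."""
    return sum(random.randint(1, 6) for _ in range(n_dice))

n = 1000
sums = [throw(n) for _ in range(200)]

mean_sum = sum(sums) / len(sums)
print(f"mean sum over 200 throws: {mean_sum:.1f} (3.5 * N = {3.5 * n})")
print(f"observed range: {min(sums)} .. {max(sums)} (possible: {n} .. {6 * n})")
```

Every throw lands within a few percent of 3.5N, even though sums anywhere from N to 6N are in principle possible: the overwhelmingly most numerous macro-states are the ones actually observed.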

As a matter of fact, when the number of possible microscopic configurations increases, the probability distribution of macro-states becomes narrower and narrower, so that for ordinary systems the probability of a fluctuation large enough to be measured is incredibly small. Actually, as Ben-Naim clearly emphasizes, the probabilistic formulation of the second law of thermodynamics allows us to quantify its validity, in terms of the time one should wait to be able to find a fluctuation large enough to be measured. It turns out that, for ordinary systems, the probability of any measurable fluctuation away from the equilibrium state is so low that the age of the universe is practically negligible compared to the time we should wait to observe such a fluctuation. From this point of view, the second law is far more "absolute" than the other laws of physics, for which at best we can state that they have been valid since the beginning of the universe.
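The narrowing of the distribution can be quantified with elementary statistics. In this rough sketch (my own estimate, not from the book), the variance of a single fair die is 35/12, so the standard deviation of the sum of N dice is sqrt(N * 35/12) and the relative fluctuation falls off as 1/sqrt(N):

```python
import math

def relative_fluctuation(n_dice):
    """Relative width of the sum distribution for n_dice fair dice:
    std(sum) / mean(sum) = sqrt(n * 35/12) / (3.5 * n) ~ 1/sqrt(n)."""
    return math.sqrt(n_dice * 35 / 12) / (3.5 * n_dice)

for n in (100, 10**6, 6.022e23):
    print(f"N = {n:.3g}: relative fluctuation {relative_fluctuation(n):.3g}")
```

For an Avogadro-sized system the relative fluctuation is of order 10^-13, far below anything measurable, which is why deviations from equilibrium are never seen in practice.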

The book is very good reading for all students approaching thermodynamics, and also for more advanced people who do or do not feel comfortable with the fascinating concept of entropy. Ben-Naim is also the author of a more technical book ("Statistical Thermodynamics Based on Information: A Farewell to Entropy", World Scientific), in which these guidelines are the basis for a more detailed treatment of statistical mechanics. Because we usually learn things much better when following a cyclical approach, I encourage readers to start with "Entropy Demystified" and then seriously consider going deeper into the details of statistical mechanics with the more technical book by Ben-Naim, of which I was delighted to read the draft.
15 of 17 people found the following review helpful
Entropy - no big deal Nov. 7 2007
By Nico van der Vegt - Published on Amazon.com
Format: Paperback
"... Arieh Ben-Naim invites the reader to experience the joy of appreciating something which has eluded understanding for many years -- entropy and the second law of thermodynamics". This statement on the back cover will surely reflect the experience of many who read this book. I highly recommend it to anyone who wants to understand or teach the mysterious concept of "entropy". Just sit back, open this delightful book, and experience how your foggy ideas are cleared up within just a couple of enjoyable hours. You need no prior knowledge; if you have learned how to read and how to count from one to ten, you possess all the qualifications needed to read and appreciate its contents. The author not only succeeds in brilliantly explaining the meaning of entropy, its statistical interpretation, and why common sense leads us to conclude that entropy (most likely) is ever-increasing; he moreover provides compelling arguments to do away with the second law altogether: ".. because science will find it unnecessary to formulate a law of physics based on purely logical deduction". This concluding sentence by Ben-Naim will be further substantiated in a forthcoming book by the same author. In addition to the present book, which I highly recommend to everybody who wants to learn about entropy in general, I also want to recommend another recent book by Ben-Naim on the molecular theory of solutions to students and scientists interested in the entropy of solvation processes. The scientific literature on this topic is huge and, above all, utterly confusing. Ben-Naim's clearly formulated ideas have helped me a lot in understanding the subject better.
17 of 20 people found the following review helpful
Entropy Defuzzyfied Oct. 16 2007
By GK - Published on Amazon.com
Format: Paperback Verified Purchase
Adam Smith's "Invisible Hand" leads many people to think that markets have the power to repair themselves. But even in markets as open systems there are irreversible processes, as the openness of real systems is always limited. Adam Smith, still in a Newtonian world, didn't know anything about the "second 'law' of thermodynamics" and "entropy". But at least today we should know better. Unfortunately, entropy still seems to many to be some mystic thing, dealing with which should be avoided. (Knowing about entropy also increases responsibility; some like to avoid that as well.)

You can't "avoid" entropy. Entropy is something very real: e.g. in broadband transmission, the cost (chip size, power dissipation, heat generation) of managing entropy is almost proportional to the amount of entropy that has to be managed. Climate change can also be explained by the entropy accounting (entropy generation, import, export) of the biosphere and the clogging of the biosphere's interfaces, which are required to get rid of the entropy generated within it.

Therefore we need comprehensible explanations of entropy. My personal interest is not so much in entropy itself as in how teachers and authors manage to explain it. Arieh Ben-Naim manages to get rid of all the fuzz that comes with so many publications related to entropy. He really manages to demystify it. I think there are two paths one could take to explain entropy: one is within information theory, the other uses statistical physics. Ben-Naim chose the second and thus not only managed to demystify entropy, but also demystified statistical physics. From my point of view, you just need a high school education to be able to comprehend his book. You may even be lucky enough to have a teacher who uses this book in the final high school year.

Economists and social scientists could also get some help from the book in understanding what entropy really means. Indicators like the inequality measures of Theil and Kolm are entropy measures. And Nicholas Georgescu-Roegen will be easier to understand. (The book would have been helpful to him too.)

Besides its content, I also like the making of this little book by Arieh Ben-Naim. It has very nice illustrations, and they are not just nice, they are also helpful. Here scientific thinking comes together with a simple love of making things beautiful. It seems that good science also leads to good aesthetics.

Related to this book, I also recommend the publications of M. V. Volkenstein (such as Physics and Biology), although they are mostly out of print.
6 of 7 people found the following review helpful
Entropy Discussed WITHOUT Logical Leaps Jan. 1 2009
By Robert W. Molt Jr. - Published on Amazon.com
Format: Paperback Verified Purchase
I consider this book to be of great value to any scientist (I am one, hence I will not speak for non-scientists).

Anyone who learns entropy in terms of thermodynamics (that is, heat cycles) is done a horrible disservice. The microscopic view of entropy makes intuitive sense, and most people do get it. I, however, was stuck for the longest time because I could not see why the thermodynamic definitions of entropy were intuitively obvious, or why they had to exist, from purely macroscopic reasoning. Ben-Naim clarifies, as most professors of this subject do not (or probably do not know), that it is not possible to justify such equations from purely macroscopic reasoning.

Additionally, Ben-Naim describes entropy itself in terms of information theory. This is invaluable; it is far more rigorous than the naive "disorder" analogy. Anyone who has done more than just basic qualitative problems recognizes that the notion of "ordered" vs. "disordered" is inherently fuzzy in examples such as solvation. Using information theory to then discuss "tempergy", temperature expressed in units of energy, is intuitively valuable.

Ben-Naim also discusses entropy as a generalized property from several different common views, which are equivalent. Showing that a certain quantity, seemingly different because of the dependent variable, is actually logically the same is an argument familiar to physicists; it helps put the macroscopic notion of entropy on firm footing, rather than the "where did this come from?" basis of saying it is the contour integral of the change in heat per temperature.

If you are a scientist, you will fly through this book and reap quick rewards. Chemists and physicists will already be quite used to much of the material, but the analyses in certain chapters (for me, 2, 6, and the epilogue, chapter 8) are invaluable, clearly spoken insights.

He even invites readers to email him if they are confused!

This is what science should be: writing done without pretension (in contrast to Atkins).