Eliezer Yudkowsky previously wrote (6 years ago!) about the second law of thermodynamics. Many commenters were skeptical of the statement, "if you know the positions and momenta of every particle in a glass of water, it is at absolute zero temperature," because they don't know what temperature is. This is a common confusion.

**Entropy**

To specify the precise state of a classical system, you need to know its location in phase space. For a bunch of helium atoms whizzing around in a box, phase space is the position and momentum of each helium atom. For *N* atoms in the box, that means 6*N* numbers to completely specify the system.

Let's say you know the total energy of the gas, but nothing else. It will be the case that a fantastically huge number of points in phase space will be consistent with that energy.* In the absence of any more information, it is correct to assign a uniform distribution to this region of phase space. The entropy of a uniform distribution is the logarithm of the number of points, so that's that. If you also know the volume, then the number of points in phase space consistent with both the energy and volume is necessarily smaller, so the entropy is smaller.
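As a toy illustration (a hypothetical discrete system standing in for continuous phase space; all numbers are mine, chosen small enough to enumerate), here is a sketch: entropy is the log of the number of consistent states, and learning an extra macrovariable can only shrink that number.

```python
from itertools import product
from math import log

# Toy discrete "phase space": N sites, each with energy 0..10.
N, E_TOTAL = 6, 30

# Microstates consistent with knowing only the total energy.
energy_only = [s for s in product(range(11), repeat=N) if sum(s) == E_TOTAL]

# Knowing one more macrovariable (here: no site above energy 7,
# standing in for an extra constraint such as a known volume)
# can only shrink the consistent set.
extra_info = [s for s in energy_only if max(s) <= 7]

S1 = log(len(energy_only))  # entropy = log of the number of states
S2 = log(len(extra_info))
print(len(energy_only), len(extra_info))
print(S1 > S2)  # more knowledge => fewer states => lower entropy
```

The same logic carries over to the continuous case, with counts replaced by phase-space volumes.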

This might be confusing to chemists, since they have memorized a formula for the entropy of an ideal gas, and it's ostensibly objective. Someone with perfect knowledge of the system will calculate the same number on the right-hand side of that equation, but to them, that number isn't the entropy. It's the entropy of the gas for someone who knows nothing more than the energy, volume, and number of particles.

**Temperature**

The existence of temperature follows from the zeroth and second laws of thermodynamics: thermal equilibrium is transitive, and entropy is maximum in equilibrium. Temperature is then defined as the thermodynamic quantity that is shared by systems in equilibrium.

If two systems are in equilibrium, then moving energy from one to the other cannot increase the total entropy. That means that if we move a tiny bit of energy from one to the other (*δU*_{1} = -*δU*_{2}), the entropy change in the first must be the opposite of the entropy change in the second (*δS*_{1} = -*δS*_{2}), so that the total entropy (*S*_{1} + *S*_{2}) doesn't change. For systems in equilibrium, this leads to (*∂S*_{1}/*∂U*_{1}) = (*∂S*_{2}/*∂U*_{2}). Define 1/*T* = (*∂S*/*∂U*), and we are done.

Temperature is sometimes taught as "a measure of the average kinetic energy of the particles," because for an ideal gas *U*/*N* = (3/2)*k*_{B}*T*. This is wrong as a definition, for the same reason that the ideal-gas entropy formula isn't the definition of entropy.
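To see that relation recovered rather than assumed, here is a numerical sketch using the Sackur–Tetrode entropy of a monatomic ideal gas (the standard closed form; the parameter values are illustrative, not from the post): differentiating *S* with respect to *U* reproduces *U*/*N* = (3/2)*k*_{B}*T*.

```python
from math import log, pi

# SI values of Boltzmann's and Planck's constants.
kB, h = 1.380649e-23, 6.62607015e-34

def S(U, V, N, m):
    """Sackur-Tetrode entropy: energy U, volume V, particle number N, mass m."""
    return N * kB * (log(V / N * (4 * pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# Helium-like parameters (illustrative numbers).
N, V, m = 1e22, 1e-3, 6.6e-27
U = 50.0  # joules

# 1/T = dS/dU, estimated by a central finite difference.
dU = U * 1e-6
T = 2 * dU / (S(U + dU, V, N, m) - S(U - dU, V, N, m))

# For the ideal gas this reproduces U/N = (3/2) kB T.
print(T, 2 * U / (3 * N * kB))
```

The point is the order of definition: *S*(*U*) comes first, and the kinetic-energy relation falls out of 1/*T* = ∂*S*/∂*U* for this particular system.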

Probability is in the mind. Entropy is a function of probabilities, so entropy is in the mind. Temperature is a derivative of entropy, so temperature is in the mind.

**Second Law Trickery**

With perfect knowledge of a system, it is possible to extract all of its energy as work. EY states it clearly:

So (again ignoring quantum effects for the moment), if you know the states of all the molecules in a glass of hot water, it is cold in a genuinely thermodynamic sense: you can take electricity out of it and leave behind an ice cube.

Someone who doesn't know the state of the water will observe a violation of the second law. This is allowed. Let that sink in for a minute. Jaynes calls it second law trickery, and I can't explain it better than he does, so I won't try:

A physical system always has more macroscopic degrees of freedom beyond what we control or observe, and by manipulating them a trickster can always make us see an apparent violation of the second law.

Therefore the correct statement of the second law is not that an entropy decrease is impossible in principle, or even improbable; rather that it cannot be achieved reproducibly by manipulating the macrovariables {*X*_{1}, ..., *X*_{n}} that we have chosen to define our macrostate. Any attempt to write a stronger law than this will put one at the mercy of a trickster, who can produce a violation of it.

But recognizing this should increase rather than decrease our confidence in the future of the second law, because it means that if an experimenter ever sees an apparent violation, then instead of issuing a sensational announcement, it will be more prudent to search for that unobserved degree of freedom. That is, the connection of entropy with information works both ways; seeing an apparent decrease of entropy signifies ignorance of what were the relevant macrovariables.

**Homework**

I've actually given you enough information on statistical mechanics to calculate an interesting system. Say you have *N* particles, each fixed in place to a lattice. Each particle can be in one of two states, with energies 0 and ε. Calculate and plot the entropy if you know the total energy: *S*(*E*), and then the energy as a function of temperature: *E*(*T*). This is essentially a combinatorics problem, and you may assume that *N* is large, so use Stirling's approximation. What you will discover should make sense using the correct definitions of entropy and temperature.

*: How many combinations of 10^{23} numbers between 0 and 10 add up to 5×10^{23}?
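A sketch of how such counts behave, using a much smaller system than the footnote's (the dynamic-programming routine and the parameters are mine, for illustration): even 100 numbers between 0 and 10 constrained to their mean total already give an astronomical count, and the entropy, the log of that count, grows only linearly with the number of particles.

```python
from math import log

def count_states(n_particles, levels, total):
    """Number of ways n_particles values in range(levels) sum to total
    (dynamic programming, adding one particle at a time)."""
    ways = [0] * (total + 1)
    ways[0] = 1
    for _ in range(n_particles):
        new = [0] * (total + 1)
        for e in range(total + 1):
            if ways[e]:
                for lev in range(levels):
                    if e + lev <= total:
                        new[e + lev] += ways[e]
        ways = new
    return ways[total]

# 10^23 particles is far beyond enumeration, but 100 particles
# with levels 0..10 summing to 500 already gives a huge count.
n = count_states(100, 11, 500)
print(n)        # an enormous integer
print(log(n))   # entropy (in natural units) of the uniform distribution
```

Scaling the same calculation up to 10^{23} particles is hopeless directly, which is why Stirling's approximation does the work in the homework problem.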

This is a good article making a valuable point. But this —

— is a confusing way to speak. There is such a thing as "the average kinetic energy of the particles", and one measure of this thing is called "temperature" in some contexts. There is nothing wrong with this as long as you are clear ab…

I am not sure this is true as stated. An omniscient Maxwell demon that would only allow hot molecules out runs into a number of problems, and an experimentally constructed Maxwell's demon works by converting coherent light (low entropy) into incoherent (high entropy).

Maxwell's demon, as criticized in your first link, isn't omniscient. It has to observe incoming particles, and the claim is that this process generates the entropy.

[Spoiler alert: I can't find any 'spoiler' mode for comments, so I'm just going to give the answers here, after a break, so collapse the comment if you don't want to see that]

.

.

.

.

.

.

.

.

.

.

For the entropy (in natural units), I get

S(E) = N ln N − (E/ε) ln(E/ε) − (N − E/ε) ln(N − E/ε)

and for the energy, I get

E(T) = εN / (e^(ε/T) − 1)

Is this right? (upon reflection and upon consulting graphs, it seems right …
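One way to sanity-check answers like these, in natural units (k_B = 1; the script and parameters are mine), is to compute the exact entropy ln C(N, E/ε) with log-gamma functions and get the temperature by finite differences; the sketch below compares against the closed form n/N = 1/(e^(ε/T) + 1), which follows from 1/T = (1/ε) ln((N − n)/n).

```python
from math import lgamma, log, exp

# Two-level lattice: N particles, energies 0 and eps; natural units (kB = 1).
N, eps = 10**6, 1.0

def S(n_excited):
    """Exact entropy ln C(N, n) via log-gamma (no Stirling approximation)."""
    return lgamma(N + 1) - lgamma(n_excited + 1) - lgamma(N - n_excited + 1)

def T_of_E(E):
    """Temperature from 1/T = dS/dE, central finite difference."""
    n = E / eps
    dn = 1.0
    return 2 * eps * dn / (S(n + dn) - S(n - dn))

E = 0.3 * N * eps  # 30% of the particles excited
T = T_of_E(E)

# Compare against the closed forms for T(E) and E(T).
print(T, eps / log((N - E / eps) / (E / eps)))
print(E, N * eps / (exp(eps / T) + 1))
```

Plotting T against E this way also makes the behavior near E = Nε/2 easy to see, which is the interesting part of the homework.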

This is related to the physics of computation: the ultimate physical computers operate at temperatures approaching 0 K (reversible computing, Landauer's principle). Heat/entropy is computational stupidity.

Incidentally, this also explains the Fermi paradox: post-singularity civilizations migrate away from their hot stars into the cold interstellar spaces, becoming dark matter (which, however, does not imply that all cold dark matter is intelligent).

I think I've figured out what's bothering me about this. If we think of temperature in terms of our uncertainty about where the system is in phase space, rather than how large a region of phase space fits the macroscopic state, then we gain a little in using the second law, but give up a lot everywhere else. Unless I am mistaken, we lose the following:

I am not sure what this means. In what sense is probability in the mind, but energy isn't? Or if energy is in the mind, as well, what physical characteristic is not and why?

Thanks for this. I am definitely going to use the Gibbs paradox (page 3 of the Jaynes paper) to nerd-snipe my physics-literate friends.

I'll follow suit with the previous spoiler warning.

SPOILER ALERT .

.

.

.

.

.

.

.

.

.

.

.

.

.

.

.

I took a bit different approach from the others that have solved this, or maybe you'd just say I quit early once I thought I'd shown the thing I thought you were trying to show:

If we write entropy in terms of the number of particles N and the fraction of them that are excited, α ≡ E/(Nε), and take the derivative with respect to α, we get:

dS/dα = N log [(1-α)/α]

Or if that N is bothering you (since temperature is usually an intensive property), we can just write:

T = 1/(dS/dE) = …

I am not quite sure in which way this statement is useful.

"...and for an encore goes on to prove that black is white and gets himself killed on the next zebra crossing." -- Douglas Adams

If I plunge my hand into boiling water, I will get scalded. Will I still get scalded if I know the position and momentum of every particle involved? If so, what causes it? If not, where does this stop -- is everything in the mind?

ETA: I should have reread the discussion first, because there has been a substantial amount about this very question. However, I'm not sure it has come to a conclus…

There is a peculiar consequence of this, pointed out by Cosma Shalizi. Suppose we have a deterministic physical system S, and we observe this system carefully over time. We are steadily gaining information about its microstates, and therefore by this definition, its entropy should be decreasing.

You might say, "the system isn't closed, because it is being observed." But consider the system "S plus the observer." Saying that entropy is nondecreasing over time seems to require that the observer is in doubt about its own microstates. What does that mean?