Lately I’ve been hanging out on Physics Stack Exchange, a question-and-answer site for physicists and people interested in physics. Someone asked a question recently about the relationship between thermodynamics and a quantity from information theory. It led me to quite an interesting result, which I think is new.

The question was

“I am facing with the concept of cross entropy. I would like to know the thermodynamic and statistical meaning of cross entropy (if exists)?” — Physics Stack Exchange user emanuele

I found the question interesting and did a few calculations. I couldn’t find a nice thermodynamic meaning for the cross entropy, but I did find one for a related quantity called the Kullback-Leibler divergence. As far as I know, this is a new result.

The cross entropy between two probability distributions $latex p$ and $latex q$ is defined as

$latex H(p,q) = -\sum_i p_i \log q_i.$
These two probability distributions should both refer to the same set of underlying states. Normally in thermodynamics we think of a system as only having one probability distribution, which represents (roughly) the range of possible states the system might be in at the present time. But systems can change over time. So let’s imagine we have a system (with constant volume) that’s initially in equilibrium with a heat bath at a temperature $latex T_1$. According to the usual principles of statistical mechanics, its state can be represented by the probability distribution

$latex p_i = \frac{1}{Z_1} e^{-\beta_1 E_i},$
where $latex \beta_1 = 1/T_1$ (I’ve set Boltzmann’s constant equal to 1 for clarity) and $latex Z_1 = \sum_i e^{-\beta_1 E_i}$ is a normalisation factor called the partition function. The $latex E_i$ are the energy levels of the system’s permitted microscopic states.
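To make this concrete, here’s a small numerical sketch in Python (with made-up energy levels, purely for illustration) that builds a Boltzmann distribution of this form:

```python
import numpy as np

# Hypothetical energy levels for a toy three-level system (Boltzmann's constant set to 1)
E = np.array([0.0, 1.0, 2.0])
T1 = 1.5

# Boltzmann weights and the partition function Z_1
weights = np.exp(-E / T1)
Z1 = weights.sum()

# The equilibrium distribution p_i = exp(-E_i / T_1) / Z_1
p = weights / Z1

print(p.sum())  # probabilities sum to 1 (up to floating point)
print(p)        # lower-energy states are more probable
```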

Now let’s imagine we pick up our system and put it in contact with a heat bath at a different temperature, $latex T_2$, and let it come to equilibrium again. Since no work has been done, all the $latex E_i$ will be unchanged and we’ll have a new distribution $latex q$ that looks like this

$latex q_i = \frac{1}{Z_2} e^{-\beta_2 E_i},$

where $latex \beta_2 = 1/T_2$ and $latex Z_2 = \sum_i e^{-\beta_2 E_i}$.

Now we can do a bit of algebra to find the cross-entropy:

$latex H(p,q) = -\sum_i p_i \log\left(\frac{1}{Z_2} e^{-\beta_2 E_i}\right) = \beta_2 \sum_i p_i E_i + \log Z_2 = \beta_2 U_1 + \log Z_2,$

where $latex U_1 = \sum_i p_i E_i$ is the system’s expected internal energy in its initial state.
It’s a standard result from statistical mechanics that

$latex S_2 = \beta_2 U_2 + \log Z_2,$

where $latex S_2$ and $latex U_2$ are the system’s entropy and expected internal energy once it has reached equilibrium at $latex T_2$.
Solving this for $latex \log Z_2$ and substituting into the cross entropy formula we have

$latex H(p,q) = \beta_2 U_1 + S_2 - \beta_2 U_2 = S_2 - \frac{\Delta U}{T_2},$

where $latex \Delta U = U_2 - U_1$ is the change in the system’s internal energy.
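As a sanity check, here’s a small Python sketch (with made-up energy levels, purely for illustration) confirming numerically that the cross entropy between the two Boltzmann distributions equals the final entropy minus the internal-energy change divided by $latex T_2$:

```python
import numpy as np

def boltzmann(E, T):
    """Equilibrium distribution p_i proportional to exp(-E_i / T), with Boltzmann's constant set to 1."""
    w = np.exp(-E / T)
    return w / w.sum()

# Hypothetical energy levels for a toy four-level system
E = np.array([0.0, 1.0, 2.0, 3.0])
T1, T2 = 1.5, 0.5

p = boltzmann(E, T1)  # initial equilibrium at T1
q = boltzmann(E, T2)  # final equilibrium at T2

cross_entropy = -(p * np.log(q)).sum()   # H(p,q)

S2 = -(q * np.log(q)).sum()              # entropy of the final state
dU = (q * E).sum() - (p * E).sum()       # change in internal energy, U_2 - U_1

# The two sides agree to floating-point precision
print(cross_entropy, S2 - dU / T2)
```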
Physicists are, generally speaking, afraid of any quantity that has entropy units, and if they see one they like to multiply it by a temperature in order to make it look like an energy. If we multiply this by $latex T_2$ we get

$latex T_2 H(p,q) = T_2 S_2 - \Delta U.$
It’s possible that this might have a nice thermodynamic interpretation in terms of something like the maximum amount of work that we can extract from doing this transformation under a particular set of circumstances, but if it does then I haven’t seen it yet. The expression looks tantalisingly like a change in free energy ($latex \Delta U - T\Delta S$), but it’s not quite the same.

However, we can get a much more interesting result if we note that in information theory, the Kullback-Leibler divergence (aka information gain) is often seen as more fundamental than the cross entropy. The KL-divergence is defined as

$latex D_{KL}(p \| q) = \sum_i p_i \log \frac{p_i}{q_i},$
which in our case is equal to

$latex D_{KL}(p \| q) = H(p,q) - S_1 = \Delta S - \frac{\Delta U}{T_2},$

where $latex S_1 = -\sum_i p_i \log p_i$ is the system’s initial entropy and $latex \Delta S = S_2 - S_1$.
This is much more interesting than the result for the cross-entropy, because it does have a clear thermodynamic interpretation. When we put the system in contact with the second heat bath, its entropy changes by $latex \Delta S$, and the entropy of the heat bath changes by $latex -\Delta U/T_2$. (This is because entropy is heat divided by temperature: the system’s energy changes by $latex \Delta U$, so an amount of heat $latex -\Delta U$ enters the heat bath.) So the KL-divergence is just the total change in entropy after we put the system in contact with the new heat bath. I’m quite excited about this because I didn’t know it before, and I don’t think anyone else did either!
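Here’s a quick numerical check of this interpretation (again with made-up energy levels, purely for illustration): the KL-divergence between the two Boltzmann distributions matches $latex \Delta S - \Delta U/T_2$, the total entropy change of system plus heat bath:

```python
import numpy as np

def boltzmann(E, T):
    """Equilibrium distribution p_i proportional to exp(-E_i / T), with Boltzmann's constant set to 1."""
    w = np.exp(-E / T)
    return w / w.sum()

# Hypothetical energy levels for a toy four-level system
E = np.array([0.0, 1.0, 2.0, 3.0])
T1, T2 = 1.5, 0.5

p = boltzmann(E, T1)  # initial equilibrium at T1
q = boltzmann(E, T2)  # final equilibrium at T2

D_kl = (p * np.log(p / q)).sum()         # D_KL(p || q)

S1 = -(p * np.log(p)).sum()              # initial entropy of the system
S2 = -(q * np.log(q)).sum()              # final entropy of the system
dS = S2 - S1
dU = (q * E).sum() - (p * E).sum()       # change in internal energy, U_2 - U_1

# D_KL equals the total entropy change (system plus bath), and is non-negative
print(D_kl, dS - dU / T2)
```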

We can even take this a bit further. Let’s imagine putting a heat engine in between the system and the second heat reservoir, so that we extract some useful work $latex W$ from the flow of heat that takes place as the system and the heat bath equilibrate. If we do this the total change of entropy becomes $latex \Delta S + (-\Delta U - W)/T_2$. By the second law this has to be at least zero, which means that $latex W \le T_2\Delta S - \Delta U$.

Now, if we do that physicist thing of multiplying $latex D_{KL}$ by $latex T_2$, it becomes $latex T_2\Delta S – \Delta U$, which is the value for the maximum work that we just calculated. So while the thermodynamic meaning of the cross-entropy isn’t clear to me, the KL-divergence does seem to have a nice interpretation in terms of work.
