Lately I’ve been hanging out on Physics Stack Exchange, a question-and-answer site for physicists and people interested in physics. Someone asked a question recently about the relationship between thermodynamics and a quantity from information theory. It led me to quite an interesting result, which I think is new.

## Fisher on Thermodynamics and Evolution

I’ve been reading Ronald Fisher’s book The Genetical Theory of Natural Selection, which is now publicly available. I was a little surprised to find he wrote a page or two on thermodynamics and entropy in evolution, so here it is, verbatim, with a couple of comments on the numbered points. First, though, his definition, in words, of the fundamental theorem of Natural Selection:

> The rate of increase in fitness of any organism at any time is equal to its genetic variance at that time.
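
Fisher’s verbal statement is often rendered symbolically. As a hedged sketch (notation varies between treatments; this uses the Malthusian parameter $m$ as the measure of fitness):

$$\frac{d\bar{m}}{dt} = \operatorname{Var}_A(m)$$

where $\bar{m}$ is the mean fitness of the population and $\operatorname{Var}_A(m)$ is the additive genetic variance in fitness. Since a variance is never negative, the theorem implies mean fitness never decreases under selection, which is part of what invites the comparison with the second law of thermodynamics.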

with that in mind…

## Entropy is Disorder

Well, it’s not exactly, but I thought I’d be argumentative. Here is the problem with entropy and disorder as I see it, possibly somewhat different from Nathaniel’s view.

There are two things that are commonly called entropy, one of which is a special case of the other. These two types of entropy are physical/thermodynamic entropy and statistical entropy. Thermodynamic entropy is a statistical entropy applied specifically to physical microstates. As physicists generally agree on their definition of the microstates, thermodynamic entropy is a well-defined physical quantity. Statistical entropy, on the other hand, can be applied to anything for which we can define a probability measure.
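To make the distinction concrete, here is a minimal sketch of statistical (Shannon) entropy, computed for any probability distribution at all, physical or not. The function name and example distributions are mine, not from the original discussion:

```python
import math

def shannon_entropy(probs, base=2):
    """Statistical entropy H = -sum(p * log(p)) of a distribution.

    Works for *any* probability measure; only when the probabilities
    describe physical microstates (and we multiply by Boltzmann's
    constant, using the natural log) does it become thermodynamic entropy.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over 4 outcomes: maximal uncertainty.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0 (bits)

# A sharply peaked distribution: much lower entropy.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))
```

The same formula applies whether the probabilities describe gas molecules, coin flips, or letters of the alphabet; the thermodynamic case is just the one where the outcomes are microstates.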

## Entropy is Not Disorder

The second law of thermodynamics — the law of entropy — is a fascinating thing. It’s the law that makes the past different from the future; it’s the law that predicts an effective end to the Universe, yet it’s the law that makes life possible. It’s also a deeply mysterious law. It took well over a century for the true meaning of entropy to be understood (in fact, arguments on the subject still rage today), and we still don’t understand, on a cosmological level, exactly why it was so low in the past.

One of the things that’s often said about entropy is that it means “disorder”. This post is about that idea. It’s worth discussing for two reasons: firstly, it’s wrong. It’s close to the truth, in the same sort of way that a spectrum is close to a rainbow, but not the same. Secondly, the real truth is much more interesting.

## Falling Into a Black Hole, Part 1

A little while ago I read Leonard Susskind’s book The Black Hole War (subtitle: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics). It’s an interesting and mostly quite readable popular science book about the black hole information paradox. Susskind thinks that information isn’t destroyed when stuff falls into a black hole, and his book is about why.

The first part of the book has some useful thought experiments about black holes, some of which I’ll take you through below. After that it starts to talk about string theory, whereupon it becomes as utterly incomprehensible as any other book on the subject.

However, I think Susskind makes an important logical error just before he turns to string theory. I think that if you correct this error then it leads to a much more elegant resolution of the information paradox — one that doesn’t require the use of string theory. I won’t get as far as talking about that in this post, but I will point out the error I think Susskind makes, and show how resolving it leads to a simpler explanation of what happens when something passes an event horizon.