Entropy is Disorder

by Lucas Wilkins

Well, it’s not exactly, but I thought I’d be argumentative. Here is the problem with entropy and disorder as I see it, possibly somewhat different from Nathaniel’s.

There are two things that are commonly called entropy, one of which is a special case of the other. These two types of entropy are physical/thermodynamic entropy and statistical entropy. Thermodynamic entropy is a statistical entropy applied specifically to physical microstates. As physicists generally agree on their definition of the microstates, thermodynamic entropy is a well-defined physical quantity. Statistical entropy, on the other hand, can be applied to anything for which we can define a probability measure.
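To fix the notation (this is the standard definition, nothing particular to this post): given a probability distribution assigning probability p_i to outcome i, the statistical entropy is

-\sum_i p_i \log p_i,

and thermodynamic entropy is the special case in which the p_i are probabilities of physical microstates, with Boltzmann’s constant setting the units: S = -k_B \sum_i p_i \ln p_i.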

Now let’s look at disorder. What’s that about? First of all, disorder is subjective:

Introducing Mister A. Mister A has at hand an equation, the logistic equation, a chaotic map. He can use this to produce what could be seen as a sequence of random numbers, but he knows exactly how the sequence is produced. To him, it is completely ordered (satisfying x_{n+k} = r_i x_n(1-x_n) for some reals r_i in [0,4] and integer k, starting with x_1, …, x_k). Mister A has a friend, Mister B, who is blissfully unaware of the logistic map or of Mister A’s use of it. When Mister B is shown this list of numbers he sees no pattern. And why should he? He sees a string of numbers that seems completely random. Mister B can even try various mathematical transformations, but unless he hits on exactly the conditions that Mister A used to make the sequence, it will appear forever random.
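For concreteness, here is a minimal sketch of how Mister A might generate such a sequence; the function name and the particular r_i and seed values are mine, chosen purely for illustration.

```python
def mister_a_sequence(rs, seeds, length):
    """Interleave k logistic maps: x[n+k] = r_i * x[n] * (1 - x[n]),
    where i = n mod k picks the parameter for that strand."""
    k = len(rs)
    xs = list(seeds)                 # the starting values x_1 ... x_k
    out = []
    for n in range(length):
        i = n % k                    # which interleaved strand we are on
        out.append(xs[i])
        xs[i] = rs[i] * xs[i] * (1 - xs[i])   # advance that strand
    return out

# Three strands in the chaotic regime (r close to 4):
seq = mister_a_sequence(rs=[3.99, 3.97, 3.91], seeds=[0.2, 0.5, 0.7], length=12)
print(seq)   # looks random to Mister B; fully determined for Mister A
```

To Mister A every term follows from an earlier one by a one-line rule; Mister B, handed only the output, sees noise.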

To Mister A, the sequence is completely ordered. To Mister B, the sequence is completely disordered. The conclusion we can draw from this is: when we talk about disorder we have to specify a particular point of view.

Thermodynamic entropy has an inherent point of view: the physical microstates of the system being studied. This point of view is, of course, completely unintuitive to all but a few (physicists). This microscopic point of view is not the macroscopic point of view where one might see the phase boundaries Nathaniel mentioned in his post on this topic (he discusses the disorder of an emulsion and a layered oil/water mixture). However, this macroscopic level of description is completely reasonable, even though thermodynamic entropy isn’t the correct choice for a correlate of disorder there.

We can still come up with some other measure of disorder for the macroscopic system, and it can be an entropy too. We could take the emulsified/separated system and split it up into macroscopic voxels smaller than the droplets, and, over a small but not too small period of time, measure the probability of each one being mainly full of water or mainly full of oil. We could then find the statistical entropy of this. This entropy would decrease with time, contrary to what one might be led to believe from the second law of thermodynamics. But thermodynamic entropy is the only type of entropy that one should expect to obey the second law, and this isn’t it.
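As a sketch of what that measurement might look like in practice (the array shape, the time window, and the “mainly oil” threshold are my assumptions, not anything specified above):

```python
import numpy as np

def voxel_entropy(frames):
    """Mean statistical entropy (in bits) per voxel.

    `frames` is a hypothetical array of snapshots taken over a small
    (but not too small) time window, shape (T, X, Y, Z), holding the
    oil fraction of each voxel; a voxel counts as 'mainly oil' when
    its fraction exceeds 0.5.
    """
    p_oil = (frames > 0.5).mean(axis=0)     # per-voxel P(mainly oil)
    p = np.stack([p_oil, 1.0 - p_oil])      # [P(oil), P(water)] per voxel
    safe = np.where(p > 0, p, 1.0)          # avoid log(0); those terms are 0
    terms = -p * np.log2(safe)
    return terms.sum(axis=0).mean()         # average over all voxels
```

While emulsified, droplets drift through each voxel and its label keeps flipping, pushing the per-voxel probabilities towards 1/2 and the entropy towards 1 bit; once the layers separate, every voxel is deterministically oil or water and the entropy falls to 0, which is exactly the decrease described above.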

So I would like to say something slightly different and far less polemic than the title of this post:

Thermodynamic entropy is only disorder if you are a physicist with a physicist’s conception of disorder. But, whatever your concept of disorder is, there is probably a statistical entropy which corresponds to your own particular version of it; just don’t expect your entropy to increase with time.

5 Responses to “Entropy is Disorder”

  1. I agree of course. I was talking specifically about the thermodynamic entropy – that post is one of a series of posts on the subject of thermodynamic entropy, and I haven’t really covered its statistical nature yet, although I did allude to it in that last post. Not for the first time I find myself wishing these two separate but related concepts (thermodynamic and statistical entropy) hadn’t been given the same name.

    But, fyi, it’s easier than you think for Mr. B to find the pattern in Mr. A’s numbers. I think that deserves a post of its own…
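    To illustrate why (this sketch is mine, not the promised post): if Mr. B merely guesses that the numbers come from interleaved logistic maps, each parameter falls out algebraically, since x_{n+k} = r_i x_n(1-x_n) can be solved for r_i as x_{n+k}/(x_n(1-x_n)). No search over real-valued r is needed, only over small integer strides k:

    ```python
    def recover_logistic(xs, max_k=10, tol=1e-9):
        """For each stride k, check whether r = x[n+k] / (x[n]*(1-x[n]))
        is constant along every strand; if so, the rule is recovered.
        Assumes the values stay strictly inside (0, 1)."""
        for k in range(1, max_k + 1):
            rs = []
            for i in range(k):
                strand = xs[i::k]
                ests = [b / (a * (1 - a)) for a, b in zip(strand, strand[1:])]
                if not ests or max(ests) - min(ests) > tol:
                    break                # not logistic at this stride
                rs.append(ests[0])
            else:
                return k, rs             # a consistent stride and its r_i's
        return None                      # no stride up to max_k fits
    ```

    When the data really were generated this way, the estimates agree up to floating-point round-off, so a generous tolerance suffices.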

  2. Claude Shannon tells a story that when he was working on his seminal paper on communication theory, he didn’t know what to call the quantity
    -\sum_i p_i \log p_i.
    He considered the name “uncertainty” (which is what it is, according to his derivation) but then he spoke to John von Neumann about it, who said

    You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.

    You can see where von Neumann was coming from there, but it was really a mistake in my opinion. If Shannon had called the quantity uncertainty instead of entropy, then we wouldn’t have this annoying confusion between thermodynamic and statistical entropy all the time. But more than that, people would be going around saying “entropy is uncertainty” instead of “entropy is disorder”. The former has the massive advantage of actually being true; it just isn’t widely enough understood.

  3. Also, I would correct the first sentence of your conclusion to read

    Thermodynamic entropy is only disorder if:

    1. You are a physicist with a physicist’s conception of disorder, AND
    2. You are a physicist with a physicist’s conception of entropy, AND
    3. You are studying an ideal gas rather than any other system.
