Posts tagged ‘entropy’

April 21, 2012

An interesting relationship between physics and information theory

by Nathaniel Virgo

Lately I’ve been hanging out on Physics Stack Exchange, a question-and-answer site for physicists and people interested in physics. Someone recently asked a question about the relationship between thermodynamics and a quantity from information theory. It led me to quite an interesting result, which I think is new.

November 23, 2011

Question

by Nathaniel Virgo

It seems like there should be a word that goes in the bottom-right here:

Energy    Power
Entropy   ?????

However, as far as I’m aware no such word exists, so we’ll have to make one up.  Does anyone have any good ideas?

To be clear, what I’m after is a general term for any quantity whose units are entropy units per unit time, i.e. $\mathrm{J\,K^{-1}\,s^{-1}}$ or $\text{bits}\cdot\text{s}^{-1}$. The term “entropy production” is currently in use for the rate at which systems create entropy, but I want a word that can also refer to the rate at which systems extract negative entropy from their surroundings. (You can have a power loss as well as a power gain.)

The only thing I can think of is “empowerment”, which sort of makes sense but is icky.

July 18, 2011

What is life? (and why the answer doesn’t matter)

by Nathaniel Virgo

What is life?  Some people will say it’s obvious: life is reproduction.  But I may never choose to reproduce, and a worker ant couldn’t if it wanted to – does that make us dead?

Others will say life is evolution.  But on closer inspection, that doesn’t really stand up either.  Evolution is easy enough to implement on a computer.  You just store a bunch of random bit strings in memory, evaluate them according to some “fitness function”, and then “mutate” and “recombine” the best ones to produce a new generation.  By iterating this process you get what’s called a “genetic algorithm”, and this can be used to design robot controllers and all sorts of other things.  These things evolve, but are they alive?  Some might say yes, but anyone with any experience in genetic algorithms will say no.
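For concreteness, here is a minimal sketch of such an algorithm in Python. Every name and number in it (the one-max fitness function, the population size, the mutation rate) is an arbitrary toy choice for illustration, not anything from a specific GA library:

```python
import random

# A minimal genetic algorithm of the kind described above: random bit
# strings, a toy fitness function, selection, mutation and crossover.
GENOME_LEN = 32
POP_SIZE = 50
GENERATIONS = 100
MUTATION_RATE = 0.01

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def fitness(genome):
    return sum(genome)  # "one-max": fitness is just the number of 1s

def mutate(genome):
    return [1 - b if random.random() < MUTATION_RATE else b for b in genome]

def recombine(a, b):
    cut = random.randrange(1, GENOME_LEN)  # single-point crossover
    return a[:cut] + b[cut:]

population = [random_genome() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)   # evaluate
    parents = population[:POP_SIZE // 2]         # select the fitter half
    population = [mutate(recombine(random.choice(parents),
                                   random.choice(parents)))
                  for _ in range(POP_SIZE)]      # breed a new generation

print("best fitness:", max(fitness(g) for g in population))
```

Run it and the best fitness climbs towards 32 within a few generations: the bit strings evolve, but it’s hard to feel that they’re alive.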

June 22, 2011

Fisher on Thermodynamics and Evolution

by Lucas Wilkins

I’ve been reading Ronald Fisher’s book The Genetical Theory of Natural Selection, which is now publicly available. I was a little surprised to find that he wrote a page or two on thermodynamics and entropy in evolution, so here it is, verbatim, with a couple of comments on the numbered points. First, though, his definition, in words, of the fundamental theorem of Natural Selection:

The rate of increase in fitness of any organism at any time is equal to its genetic variance in fitness at that time.
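In modern notation (my gloss, not Fisher’s wording), this is usually written as $\frac{d\bar{w}}{dt} = \mathrm{Var}_A(w)$: the rate of increase of the mean fitness $\bar{w}$ equals the additive genetic variance in fitness, $\mathrm{Var}_A(w)$.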

with that in mind…

May 9, 2011

Is the second law of thermodynamics connected to the expansion of the universe?

by Nathaniel Virgo

This is just a bit of idle wondering, another little bit of amateur cosmology from someone who should probably know better. The question I’m asking myself today is, is the second law of thermodynamics connected to the expansion of the universe?

April 30, 2011

Powers of 2

by Nathaniel Virgo

The relationship between probability and information is interesting and fun.  The table below is a work in progress, but I think it’s kind of cool already.  The idea is to compare information quantities with the probabilities of unlikely events.  For instance, if I flipped a coin for every bit on my hard drive and they all came up heads, it would be pretty unlikely.  But what else would be that improbable?
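As a sketch of the arithmetic the table rests on: an event with probability $p$ carries $-\log_2 p$ bits of information, so $N$ fair coins all landing heads is exactly as improbable as guessing any particular $N$-bit string. In Python (the lottery odds are the standard 6-from-49 jackpot; the other numbers are round figures chosen for illustration):

```python
import math

def bits(p):
    """Information content (surprisal) of an event with probability p, in bits."""
    return -math.log2(p)

print(bits(0.5))          # one coin landing heads: 1.0 bit
print(bits(1/13983816))   # a 6-from-49 lottery jackpot: ~23.7 bits
print(2.0 ** -40)         # and back again: 40 heads in a row, p ~ 9.1e-13
```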

April 18, 2011

Entropy is Disorder

by Lucas Wilkins

Well, it’s not exactly, but I thought I’d be argumentative. Here is the problem with entropy and disorder as I see it, which is possibly somewhat different from Nathaniel’s view.

There are two things that are commonly called entropy, one of which is a special case of the other: physical/thermodynamic entropy and statistical entropy. Thermodynamic entropy is statistical entropy applied specifically to physical microstates. Since physicists generally agree on the definition of the microstates, thermodynamic entropy is a well-defined physical quantity. Statistical entropy, on the other hand, can be applied to anything for which we can define a probability measure.
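In symbols (my notation, not anything from the post): the statistical entropy of a probability distribution $p$ over outcomes $i$ is

$H[p] = -\sum_i p_i \log p_i,$

and thermodynamic entropy is the special case where the $p_i$ range over physical microstates, with Boltzmann’s constant fixing the units:

$S = -k_B \sum_i p_i \ln p_i.$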

April 15, 2011

Entropy is Not Disorder

by Nathaniel Virgo

The second law of thermodynamics — the law of entropy — is a fascinating thing.  It’s the law that makes the past different from the future; it’s the law that predicts an effective end to the Universe; yet it’s the law that makes life possible.  It’s also a deeply mysterious law.  It took well over a century for the true meaning of entropy to be understood (in fact, arguments on the subject still rage today), and we still don’t understand, on a cosmological level, exactly why it was so low in the past.

One of the things that’s often said about entropy is that it means “disorder”.  This post is about that idea.  It’s worth discussing for two reasons: firstly, it’s wrong.  It’s close to the truth, in the same sort of way that a spectrum is close to a rainbow, but not the same.  Secondly, the real truth is much more interesting.
