Posts tagged ‘information’

June 1, 2013

Friston’s Free Energy for Dummies

by Lucas Wilkins

People always want an explanation of Friston’s Free Energy that doesn’t have any maths. This is quite a challenge, but I hope I have managed to produce something comprehensible.

This is basically a summary of Friston’s Entropy paper (available here). A friend of jellymatter was instrumental in its production, and for this reason I am fairly confident that my summary is going in the right direction, even if I have not emphasised exactly the same things as Friston.

I’ve made a point of writing this without any maths, and I have highlighted what I consider to be the main assumptions of the paper, marking each of them with a P.

July 1, 2012

Visualizing the mutual information and an introduction to information geometry

by Lucas Wilkins

For a while now I have had an interest in information geometry. The maxims that geometry is intuitive maths and information theory is intuitive statistics seem pretty fair to me, so it’s quite surprising to find a lack of easy-to-understand introductions to information geometry. This is my first attempt: the idea is to get a geometric understanding of the mutual information and to introduce a few select concepts from information geometry.
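To pin down the quantity being visualised, here is a minimal sketch of my own (not from the post) that computes the mutual information directly from a small joint distribution; the particular 2×2 distribution p_xy is just an illustrative assumption.

```python
import numpy as np

# A hypothetical 2x2 joint distribution p(x, y) -- purely illustrative.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ), in bits.
mi = sum(
    p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
    for i in range(2) for j in range(2)
    if p_xy[i, j] > 0
)
print(f"I(X;Y) = {mi:.3f} bits")  # ~0.278 bits for this distribution
```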

April 21, 2012

An interesting relationship between physics and information theory

by Nathaniel Virgo

Lately I’ve been hanging out on Physics Stack Exchange, a question-and-answer site for physicists and people interested in physics. Someone asked a question recently about the relationship between thermodynamics and a quantity from information theory. It led me to quite an interesting result, which I think is new.

December 8, 2011

Poll discussion: The Monty Hall Controversy

by Lucas Wilkins

The latest Jellymatter poll has been up for a while now, so it’s time to discuss what the correct solution is. As well as sounding like a question from a Voight-Kampff test, it is a “double trick question” based on the Monty Hall problem. It was a little mean of me to post it with my own agenda in mind.

For me, the interesting thing about the Monty Hall problem is the vehemence of those who argue for the “switch” option. The argument is nearly always unjustified. Whilst arguing this I will talk about how the problem has been stated in the past: its history shows how quickly someone’s brief, informal argument can turn into an unintuitive answer to an ill-posed question, and then into a dogmatic belief.

December 6, 2011

Poll: Cups and a pea

by Lucas Wilkins

You’re walking down a back alley and find a man with the archetypal three cups and pea. You decide to gamble with him in a game of ‘guess where the pea is’; after all the odds are reasonable and he has assured you that he will demonstrate that at the pea is under one of the cups. He places the pea under one of the cups and shuffles them rapidly and you choose one of the cups. At this point the man overturns one of the cups you did not choose – there is no pea underneath it. He then asks you whether you would like to choose the other upright cup instead…
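For anyone who wants to experiment before reading the discussion above, here is a rough simulation sketch (my own, not part of the poll) contrasting two assumptions about the man’s behaviour: one where he knowingly always overturns an empty unchosen cup, and one where he overturns a random unchosen cup and merely happens to reveal no pea. The function name play and the trial counts are invented for illustration.

```python
import random

def play(n_trials, knowing_host):
    """Estimate P(win by switching), conditioned on an empty cup
    having been revealed, under the given host model."""
    switch_wins = 0
    valid = 0
    for _ in range(n_trials):
        pea = random.randrange(3)        # cup hiding the pea
        choice = random.randrange(3)     # your initial pick
        others = [c for c in range(3) if c != choice]
        if knowing_host:
            # He deliberately overturns an empty, unchosen cup.
            revealed = random.choice([c for c in others if c != pea])
        else:
            # He overturns a random unchosen cup; discard the trials
            # where the pea is accidentally revealed.
            revealed = random.choice(others)
            if revealed == pea:
                continue
        valid += 1
        remaining = next(c for c in others if c != revealed)
        if remaining == pea:
            switch_wins += 1
    return switch_wins / valid

print("knowing host: ", play(100_000, True))    # ~2/3
print("ignorant host:", play(100_000, False))   # ~1/2
```

The gap between those two numbers is exactly why the wording of the question matters.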

November 23, 2011

Question

by Nathaniel Virgo

It seems like there should be a word that goes in the bottom-right here:

Energy    Power
Entropy   ?????

However, as far as I’m aware no such word exists, so we’ll have to make one up.  Does anyone have any good ideas?

To be clear, what I’m after is a general term for any quantity whose units are entropy-units-per-time-unit, i.e. \mathrm{J\,K^{-1}\,s^{-1}} or \text{bits}\cdot\mathrm{s}^{-1}. The term “entropy production” is currently in use for the rate at which systems create entropy, but I want a word that can also refer to the rate at which systems extract negative entropy from their surroundings. (You can have a power loss as well as a power gain.)
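As a concrete example of the kind of quantity in question (my own illustration, not from the original post): a device dissipating power P as heat into surroundings at temperature T raises the entropy of those surroundings at a rate

\[
\frac{dS}{dt} = \frac{P}{T}, \qquad
\left[\frac{dS}{dt}\right] = \mathrm{W\,K^{-1}} = \mathrm{J\,K^{-1}\,s^{-1}},
\]

and dividing the same rate by k_B \ln 2 expresses it in \text{bits}\cdot\mathrm{s}^{-1}.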

The only thing I can think of is “empowerment”, which sort-of makes sense but is icky.

July 17, 2011

Reification: just because a thing has a name, doesn’t mean it is a thing

by James Thorniley

Science does not rely on investigators being unbiased “automatons.” Instead, it relies on methods that limit the ability of the investigator’s admittedly inevitable biases to skew the results.

So says a paper by J. E. Lewis et al. in which they claim Stephen Jay Gould was wrong when he said early 19th century craniometrist Samuel George Morton “finagled” his data to match his own racist preconceptions. They had another look at the data, remeasured some of Morton’s skulls, and claim that Morton’s reported results actually fit his racial bias less than a fully accurate study would have.

Depressingly, a number of modern-day internet racists seem to have picked up on the headline message “Gould was wrong” and assumed that it means the paper supports racial theories about intelligence or other differences. The paper doesn’t support any such ideas, and that’s not the subject of this post. It’s just worth pointing that out.

What this paper is about is whether scientists’ personal biases influence the results they get. This isn’t about whether Morton was “right” in a scientific sense, because everyone agrees he wasn’t. It’s about whether he drew the right conclusions based on the evidence available to him. It’s a historical question – modern anthropology has essentially nothing to do with this.

April 30, 2011

Powers of 2

by Nathaniel Virgo

The relationship between probability and information is interesting and fun.  The table below is a work in progress, but I think it’s kind of cool already.  The idea is to compare information quantities with the probabilities of unlikely events.  For instance, if I flipped a coin for every bit on my hard drive and they all came up heads, it would be pretty unlikely.  But what else would be that improbable?
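As a rough sketch of the arithmetic behind a table like this (my own illustration; the 500 GB drive size is a made-up figure), the probability of n fair coins all landing heads is 2^{-n}, and conversely any event with probability p is “worth” \log_2(1/p) bits of surprise:

```python
import math

drive_bits = 500 * 10**9 * 8   # hypothetical 500 GB drive, in bits

# Probability that one fair coin flip per bit all come up heads is
# 2^(-n) -- far too small for a float, so work with its log10.
log10_p = -drive_bits * math.log10(2)
print(f"P(all heads) = 10^({log10_p:.3g})")

# The reverse direction: surprisal in bits of an event of probability p.
def surprisal_bits(p):
    return -math.log2(p)

print(surprisal_bits(2**-20))     # a roughly one-in-a-million event: 20 bits
print(surprisal_bits(1 / 14e6))   # roughly the odds of a lottery jackpot: ~23.7 bits
```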

April 18, 2011

Entropy is Disorder

by Lucas Wilkins

Well, it’s not exactly, but I thought I’d be argumentative. Here is the problem with entropy and disorder as I see it, which is possibly somewhat different to Nathaniel’s view.

There are two things that are commonly called entropy, one of which is a specific case of the other. These two types of entropy are physical/thermodynamic entropy and statistical entropy. Thermodynamic entropy is a statistical entropy applied specifically to physical microstates. As physicists generally agree on their definition of the microstates, thermodynamic entropy is a well defined physical quantity. Statistical entropy, on the other hand, can be applied to anything for which we can define a probability measure.
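For concreteness, here is a small sketch of my own (not from the post): statistical entropy is -\sum p \log p for any probability distribution, and the thermodynamic (Gibbs) entropy is the same formula applied to microstate probabilities, in natural-log units, multiplied by Boltzmann’s constant.

```python
import math

def statistical_entropy(probs, base=2):
    """Shannon entropy -sum p*log(p) of any probability distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Statistical entropy applies to any probability measure, e.g. a biased coin.
print(statistical_entropy([0.9, 0.1]))   # ~0.469 bits

# Thermodynamic entropy: the same formula over microstate probabilities,
# in nats, scaled by Boltzmann's constant.
k_B = 1.380649e-23                        # J/K
microstates = [0.25, 0.25, 0.25, 0.25]    # toy system, 4 equiprobable microstates
print(k_B * statistical_entropy(microstates, base=math.e))   # k_B * ln(4), in J/K
```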

April 17, 2011

A scientist modelling a scientist modelling science

by Lucas Wilkins

This is a follow-up to Nathaniel’s post. One of the ways that the probabilities of probabilities can be used is in asking what experiments would be best for a scientist to do. We can do this because scientists would like to have a logically consistent system that describes the world, but make measurements which are not completely certain – so the interpretation of probability as uncertain logic is justified.

Let’s make a probabilistic model of scientific inquiry. To do this, the first component we need is a model of “what science knows”, or equally, “what the literature says”. For the purposes here, I will only consider what science knows about one statement: “The literature says X is true”. I’ll write this as p(X|L) and its negation as p(\bar{X}|L) = 1 - p(X|L). This is a really minimal example.
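To make this concrete, here is a minimal sketch of my own (the experiment likelihoods are invented): the literature’s knowledge about X is the single number p(X|L), and one natural score for a proposed experiment is the expected reduction in the entropy of that belief, i.e. the mutual information between the experiment’s outcome and X.

```python
import math

def entropy(p):
    """Entropy (bits) of a binary belief: p(X|L) versus 1 - p(X|L)."""
    return -sum(q * math.log2(q) for q in (p, 1 - p) if q > 0)

def expected_information_gain(p_x, p_pos_given_x, p_pos_given_notx):
    """Expected entropy reduction about X from one binary-outcome experiment.

    p_x              -- current belief p(X|L)
    p_pos_given_x    -- P(positive result | X true)   (assumed known)
    p_pos_given_notx -- P(positive result | X false)  (assumed known)
    """
    p_pos = p_x * p_pos_given_x + (1 - p_x) * p_pos_given_notx
    gain = entropy(p_x)
    for p_outcome, lik_x in ((p_pos, p_pos_given_x),
                             (1 - p_pos, 1 - p_pos_given_x)):
        if p_outcome == 0:
            continue
        posterior = p_x * lik_x / p_outcome   # Bayes' rule
        gain -= p_outcome * entropy(posterior)
    return gain

# With the literature on the fence (p(X|L) = 0.5), a reliable experiment
# is worth about half a bit; a noisy one is worth almost nothing.
print(expected_information_gain(0.5, 0.9, 0.1))   # ~0.53 bits
print(expected_information_gain(0.5, 0.6, 0.4))   # ~0.03 bits
```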
