September 25, 2012
This post is about a classic probability puzzle. It goes something like this: I place two envelopes on the table in front of you. One of them contains a Prize, which is an amount of money in pounds, but you don’t know how much it is. The other one contains a Special Bonus Prize, which is worth exactly twice as much money as the Prize. It’s your lucky day — but you can only choose one envelope. Which do you choose?
“Well,” you say to yourself, “it doesn’t matter, they’re both the same,” so you pick one at random. Let’s say it’s the one on the left. But now I ask you if you want to change your mind.
“Well,” you might say to yourself, “let x be the amount of money in the envelope I’m holding. This envelope has a 50% chance of being the Prize, in which case the other envelope contains 2x. On the other hand, there’s a 50% chance that this is the Special Bonus Prize, in which case the other envelope contains 0.5x. But still, the expected value of the other envelope is 0.5*2x + 0.5*0.5x = 1.25x. So on the balance of probabilities I should definitely switch.” But then I offer to let you switch again, and again, and again, and every time you go through the same reasoning, never managing to settle on a particular envelope because each one seems like it should contain more money than the other. Clearly something is wrong with this reasoning, but what is it?
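Before getting to where the reasoning breaks, it is worth checking what actually happens if you play this game many times. A minimal simulation (assuming, purely for illustration, a fixed Prize of £100 — the puzzle itself leaves the amount unknown):

```python
import random

def play(switch, prize=100.0, trials=100_000):
    """Average winnings for one strategy over many plays."""
    total = 0.0
    for _ in range(trials):
        # One envelope holds the Prize, the other the Special Bonus Prize (2x)
        envelopes = [prize, 2 * prize]
        random.shuffle(envelopes)
        chosen, other = envelopes  # pick one at random
        total += other if switch else chosen
    return total / trials
```

Both strategies average the same £150 (the mean of £100 and £200); the promised 1.25x gain never materialises, which is the first hint that the argument above is misusing x.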
In this post, I’ll solve this problem in what I consider to be the proper Bayesian way, pinpointing exactly where the problem is. You might want to think about the question for a bit and come up with your own idea of its solution before reading on.
read more »
December 8, 2011
The latest Jellymatter poll has been up for a while now, time to discuss what the correct solution is. As well as sounding like a question from a Voight-Kampff test, it is a “double trick question”, based on the Monty Hall problem. It was a little mean of me to post it with my own agenda in mind.
For me, the interesting thing about the Monty Hall problem is the vehemence of those who argue for the “switch” option. The argument is nearly always unjustified. Whilst arguing this I will talk about how the problem has been stated in the past: its history shows how quickly someone’s brief, informal argument can turn into an unintuitive answer to an ill-posed question, and then into a dogmatic belief.
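For reference, the standard, fully-specified version of the problem — where the host knows where the car is and always opens a goat door — can be simulated directly. A sketch; note that these host assumptions are precisely what informal statements of the problem tend to leave out:

```python
import random

def monty(switch, trials=100_000):
    """Win rate over many games, under the standard host assumptions."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the pick nor the car
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials
```

Under these assumptions switching wins about 2/3 of the time; change the host's behaviour and the answer changes with it.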
read more »
September 22, 2011
A NASA research satellite is spinning out of control and is due to crash-land today. The satellite, which weighs 20,000 tonnes and could easily be mistaken for a small moon, is expected to explode in a deadly fireball of fiery death, engulfing an area of at least 500 square kilometers. Its malfunctioning weapon systems were designed to target cities, and NASA expects it to obliterate one with an expected population of just over two million. Unfortunately, because of the battle station’s unpredictable trajectory, they won’t know which population centre is doomed until about two hours before it hits, leaving precious little time to evacuate the area. A NASA spokesperson was unavailable for comment on why a research satellite needs to be so big, how such a gigantic object was launched into Earth’s orbit, or why its trajectory is biased towards built-up areas. However, they did release an estimate that your personal probability of being one of the millions who perish in this impending disaster is 1 in 3200…
…or at least, that’s what I was able to deduce from a news report on (Australian) telly the other day, which quoted the 1 in 3200 figure and then ran a whole segment on how much more likely you are to be hit by this satellite than to be struck by lightning, win the lottery, etc.
Of course, in reality, the 1 in 3200 figure is the probability of the debris from the falling spacecraft hitting some person, somewhere in the world, i.e. there’s a 3199 in 3200 chance that it will just plop into the ocean or crash in some unpopulated area and not hit anyone at all. Since there are seven billion people in the world, your chance of being the one person who does get hit is 1 in 22 trillion. You’re more likely to win the lottery and get struck by lightning than you are to be struck by debris from this particular piece of falling space hardware. All of which was completely obvious to me the moment the news anchor quoted the figure – with just a tiny bit of thought it should have been obvious to them as well.
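The arithmetic behind that figure, spelled out (the population number is the rough 2011 estimate used above, and the calculation assumes every person is equally likely to be the unlucky one):

```python
p_hits_someone = 1 / 3200   # NASA's figure: debris hits *some* person
world_population = 7e9      # roughly seven billion

# Your personal chance, assuming you're as likely a target as anyone else
p_hits_you = p_hits_someone / world_population
odds_against = 1 / p_hits_you  # about 22 trillion to one
```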
April 30, 2011
The relationship between probability and information is interesting and fun. The table below is a work in progress, but I think it’s kind of cool already. The idea is to compare information quantities with the probabilities of unlikely events. For instance, if I flipped a coin for every bit on my hard drive and they all came up heads, it would be pretty unlikely. But what else would be that improbable?
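The conversion the table relies on is the standard one: an event with probability p carries -log2(p) bits of surprisal, so n fair coin flips all coming up heads is exactly an n-bit event. A small helper to make the correspondence concrete:

```python
import math

def surprisal_bits(p):
    # Information content (surprisal) of an event with probability p, in bits
    return -math.log2(p)

# 20 heads in a row from a fair coin is exactly a 20-bit event:
assert surprisal_bits(0.5 ** 20) == 20.0

# So all ~8e12 bits of a 1 TB drive coming up heads would be an
# 8-trillion-bit event, far beyond any everyday improbability.
```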
read more »
April 17, 2011
This is a follow-up to Nathaniel’s post. One way that the probabilities of probabilities can be used is in asking which experiments would be best for a scientist to do. We can do this because scientists would like a logically consistent system that describes the world, but must make measurements that are not completely certain – so the interpretation of probability as uncertain logic is justified.
Let’s make a probabilistic model of scientific inquiry. To do this, the first component we need is a model of “what science knows”, or equally, “what the literature says”. For the purposes here, I will only consider what science knows about one statement: “The literature says X is true”. I’ll write this as L and its negation as ¬L. This is a really minimal example.
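Since the post is truncated here, the following is only a generic sketch of the kind of machinery involved: Bayes’ rule for updating belief in a single binary statement given an imperfect measurement. The 90%-reliable instrument is an invented example, not the post’s actual model.

```python
def update(prior, p_data_if_true, p_data_if_false):
    # Bayes' rule: P(X | data) = P(data | X) P(X) / P(data)
    numerator = p_data_if_true * prior
    return numerator / (numerator + p_data_if_false * (1 - prior))

# A measurement that reports correctly 90% of the time says "X is true",
# starting from an even prior:
posterior = update(prior=0.5, p_data_if_true=0.9, p_data_if_false=0.1)
```

Starting from 50/50, one such measurement lifts the belief in X to 0.9; repeated independent measurements would push it further.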
read more »