Posts tagged ‘probability’

June 24, 2013

Measure Theoretic Probability for Dummies: Part I

by Lucas Wilkins

Nothing makes me empathise more with those struggling with probability theory than reading things like this on Wikipedia:

Let (Ω, F, P) be a measure space with P(Ω)=1. Then (Ω, F, P) is a probability space, with sample space Ω, event space F and probability measure P.

This is written so that only people who already know what it is saying can understand it. The only possible value of this sentence would be to someone who managed to study measure theory without being exposed to its most widespread application; in other words: no one! Whilst the attitude that this, and soooo many other Wikipedia pages, displays encourages people to be precise in a way that mathematicians cherish, it also alienates a lot of perfectly capable, intelligent people who simply run out of patience in the face of the relentless influx of oblique statements.

Personally, I think that understanding probability spaces is very important, but for reasons including those I mention above, most people find the measure theoretic formalisation daunting. Here I have tried to outline the most widely used formalisation, which has turned out to be far more work than I expected…
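For concreteness, here is that jargon unpacked on the smallest interesting example, a single fair coin flip. This is my own toy sketch, not from the post itself:

```python
from itertools import combinations

# A toy probability space (Ω, F, P) for one fair coin flip.
omega = frozenset({"H", "T"})  # sample space Ω: the possible outcomes

def powerset(s):
    """All subsets of s: here, every possible event."""
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

F = powerset(omega)                      # event space F: all subsets of Ω
P = {A: len(A) / len(omega) for A in F}  # probability measure P (uniform)

# The defining requirement from the quoted definition: P(Ω) = 1.
assert P[omega] == 1.0
```

An "event" like {"H"} gets probability 0.5, and the impossible event (the empty set) gets 0 — the definition is just bookkeeping for exactly this kind of structure.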

September 25, 2012

Paradoxes of probability theory: the two envelopes

by Nathaniel Virgo

This post is about a classic probability puzzle. It goes something like this: I place two envelopes on the table in front of you. One of them contains a Prize, which is an amount of money in pounds, but you don’t know how much it is. The other one contains a Special Bonus Prize, which is worth exactly twice as much money as the Prize. It’s your lucky day — but you can only choose one envelope. Which do you choose?

“Well,” you say to yourself, “it doesn’t matter, they’re both the same,” so you pick one at random. Let’s say it’s the one on the left. But now I ask you if you want to change your mind.

“Well,” you might say to yourself, “let x be the amount of money in the envelope I’m holding. This envelope has a 50% chance of being the Prize, in which case the other envelope contains 2x. On the other hand, there’s a 50% chance that this is the Special Bonus Prize, in which case the other envelope contains 0.5x. But still, the expected value of the other envelope is 0.5*2x + 0.5*0.5x = 1.25x. So on the balance of probabilities I should definitely switch.” But then I offer to let you switch again, and again, and again, and every time you go through the same reasoning, never managing to settle on a particular envelope because each one seems like it should contain more money than the other.  Clearly something is wrong with this reasoning, but what is it?

In this post, I’ll solve this problem in what I consider to be the proper Bayesian way, pinpointing exactly where the problem is.  You might want to think about the question for a bit and come up with your own idea of its solution before reading on.

December 8, 2011

Poll discussion: The Monty Hall Controversy

by Lucas Wilkins

The latest Jellymatter poll has been up for a while now, so it's time to discuss what the correct solution is. As well as sounding like a question from a Voight-Kampff test, it is a “double trick question”, based on the Monty Hall problem. It was a little mean of me to post it with my own agenda in mind.

For me, the interesting thing about the Monty Hall problem is the vehemence of those who argue for the “switch” option. The argument is nearly always unjustified. Whilst arguing this I will talk about how the problem has been stated in the past: its history shows how quickly someone's brief, informal argument can change into an unintuitive answer to an ill-posed question, and then into a dogmatic belief.

September 22, 2011

Giganto-Satellite to Crush City

by Nathaniel Virgo

A NASA research satellite is spinning out of control and is due to crash-land today.  The satellite, which weighs 20,000 tonnes and could easily be mistaken for a small moon, is expected to explode in a deadly fireball of fiery death, engulfing an area of at least 500 square kilometres.  Its malfunctioning weapon systems were designed to target cities, and NASA expects it to obliterate one with an expected population of just over two million.  Unfortunately, because of the battle station’s unpredictable trajectory, they won’t know which population centre is doomed until about two hours before it hits, leaving precious little time to evacuate the area.  A NASA spokesperson was unavailable for comment on why a research satellite needs to be so big, how such a gigantic object was launched into Earth’s orbit, or why its trajectory is biased towards built-up areas.  However, they did release an estimate that your personal probability of being one of the millions who perish in this impending disaster is 1 in 3200…

…or at least, that’s what I was able to deduce from a news report on (Australian) telly the other day, which quoted the 1 in 3200 figure and then ran a whole segment on how much more likely you are to be hit by this satellite than to be struck by lightning, win the lottery, etc.

Of course, in reality, the 1 in 3200 figure is the probability of the debris from the falling spacecraft hitting some person, somewhere in the world, i.e. there’s a 3199 in 3200 chance that it will just plop into the ocean or crash in some unpopulated area and not hit anyone at all.  Since there are seven billion people in the world, your chance of being the one person who does get hit is 1 in 22 trillion.  You’re more likely to win the lottery and get struck by lightning than you are to be struck by debris from this particular piece of falling space hardware.  All of which was completely obvious to me the moment the news anchor quoted the figure – with just a tiny bit of thought it should have been obvious to them as well.
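The back-of-the-envelope arithmetic (my own check, using a round seven billion):

```python
p_hits_someone = 1 / 3200            # chance the debris hits anyone at all
world_population = 7_000_000_000
p_hits_you = p_hits_someone / world_population

odds_against = 1 / p_hits_you        # 3200 * 7e9 ≈ 2.24e13,
                                     # i.e. about 1 in 22 trillion
```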

June 4, 2011

The Projected Mind

by Lucas Wilkins

This is a post about the “Projected Mind Fallacy”, as named by Edwin Jaynes. Roughly, the projected mind fallacy is mistaking uncertainty about the world for a property of the world itself. In other words, thinking that God plays dice. Unfortunately though, it’s not as simple as this, and I feel that its interpretation deserves some discussion. It is not obvious exactly what should count as ‘the world itself’/the non dice playing God/reality; the rest of this post is about how I think this question should be answered (and some other stuff).

Mind projection fallacies can often be spotted by their absurdity: if I had a bag of snooker balls it would be ridiculous to think that the balls exist in some kind of mixture of colours until I pick them out and look at them. Surely the balls are objectively some colour whether or not I decide to look at them. It is (if one accepts the mind projection fallacy) fallacious to say that the balls have an indeterminate colour when, in fact, it is just me who doesn’t know which one I will pick out. The fallacy is often stated as the confusion of ontology with epistemology, but these words don’t really help anyone understand it.

April 30, 2011

Powers of 2

by Nathaniel Virgo

The relationship between probability and information is interesting and fun.  The table below is a work in progress, but I think it’s kind of cool already.  The idea is to compare information quantities with the probabilities of unlikely events.  For instance, if I flipped a coin for every bit on my hard drive and they all came up heads, it would be pretty unlikely.  But what else would be that improbable?
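The conversion such a table relies on is simple: an event with probability 2⁻ⁿ is "worth" n bits. A sketch of the idea — the lottery odds here are the 6-from-49 figure, used purely as an illustrative assumption:

```python
import math

def surprisal_bits(p):
    """Bits of information in observing an event of probability p."""
    return -math.log2(p)

# n fair coins all landing heads has probability 2**-n, i.e. n bits:
assert surprisal_bits(2 ** -10) == 10.0

# A 1-in-13,983,816 lottery win carries roughly 23.7 bits -- about as
# surprising as 24 coins all coming up heads.
lottery_bits = surprisal_bits(1 / 13_983_816)
```

A hard drive holds trillions of bits, so "all heads on every bit" sits unimaginably far down any such table.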

April 17, 2011

A scientist modelling a scientist modelling science

by Lucas Wilkins

This is a follow-up to Nathaniel’s post. One of the ways that probabilities of probabilities can be used is in asking which experiments would be best for a scientist to do. We can do this because scientists would like a logically consistent system that describes the world, but can only make measurements which are not completely certain – so the interpretation of probability as uncertain logic is justified.

Let’s make a probabilistic model of scientific inquiry. To do this, the first component we need is a model of “what science knows”, or equivalently, “what the literature says”. For the purposes here, I will only consider what science knows about one statement: “The literature says X is true”. I’ll write this as p(X|L), and its negation as p(¬X|L) = 1 − p(X|L). This is a really minimal example.
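With only one statement in play, the uncertainty of "what science knows" can be summarised by a single number: the entropy of p(X|L). This snippet is my own illustration of that idea, not from the post:

```python
import math

def binary_entropy(p):
    """Uncertainty, in bits, of a belief that X is true with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # the literature is certain either way
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Ignorance is maximal at p(X|L) = 0.5; an informative experiment is one
# expected to move this number down.
```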
