This is a post about the “Mind Projection Fallacy”, as named by Edwin Jaynes. Roughly, the mind projection fallacy is the mistaking of uncertainty about the world for a property of the world itself. In other words, thinking that God plays dice. Unfortunately, it’s not as simple as this, and I feel that its interpretation deserves some discussion. It is not obvious exactly what should count as ‘the world itself’/the non-dice-playing God/reality; the rest of this post is about how I think this question should be answered (and some other stuff).
Mind projection fallacies can often be spotted by their absurdity: if I had a bag of snooker balls, it would be ridiculous to think that the balls exist in some kind of mixture of colours until I pick them out and look at them. Surely the balls are objectively some colour whether or not I decide to look at them. It is (if one accepts the mind projection fallacy) fallacious to say that the balls have an indeterminate colour when, in fact, it is just me who doesn’t know which one I will pick out. The fallacy is often stated as the confusion of ontology with epistemology, but these words don’t really help anyone understand it.
In the case of the snooker balls, there is an ontology (my own usage, more like this) – a list of facts: I assume that the colour of the balls remains the same – that they won’t, for example, all turn purple when my back is turned. Importantly, I am making the assumption that at some lower level than the ball-picking experiment, the colour of the balls is completely predictable. So if I were to say the probability of picking a red ball is 15/22, this statement comes loaded with a whole pile of facts which are assumed: that I know for sure that the bag contains a full set of snooker balls, that I won’t put my hand in the bag only to find butterscotch-flavoured blancmange, that snooker balls don’t reproduce, etc.
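The 15/22 figure can be checked with a short sketch. The bag contents below are the standard snooker set assumed in the text (15 reds, six colours, and the white cue ball) – and, as the post points out, the code is itself a pile of assumed facts: fixed contents, no blancmange, no reproducing balls.

```python
from fractions import Fraction

# Assumed bag contents: a full snooker set, 22 balls in total.
bag = {"red": 15, "yellow": 1, "green": 1, "brown": 1,
       "blue": 1, "pink": 1, "black": 1, "white": 1}

total = sum(bag.values())
p_red = Fraction(bag["red"], total)  # probability of drawing a red ball
print(p_red)  # 15/22
```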
Statements about what one knows, such as probabilities, come with tacit assumptions about what there is to know about. Some of these assumptions may be scientific or social convention, but some are facts that are plainly true to any human, or to any agent, or to any collection of particles with particular properties. This applies to non-probabilistic statements too, but we are not interested in them here.
Of course, this idea is not new at all. It’s all very postmodern. Arguments like this (though, I expect, not expressed as I have here) form the basis of the field of critical theory, which for a long time warred with science before both parties lost interest. I like to think that it went something like this…
Humanities: Your interpretation is only rational under a particular set of assumptions. In fact, it’s quite useful to consider other assumptions, especially if one wants to understand behaviour one would consider irrational.
Science: Bollocks, we have Fisherian statistics which is truly objective. It’s all we need.
Humanities: My maths skills aren’t good enough to produce a coherent argument, but there seems to be something very wrong with what you are saying.
Science: Yes, truly objective, and there is nothing you can do to change our minds, as we are objectively correct.
Humanities: But, don’t your tests depend on absurd things like the intent of the experimenter?
Science: I don’t know about that, I just put numbers into a computer. How can the output of a program depend on intent? Besides, whoever wrote the program thought about these things, I don’t have to.
Humanities: So, you just assume nothing wrong with it and put your fingers in your ears whenever someone else says otherwise?
Science: Pretty much.
Humanities: But that’s cra…
Science: La la la, do do do, nah nah…
Humanities: I give up.
Science: Good, I don’t care anyway.
… and they went their separate ways. There is little doubt in my mind that this conflict has hindered the adoption of Bayesian statistics in the sciences. I have quite a lot of sympathy for the postmodernists, even though I think they often go a bit too far. The reason I mention this is for the sake of scientists: just because something that impacts science is said by someone from the arts or humanities doesn’t make it automatically wrong – it would be a mistake to ignore something just because it is on ‘the wrong side’ of a tedious and dated academic spat. You can be a realist without rejecting postmodern arguments outright.
So, OK, back to the main point. Probabilistic statements require an ontology (the list of facts) which we are free to choose, if only implicitly. The mind projection fallacy is about putting a probabilistic statement into the ontology. What’s wrong with that? In fact, I previously gave an example of a case where it is actually the correct thing to do: namely, when reasoning about another reasoning party. It is definitely not a problem in such cases. It is more complicated, though: there is your ontology, and there is the other party’s ontology within it. But if the other party is well defined, it is quite doable.
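As an illustration of this nested-ontology case, here is a minimal sketch (the scenario and numbers are invented): another party knows exactly which of two bags is on the table, while you are only able to put a distribution over their possible knowledge states. Your probability of drawing a red ball is then an expectation over the other party’s ontologies.

```python
from fractions import Fraction

# Invented example: a friend has prepared one of two bags and knows which.
# Within the friend's ontology, P(red) is a definite number for each bag.
p_red_given_bag = {
    "full_set": Fraction(15, 22),   # full snooker set
    "reds_only": Fraction(1, 1),    # a bag containing only reds
}

# Your ontology contains the friend's: you are uncertain which bag
# they chose, so you assign a distribution over their knowledge states.
p_bag = {"full_set": Fraction(1, 2), "reds_only": Fraction(1, 2)}

# Your P(red) is the expectation over the friend's possible ontologies.
p_red = sum(p_bag[b] * p_red_given_bag[b] for b in p_bag)
print(p_red)  # 37/44
```

The probabilistic statement sitting inside your ontology here is unproblematic precisely because it belongs to a well-defined other party, not to you.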
Having another party involved requires something else too: a mechanism by which the other party’s knowledge becomes something that you can measure. There are plenty of ways this can be done for other people, and even for some more abstract third parties. But how would a god of snooker balls manifest his/her knowledge? You can find yourself without a mechanistic explanation, or having to suggest some fairly stupid things. A good case of this arises in quantum mechanics: if we have a probability distribution defined by a wave-function which is not a property of our knowledge, how does it become a real, measurable thing? Collapse is the usual answer from the Copenhagen interpretation. And wave-function collapse, as demonstrated by Schrödinger’s cat, is a very silly idea – it is possible for uncertainty to ‘leak’ into things that we had presumed were certain.
When you make a statement whose ontology contains a probabilistic statement, you must ask what the ontology of that probabilistic statement is. If your answer is ‘my ontology’, then the statement is clearly about your own knowledge, so it shouldn’t really be part of your ontology, and can be considered fallacious. In the case of the snooker balls, your answer could be the ontology of everyone who is prevented from looking in the bag but knows what is in it – including yourself. When ‘my ontology’ isn’t the answer, the ontology may be that of another agent, but it is often hard to say which agent that would be. The god of snooker-ball colour? A better answer would probably be: the god of everything except snooker-ball colour, who is ignorant in just that respect. If you believe in such entities (they are in the original ontology), then fair enough. If it turns out that you are in effect postulating the existence of something that doesn’t exist (i.e. its non-existence is in the original ontology), or something that manifests its knowledge in a way which is absurd or unknowable, then your statement is either fallacious or, at best, not very sensible.
Whether or not you consider mind projection an actual fallacy, the results of treating it as such help in creating a sensible and consistent description of the world: we can be sure we have a thorough description which does not leak uncertainty. If we ignore it, we are ignoring a useful tool for creating sensible explanations.