The "mind projection fallacy" occurs when somebody expects an overly direct resemblance between the intuitive language of the mind and the language of physical reality.

Consider the map and territory metaphor, in which the world is like a territory and your mental model of the world is like a map of that territory. In this metaphor, the mind projection fallacy is analogous to thinking that the territory can be folded up and put into your pocket.
As an archetypal example: Suppose you flip a coin, slap it against your wrist, and don't yet look at it. Does it make sense to say that the probability of the coin being heads is 50%? How can this be true, when the coin itself is already either definitely heads or definitely tails?
One who says "the coin is fundamentally uncertain; it is a feature of the coin that it is always 50% likely to be heads" commits the mind projection fallacy. Uncertainty is in the mind, not in reality. If you're ignorant about a coin, that's not a fact about the coin, it's a fact about you. A blank map does not correspond to a blank territory: if you come to a part of the country where the map is blank because nobody has mapped it yet, you won't actually see a vast white space stretching out in front of you.
It makes sense that your brain, the map, has an internal measure of how sure it is of something. But that doesn't mean the coin itself has to contain a corresponding quantity of increased or decreased sureness; it is just heads or tails.
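The map/territory split above can be made concrete with a toy sketch. This is purely illustrative; the variable names and setup are my own, not part of the original argument:

```python
import random

# Territory: the coin's actual state is a plain boolean.
# It is simply heads or tails; no "50% heads-ness" is stored in it.
coin_is_heads = random.choice([True, False])

# Map: each observer carries their own degree of belief.
# Before looking, an ignorant observer's credence in heads is 0.5.
credence_before_looking = 0.5

# After looking, the same unchanged coin gets credence 1.0 or 0.0.
# The update happened in the observer's map, not in the territory.
credence_after_looking = 1.0 if coin_is_heads else 0.0
```

Note that two observers with different information can assign different credences to the very same coin; the difference between their numbers is a fact about them, not about the coin.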
The ontology of a system is the set of elementary or basic components of that system. The ontology of your model of the world may include intuitive measures of uncertainty, which your brain uses as primitives in its computations, like floating-point numbers are a primitive type in computers. The mind projection fallacy occurs whenever someone reasons as if the territory, the physical universe and its laws, must have the same sort of ontology as the map, our models of reality.
See also: