Reminded me of The Three Christs of Ypsilanti:
To study the basis for delusional belief systems, [psychologist] Rokeach brought together three men who each claimed to be Jesus Christ and confronted them with one another's conflicting claims, while encouraging them to interact personally as a support group. Rokeach also attempted to manipulate other aspects of their delusions by inventing messages from imaginary characters. He did not, as he had hoped, provoke any lessening of the patients' delusions, but did document a number of changes in their beliefs.
While initially the three patients quarreled over who was holier and reached the point of physical altercation, they eventually each explained away the other two as being mental patients in a hospital, or dead and being operated by machines.
Where are we on selectively/temporarily/safely de-activating brain regions? Magnetic field to the RDPC sounds like it'd be fantastically fun at parties - er, extremely informative under the right circumstances.
I had the exact same thought myself back in 2008, so I asked an experimental psych professor about this. At the time, he said that the TMS devices we had were somewhat wide-area and also induced considerable muscle activation. This doesn't matter very much when studying the occipital lobe, but for the prefrontal cortex you basically start scrunching up the person's face, which is fairly distracting. Maybe worth trying anyway.
I've wanted to get my hands on a TMS device for years. Building one at home does not seem particularly feasible, and the magnetism involved is probably dangerous for nearby metal/electronics...
Building one at home does not seem particularly feasible, and the magnetism involved is probably dangerous for nearby metal/electronics...
A few minutes on Google makes this seem very unlikely.
I'm scared as hell to induce currents in my brain without knowing the neurobiology of it, but I do understand the electrical engineering half, so if you want an electromagnet and driver, I'll help you build one.
Would a neurologist who has thus far been immersed daily in the fact that all brains can fail in all sorts of interesting ways be hit just as badly by these delusions, if given brain damage, as someone who might have operated all their life under a sort of naive realism that makes no distinction between reality and their brain's picture of it? What about a philosopher with no neurological experience but with a well-seated obsession with the map not being the territory?
Had to make an account to answer this one, since I can give unique insight.
I'm an atypical case in that I had the Capgras Delusion (along with Reduplicative Paramnesia) in childhood, rather than as an adult. The delusions started sometime around 6-9 years of age. I hid it from others, partly because I halfway knew it was ridiculous, partly because I didn't want to let on that I was on to them...and it caused me quite a bit of anxiety, because I felt like I lost my loved ones and slipped into parallel universes every few days. I would try to keep my eyes on my loved ones, because as soon as I looked away and looked back, the feeling that something was different would return.
Sometime around 12-14, I realized how implausible it was for any kind of impostor to conduct such a large-scale conspiracy, and how implausible it was that I was slipping into parallel universes. I told my parents what I was experiencing and admitted it was irrational. I forced myself to ignore the feeling every time it came (though it still bothered me). Eventually, around 17, the feeling stopped bothering me altogether, although little twinges still occurred from time to time.
I'm currently in what I would consider to...
This is yet again a different scenario, but very interesting, thanks! It does occur to me now that there might be adult trauma patients who can see through the delusion and never get diagnosed with it, since they don't start raving about impostor family members but just go, whoa, brain seems messed up, better go see the stroke doctor.
Some strangely common childhood beliefs:
Everyone except you is a robot
Your life is like the Truman Show
I have read the book (I recently received it from an elderly friend who hoarded books--I picked through about $20,000 worth of books and chose several hundred dollars worth), and it started off interesting, to hear of her personal experience of the stroke and its accompanying mind-states. She seems to have fought her way through various delusions, but not with any more success than other examples cited here. Yes, she is/was a neuroscientist. She also proudly proclaims that she tells her bowels "Good job! I am so thankful that you do exactly what you are meant to do!" every time she takes a dump, and concluded the book with some painfully New Age-y exhortations which gave me the same urge to roll around frothing at the mouth that I often experienced with clearly delusional Christian preachers in church.
All of the theories presented in this post seem to make the implausible assumption that the brain acts like a hypothetical ideally rational individual and that impairment somehow breaks some aspect of this rationality.
However, there is a great deal of evidence that the brain works nothing like this. Instead, it has many specific modules that are responsible for certain kinds of thought or behavior. These modules are not weighed by some rational actor that sifts through them; they are the brain. When these modules come into conflict (e.g., in the standard word/color Stroop test, where the word "yellow" is printed in red ink), fairly simple conflict-resolution methods are brought into play. When things go wrong in the brain, whether through an impairment in the conflict-resolution mechanisms or in the underlying modules themselves, things go wonky in specific (not general) ways.
Speaking from personal experience, being in a psychotic/paranoid state simply makes certain things seem super salient to you. You can be quite well aware of the rational arguments against the conclusion you are worrying about but it's just so salient that it 'wins.' In other words it also feels like there is just a failure in yo...
It is embarrassing to admit but I used to think I really had dog ears and a tail until I was about 16.
Well, at least older students found it completely adorable when I made noises...and the school authorities thought I was like smart or something and didn't really care either.
I don't really know the cause; I don't remember knowing about kemonomimi until a bit later, but I had delusions not only of seeing these body parts on myself but also of feeling them. I thought I broke my tail once, for example.
I've never read of any formal study of this, but given that someone must have tried explaining the Capgras delusion to Capgras patients I'm going to assume it doesn't work. Why not?
Off the top of my head: people believe what their brain tells them above any outside evidence; cf. religious conversion originating from what, to the outside view, is clearly a personal delusion - but, from the inside view, is incontrovertible evidence of God.
It takes very good and clear thinking for the lens to actually see its flaws even when you don't have brain damage to the bit that evaluates evidence. I'm somewhat surprised a rationalist with schizophrenia actually managed this. Though TheOtherDave has mentioned being able to work out that a weird perception was almost certainly due to the stroke he was recovering from, and Eliezer mentions someone else managing it as well.
John Nash claimed that he recovered from schizophrenia because "he decided to think rationally" - but this only happened after he took medications, so...
This provides a reasonable explanation of why we don't notice our dreams' implausibility while we're dreaming them - and Eliezer specifically mentions he can't use priors correctly in his dreams.
Have I ever mentioned my theory that it may be partially due to overloaded working memory?
"You have brain damage" is also a theory with perfect explanatory adequacy. If one were to explain the Capgras delusion to Capgras patients, it would provide just as good an explanation for their odd reactions as the imposter hypothesis. Although the patient might not be able to appreciate its decreased complexity, they should at least remain indifferent between the two hypotheses. I've never read of any formal study of this, but given that someone must have tried explaining the Capgras delusion to Capgras patients I'm going to assume it doesn't work. Why not?
Maybe it's really hard to really get that you are a brain on an intuitive level. Human intuitions seem to be pretty dualistic (well, at least mine do). So 'you have brain damage' doesn't sound very explanatory unless you've spent lot of time convincing yourself that it should.
By the way, the last link is broken.
For example, one male patient expressed the worry that his wife was actually someone else, who had somehow contrived to exactly copy his wife's appearance and mannerisms. This delusion sounds harmlessly hilarious ...
It's harmless to claim that someone is observationally equivalent to his wife, but not his wife? When that kind of thing happens on a large scale, it's called "the debate about p-zombies".
For what it's worth, the "Super Base Rate Fallacy" seems to line up with my own experiences, except that there's sometimes an independent part of my mind that can go "Okay, I have 99.999% confidence that the floor will eat us. But what are the actual odds behind that confidence, and what evidence did I use to reach it?". While I can't just dismiss the absurd confidence value as absurd, I can still (sometimes) do a meta-evaluation of the precise confidence.
It's sort of like how if a friend says that global warming is 99.99% likely to be tru...
It seems improbable, but I recently heard about an n=1 personal experiment by a rationalist with schizophrenia who successfully used Bayes to convince themselves that a delusion (or possibly hallucination; the story was unclear) was false. I don't have their permission to post their story here, but I hope they'll appear in the comments.
I was under the impression that learning to recognize hallucinations was a standard component of schizophrenia therapy.
Yvain, it seems like some of this is potentially answered by how this interacts with other cognitive biases present.
Re: specific delusions, when you have an entire class of equally-explanatory hypotheses, how do you choose between them? The availability heuristic! These hypotheses do have to come from somewhere inside the neural network after all. You could argue that availability is a form of "priors", but these "priors" are formed on the level of neurons themselves and not a specific brain region: some connection strengths are stronge...
A similar mechanism explains delusions of persecution, the classic "the CIA is after me" form of the disease. We apply the Super Mind Projection Fallacy to a garden-variety anxiety disorder: "In what case would I be justified in feeling this anxious? Only if people were constantly watching me and plotting to kill me. Who could do that? The CIA."
My mom (a psychiatrist) was listening to a continuing education program on schizophrenia, and the lecturer said that schizophrenia tends to develop slowly, and in stages; before a person ends up with delusions of persecution, they usually start out by feeling intense fear and anxiety that they can't come up with any explanations for.
"Coltheart et al pretend that the prior is 1/100, but this implies that there is a base rate of your spouse being an imposter one out of every hundred times you see her (or perhaps one out of every hundred people has a fake spouse) either of which is preposterous."
What if their prior on not feeling anything upon seeing their wife is 0? What if most of the reason reasonable people's prior on this is much lower is that the belief is low status, instrumentally bad, etc., while their sincere prior from rationally thinking about it is close to 50/50? I notice you call...
I wonder if the same mechanisms could be involved in conspiracy theorists. Their way of thinking seems very similar. I also suspect a reinforcement mechanism: it becomes more and more difficult for the subject to deny his own beliefs, as it would require abandoning large parts of his present (and coherent) belief system, leaving him with almost nothing left.
This could explain why patients are reluctant to accept alternative explanations afterwards (such as "you have brain damage").
Prefrontal cortex damage can be really weird. I'd really like to see how these different syndromes manifest in an fMRI.
Contextual preface: my own brand of crazy tends to interfere with getting helped by professionals, so I've done a lot of amateur-level neurobiology research on my own, trying to pin it down. An "inability to update priors" does seem to be a component of it, but it seems primarily triggered by emotional intensity.
Anyone who would like to prod me with Science is extremely welcome to do so.
Every rule I see there seems to be you shooting yourself in the foot. I was thinking of something which would produce exactly one correct course of action under most reasonable circumstances, whereas you seem to have quite rigorously worked out a system with fewer correct courses of action than that.
How comfortable are you with arbitrarily redefining your code, voluntarily but with external prompting? I mean, given the ambient levels of doom already involved.
The thing is, my rules have evolved to deal with the fact that I've ALWAYS been low-status. Most of my rules have evolved to ensure that my self-esteem stays low, because as a child and young adult, I was repeatedly punished whenever my self-esteem exceeded that of my high-status superiors. So, for me, destroying my own self-esteem and status are defensive mechanisms, designed to prevent the pack from tearing me apart (sometimes literally and physically).
Also, rule 0 ("Do the impossible") is great if you're some kind of high-status wunderkind like Eliezer, but when you're some scrawny little know-it-all that no one WANTS to succeed, it's just an invitation to get lynched, or sprayed in the face with battery acid, or beaten with a lead pipe, or sodomized with a baseball bat.
And once you're in the domain of the "impossible", you lose access to even those systems that have been put in place explicitly to protect people from being sodomized with a baseball bat or sprayed in the face with battery acid, because the bad people want it to happen, and the good people are incapable of acknowledging that "modern society" is still that capable of savagery.
I've mi...
Sure, but the problem is that I still have all the status-seeking instincts of an above-average nerd. I'm no good serving a master, worthy or otherwise. When I was younger, my problem was that every master I served was demonstrably less intelligent than I was, so I spent a lot of time trying to grant the wishes they would have made if they were smart enough to wish right, rather than granting the wishes they did make.
In status-oriented situations, this is a HUGE FUCKING MISTAKE, and taught me to understand that I am a bad samurai.
In the past few years, I've been ronin for so long that my bushido has gone rusty - and anyways, in this corporate market, no one wants a ronin in the first place.
In the hospital where I worked, there was a woman who was able to articulate that it was very unlikely that her neighbor could read her mind. But, she reasoned, there were a lot of people in the world, so surely someone could read minds. And she had the bad luck to live next door to that person.
So sometimes people are able to acknowledge that their beliefs are statistically unlikely but still believe them.
Feedback: I thought that this post was interesting and at times quite amusing. However, I didn't upvote (nor downvote) because I felt that the concerns you discussed under the open questions section were serious enough that this post could basically be summed up as "here are some theories which feel like they might be on the right track, but basically we're still clueless".
I want to see more posts that explain the current state of knowledge of interesting rationality-related fields, and that explicitly state what questions are still troubling. Thus I upvoted the post.
[I am unsure whether it makes sense to write a comment on this post after such a long time, but I think my experience could be helpful regarding the open questions. I am not trained in this subject, so my use of terms is probably off and confounded with personal interpretations.]
My personal experience with arriving at and holding abstruse beliefs can actually be captured well by the ideas described in this post, if complemented by something like the Multiagent models of Minds:
For describing my experience, I will regard the mind as consisting loosely of su...
Related Research:
Harvard did a study on LLI (low latent inhibition: you don't filter out as many stimuli, which can mean having a lot more ideas to sort through) and discovered that people with high LLI and high IQs tend to be more creative, whereas people with high LLI and low IQs are more likely to be schizophrenic. This may be because people with higher IQs are able to evaluate a larger number of ideas, whereas those with lower IQs may find themselves overwhelmed trying to do so.
This suggests that schizophrenic people could benefit from assistanc...
A Related Experiment:
I once read about an experimental mental hospital for people with schizophrenic symptoms in California called Soteria House.
At Soteria House, the philosophy was to let the mental patients do whatever they wanted with the exception of hurting people. They got to run around naked if they wanted to, and there was a room for them to break things in (with breakable objects).
The staff was trained on a method to help the schizophrenics sort out reality from delusion. They were assisted by being told which things others couldn't see and we...
Do you have any evidence of brain damage in schizophrenia that isn't explainable by drug use (including antipsychotics especially) and is fairly common among schizophrenics?
Regarding arguing oneself out of delusion, cognitive therapy for schizophrenia has a decent track record. More info on request, after my wife gets home (she's a psychologist).
(Well-written post. There are more interesting subjects in the general 'schizophrenic reasoning' space though. If anyone ends up writing more on the subject I'd like if they sent me a draft; I know quite a bit, both theoretically and experientially.)
but it's also impossible to convince him he's Alexander the Great (at least I think so; I don't know if it's ever been tried).
At the very least (pretending that there are no ethical concerns), it seems that you ought to be able to exaggerate a patient's delusions. "We ran some tests, and it turns out that you're Jesus, John Lennon, and George Washington!".
To this same question, I can't help but notice that the brain damage being discussed is right-sided, i.e. damage to the "revolutionary". So if it turns out that it isn't possible ...
The patient who believes he is Jesus and John Lennon will pretty much agree he is any famous figure you mention to him, but he never seems to make a big deal of it, whereas those two are the ones he's always going on about.
Are random people allowed to visit harmless psych patients with those patients' consent? This sounds fascinating.
"
"You have brain damage" is also a theory with perfect explanatory adequacy. If one were to explain the Capgras delusion to Capgras patients, it would provide just as good an explanation for their odd reactions as the imposter hypothesis. Although the patient might not be able to appreciate its decreased complexity, they should at least remain indifferent between the two hypotheses. I've never read of any formal study of this, but given that someone must have tried explaining the Capgras delusion to Capgras patients I'm going to assume it do...
Once I understood the theory, my first question was: has this been explained to any delusional patient with a good grasp of probability theory? I know this sort of thing generally doesn't work, but the n=1 experiment you mention is intriguing. I suppose what is more often interesting to me is what sorts of things people come up with to dismiss conflicting evidence, since it is in a strange place between completely random and clever lie. If you have a dragon in your garage about something, you tend to give the most plausible excuses because you know, deep dow...
Is it possible that what specific delusions a patient develops after their brain damage correlates with their experiences before the brain damage? Maybe paranoid schizophrenics in the US tend to think the CIA is after them, but those in Soviet Russia used to think the KGB was? How would these delusions have manifested in the past, before any such organizations existed? Perhaps some of them convinced themselves that God's wrath was being brought down upon them, or that Satan was haunting them.
Also, does Capgras delusion apply to everyone the patient has an...
I suspect that, especially in dreams, and to a lesser degree in déjà vu, the outputs of place cells can be combined in novel ways that would normally be rejected when fully conscious. I am not aware that anything similar has been discovered regarding familiar people, but if so, it would work in a surprisingly similar way ("Don't I know you from somewhere?"), and would accommodate the typical example. What the unconscious mind composes as a shorthand template for my mother is later detailed, but still contains the "my mother...
"You have brain damage" is also a theory with perfect explanatory adequacy.... Why not?
This led me to think of two alternate hypotheses:
One is that the same problem underlying the second factor ("abnormal belief evaluation") is at fault, that self-evaluation for abnormal beliefs involves the same sort of self-modelling needed for a theory like "I have brain damage" to seem explanatory (or even coherent). The other is that there are separate systems for self-evaluation and belief-probability-evaluation that are both damaged...
There must be some fundamental difference between how one draws inferences from mental states versus everything else.
Talking about "drawing inferences from mental states" strikes me as a case of the homunculus fallacy, i.e., thinking that there's some kind of homunculus sitting inside our brains looking at the mental states and drawing inferences. Whereas in reality mental states are inferences.
" …modern man no longer communicates with the madman […] There is no common language: or rather, it no longer exists; the constitution of madness as mental illness, at the end of the eighteenth century, bears witness to a rupture in a dialogue, gives the separation as already enacted, and expels from the memory all those imperfect words, of no fixed syntax, spoken falteringly, in which the exchange between madness and reason was carried out. The language of psychiatry, which is a monologue by reason about madness, could only have come into existence in such a silence.
—Foucault, Preface to the 1961 edition"
Related to: The Apologist and the Revolutionary, Dreams with Damaged Priors
Several years ago, I posted about V.S. Ramachandran's 1996 theory explaining anosognosia through an "apologist" and a "revolutionary".
Anosognosia, a condition in which extremely sick patients mysteriously deny their sickness, occurs during right-sided brain injury but not left-sided brain injury. It can be extraordinarily strange: for example, in one case, a woman whose left arm was paralyzed insisted she could move her left arm just fine, and when her doctor pointed out her immobile arm, she claimed that was her daughter's arm even though it was obviously attached to her own shoulder. Anosognosia can be temporarily alleviated by squirting cold water into the patient's left ear canal, after which the patient suddenly realizes her condition but later loses awareness again and reverts to the bizarre excuses and confabulations.
Ramachandran suggested that the left brain is an "apologist", trying to justify existing theories, and the right brain is a "revolutionary" which changes existing theories when conditions warrant. If the right brain is damaged, patients are unable to change their beliefs; so when a patient's arm works fine until a right-brain stroke, the patient cannot discard the hypothesis that their arm is functional, and can only use the left brain to try to fit the facts to their belief.
In the almost twenty years since Ramachandran's theory was published, new research has kept some of the general outline while changing many of the specifics in the hopes of explaining a wider range of delusions in neurological and psychiatric patients. The newer model acknowledges the left-brain/right-brain divide, but adds some new twists based on the Mind Projection Fallacy and the brain as a Bayesian reasoner.
INTRODUCTION TO DELUSIONS
Strange as anosognosia is, it's only one of several types of delusions, which are broadly categorized into polythematic and monothematic. Patients with polythematic delusions have multiple unconnected odd ideas: for example, the famous schizophrenic game theorist John Nash believed that he was defending the Earth from alien attack, that he was the Emperor of Antarctica, and that he was the left foot of God. A patient with a monothematic delusion, on the other hand, usually only has one odd idea. Monothematic delusions vary less than polythematic ones: there are a few that are relatively common across multiple patients. For example:
In the Capgras delusion, the patient, usually a victim of brain injury but sometimes a schizophrenic, believes that one or more people close to her has been replaced by an identical imposter. For example, one male patient expressed the worry that his wife was actually someone else, who had somehow contrived to exactly copy his wife's appearance and mannerisms. This delusion sounds harmlessly hilarious, but it can get very ugly: in at least one case, a patient got so upset with the deceit that he murdered the hypothesized imposter - actually his wife.
The Fregoli delusion is the opposite: here the patient thinks that random strangers she meets are actually her friends and family members in disguise. Sometimes everyone may be the same person, who must be as masterful at quickly changing costumes as the famous Italian actor Fregoli (inspiring the condition's name).
In the Cotard delusion, the patient believes she is dead. Cotard patients will neglect personal hygiene, social relationships, and planning for the future - as the dead have no need to worry about such things. Occasionally they will be able to describe in detail the "decomposition" they believe they are undergoing.
Patients with all these types of delusions1 - as well as anosognosiacs - share a common feature: they usually have damage to the right frontal lobe of the brain (including in schizophrenia, where the brain damage is of unknown origin and usually generalized, but where it is still possible to analyze which areas are the most abnormal). It would be nice if a theory of anosognosia also offered us a place to start explaining these other conditions, but this is something Ramachandran's idea fails to do. He posits a problem with belief shift: going from the originally correct but now obsolete "my arm is healthy" to the updated "my arm is paralyzed". But these other delusions cannot be explained by a simple failure to update: delusions like "the person who appears to be my wife is an identical imposter" never made sense. We will have to look harder.
ABNORMAL PERCEPTION: THE FIRST FACTOR
Coltheart, Langdon, and McKay posit what they call the "two-factor theory" of delusion. In the two-factor theory, one problem causes an abnormal perception, and a second problem causes the brain to come up with a bizarre instead of a reasonable explanation.
Abnormal perception has been best studied in the Capgras delusion. A series of experiments, including some by Ramachandran himself, demonstrate that Capgras patients lack a skin conductance response (usually used as a proxy for emotional reaction) to familiar faces. This meshes nicely with the brain damage pattern in Capgras, which seems to involve the connection between the face recognition areas in the temporal lobe and the emotional areas in the limbic system. So although the patient can recognize faces, and can feel emotions, the patient cannot feel emotions related to recognizing faces.
The older "one-factor" theories of delusion stopped here. The patient, they said, knows that his wife looks like his wife, but he doesn't feel any emotional reaction to her. If it was really his wife, he would feel something - love, irritation, whatever - but he feels only the same blankness that would accompany seeing a stranger. Therefore (the one-factor theory says) his brain gropes for an explanation and decides that she really is a stranger. Why does this stranger look like his wife? Well, she must be wearing a very good disguise.
One-factor theories also do a pretty good job of explaining many of the remaining monothematic delusions. A 1998 experiment shows that Cotard delusion sufferers have a globally decreased autonomic response: that is, nothing really makes them feel much of anything - a state consistent with being dead. And anosognosiacs have lost not only the nerve connections that would allow them to move their limbs, but the nerve connections that would send distress signals and even the connections that would send back "error messages" if the limb failed to move correctly - so the brain gets data that everything is fine.
The basic principle behind the first factor is "Assume that reality is such that my mental states are justified", a sort of Super Mind Projection Fallacy.
Although I have yet to find an official paper that says so, I think this same principle also explains many of the more typical schizophrenic delusions, of which two of the most common are delusions of grandeur and delusions of persecution. Delusions of grandeur are the belief that one is extremely important. In pop culture, they are typified by the psychiatric patient who believes he is Jesus or Napoleon - I've never met any Napoleons, but I know several Jesuses and recently worked with a man who thought he was Jesus and John Lennon at the same time. Here the first factor is probably an elevated mood (working through a miscalibrated sociometer). "Wow, I feel like I'm really awesome. In what case would I be justified in thinking so highly of myself? Only if I were Jesus and John Lennon at the same time!" A similar mechanism explains delusions of persecution, the classic "the CIA is after me" form of the disease. We apply the Super Mind Projection Fallacy to a garden-variety anxiety disorder: "In what case would I be justified in feeling this anxious? Only if people were constantly watching me and plotting to kill me. Who could do that? The CIA."
But despite the explanatory power of the Super Mind Projection Fallacy, the one-factor model isn't enough.
ABNORMAL BELIEF EVALUATION: THE SECOND FACTOR
The one-factor model requires people to be really stupid. Many Capgras patients were normal intelligent people before their injuries. Surely they wouldn't leap straight from "I don't feel affection when I see my wife's face" to "And therefore this is a stranger who has managed to look exactly like my wife, sounds exactly like my wife, owns my wife's clothes and wedding ring and so on, and knows enough of my wife's secrets to answer any question I put to her exactly like my wife would." The lack of affection vaguely supports the stranger hypothesis, but the prior for the stranger hypothesis is so low that it should never even enter consideration (remember this phrasing: it will become important later). Likewise, we've all felt really awesome at one point or another, but it's never occurred to most of us that maybe we are simultaneously Jesus and John Lennon.
Further, most psychiatric patients with the deficits involved don't develop delusions. People with damage to the ventromedial area suffer the same disconnection between face recognition and emotional processing as Capgras patients, but they don't draw any unreasonable conclusions from it. Most people who get paralyzed don't come down with anosognosia, and most people with mania or anxiety don't think they're Jesus or persecuted by the CIA. What's the difference between these people and the delusional patients?
The difference is the right dorsolateral prefrontal cortex (RDPC), an area of the brain strongly associated with delusions. If the brain damage that broke your emotional reactions to faces, or paralyzed you, or whatever, spared the RDPC, you are unlikely to develop delusions. If your brain damage also hit this area, you are correspondingly more likely to come up with a weird explanation.
In his first papers on the subject, Coltheart vaguely refers to the RDPC as a "belief evaluation" center. Later, he gets more specific and talks about its role in Bayesian updating. In his chronology, a person damages the connection between face recognition and emotion, and "rationally" concludes the Capgras hypothesis. In his model, even if there's only a 1% prior on your spouse being an imposter, if you're 1000 times more likely to feel nothing toward an imposter than toward your real spouse, you can "rationally" come to believe in the delusion. In normal people, this rational belief then gets worn away by updating based on evidence: the imposter seems to know your spouse's personal details, her secrets, her email passwords. In most patients, this is sufficient to have them update back to the idea that it is really their spouse. In Capgras patients, the damage to the RDPC prevents updating on "exogenous evidence" (for some reason, the endogenous evidence of the lack of emotion itself still gets through), and so they maintain their delusion.
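To put numbers on Coltheart's chronology (the 1/100 prior and 1000:1 likelihood ratio are his illustrative figures, not empirical estimates; the snippet below is just a sketch of the arithmetic):

```python
# Coltheart's illustrative numbers for the "rational" first step of Capgras:
# prior P(imposter) = 0.01, and the absent emotional response is taken to be
# 1000x more likely under the imposter hypothesis than under "real spouse".
prior_imposter = 0.01
likelihood_ratio = 1000  # P(no feeling | imposter) / P(no feeling | spouse)

prior_odds = prior_imposter / (1 - prior_imposter)  # ~0.0101
posterior_odds = prior_odds * likelihood_ratio      # ~10.1
posterior = posterior_odds / (1 + posterior_odds)   # ~0.91

print(f"P(imposter | no emotional response) ~ {posterior:.2f}")
```

So even on these generous numbers, the first observation alone pushes the patient to roughly 91% confidence in the imposter hypothesis; on Coltheart's account, the real question is why the later exogenous evidence fails to push it back down.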
This theory has some trouble explaining why patients are still able to update about other situations, but Coltheart speculates that maybe the belief evaluation system is weakened but not totally broken, and can deal with anything except the ceaseless stream of contradictory endogenous information.
EXPLANATORY ADEQUACY BIAS
McKay makes an excellent critique of several questionable assumptions of this theory.
First, is the Capgras hypothesis ever plausible? Coltheart et al. pretend that the prior is 1/100, but this implies that there is a base rate of your spouse being an imposter one out of every hundred times you see her (or perhaps that one out of every hundred people has a fake spouse), either of which is preposterous. No reasonable person could entertain the Capgras hypothesis even for a second, let alone for long enough that it becomes their working hypothesis and develops immunity to further updating from the broken RDPC.
Second, there's no evidence that the ventromedial patients - the ones who lose face-related emotions but don't develop the Capgras delusion - once had the Capgras delusion but then successfully updated their way out of it. They just never develop the delusion to begin with.
McKay keeps the Bayesian model, but for him the second factor is not a deficit in updating in general, but a deficit in the use of priors. He lists two important criteria for reasonable belief: "explanatory adequacy" (what standard Bayesians call the likelihood ratio: the new data must be more likely if the new belief is true than if it is false) and "doxastic conservatism" (what standard Bayesians call the prior: the new belief must be reasonably likely to begin with, given everything else the patient knows about the world).
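In odds form, Bayes' theorem factors exactly into these two criteria. Mapping McKay's terms onto the standard identity (the labels are mine, not notation from his paper):

```latex
\underbrace{\frac{P(H_1 \mid D)}{P(H_2 \mid D)}}_{\text{posterior odds}}
= \underbrace{\frac{P(D \mid H_1)}{P(D \mid H_2)}}_{\text{likelihood ratio (explanatory adequacy)}}
\times
\underbrace{\frac{P(H_1)}{P(H_2)}}_{\text{prior odds (doxastic conservatism)}}
```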
Delusional patients with damage to their RDPC lose their ability to work with priors and so abandon all doxastic conservatism, essentially falling into what we might term the Super Base Rate Fallacy. For them the only important criterion for a belief is explanatory adequacy. So when they notice their spouse's face no longer elicits any emotion, they decide that their spouse is not really their spouse at all. This does a great job of explaining the observed data - maybe the best job it's possible for an explanation to do. Its only minor problem is that it has a stupendously low prior, and this doesn't matter because they are no longer able to take priors into account.
This also explains why the delusional belief is impervious to new evidence. Suppose the patient's spouse recounts personal details of their honeymoon that no one else could possibly know. There are several possible explanations: the patient's spouse really is the patient's spouse, or (says the left-brain Apologist) the patient's spouse is an alien who was able to telepathically extract the relevant details from the patient's mind. The telepathic alien imposter hypothesis has great explanatory adequacy: it explains why the person looks like the spouse (the alien is a very good imposter), why the spouse produces no emotional response (it's not the spouse at all), and why the spouse knows the details of the honeymoon (the alien is telepathic). The "it's really your spouse" explanation only explains the first and the third observations. Of course, we as sane people know that the telepathic alien hypothesis has a very low base rate plausibility because of its high complexity and violation of Occam's Razor, but these are exactly the factors that the RDPC-damaged2 patient can't take into account. Therefore, the seemingly convincing new evidence of the spouse's apparent memories only suffices to help the delusional patient infer that the imposter is telepathic.
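A minimal sketch of this failure mode (the hypotheses, likelihoods, and priors below are invented for illustration; only the structure - scoring hypotheses by likelihood alone versus by likelihood times prior - comes from McKay's model):

```python
# D = the observed data: no emotional response to the "spouse's" face,
# yet she correctly recounts private honeymoon details.
# All numbers are hypothetical, chosen only to illustrate the structure.
hypotheses = {
    # name: (P(D | H), P(H))
    "real spouse":               (0.001, 0.999),  # can't explain the missing feeling
    "ordinary imposter":         (1e-4,  1e-9),   # can't explain the honeymoon details
    "telepathic alien imposter": (0.9,   1e-15),  # explains everything; absurd prior
}

def best_by_likelihood(hs):
    """The damaged evaluator: explanatory adequacy only."""
    return max(hs, key=lambda h: hs[h][0])

def best_by_posterior(hs):
    """The intact evaluator: likelihood times prior (unnormalized posterior)."""
    return max(hs, key=lambda h: hs[h][0] * hs[h][1])

print(best_by_likelihood(hypotheses))  # -> telepathic alien imposter
print(best_by_posterior(hypotheses))   # -> real spouse
```

Note that feeding the likelihood-only evaluator more evidence never helps: each new observation simply favors whichever elaboration of the delusion explains it best, which is exactly the dynamic described in the next paragraph.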
The Super Base Rate Fallacy can explain the other delusional states as well. I recently met a patient who was, indeed, convinced the CIA were after her; of note, she also had extreme anxiety, to the point where her arms were constantly shaking and she was hiding under the covers of her bed. CIA pursuit is probably the best possible reason to be anxious; the only reason we don't use it more often is how few people are really pursued by the CIA (well, as far as we know). My mentor warned me not to try to argue with the patient or convince her that the CIA wasn't really after her, as (she said from long experience) it would just make her think I was in on the conspiracy. This makes sense. "The CIA is after you and your doctor is in on it" explains both anxiety and the doctor's denial of the CIA very well; "The CIA is not after you" explains only the doctor's denial of the CIA. For anyone with a pathological inability to handle Occam's Razor, the best solution to a challenge to your hypothesis is always to make your hypothesis more elaborate.
OPEN QUESTIONS
Although I think McKay's model is a serious improvement over its predecessors, there are a few loose ends that continue to bother me.
"You have brain damage" is also a theory with perfect explanatory adequacy. If one were to explain the Capgras delusion to Capgras patients, it would provide just as good an explanation for their odd reactions as the imposter hypothesis. Although the patient might not be able to appreciate its decreased complexity, they should at least remain indifferent between the two hypotheses. I've never read of any formal study of this, but given that someone must have tried explaining the Capgras delusion to Capgras patients I'm going to assume it doesn't work. Why not?
Likewise, how come delusions are so specific? It's impossible to convince someone who thinks he is Napoleon that he's really just a random non-famous mental patient, but it's also impossible to convince him he's Alexander the Great (at least I think so; I don't know if it's ever been tried). But him being Alexander the Great is also consistent with his observed data and his deranged inference abilities. Why decide it's the CIA who's after you, and not the KGB or Bavarian Illuminati?
Why is the failure so often limited to failed inference from mental states? That is, if a Capgras patient sees that it is raining outside, the same process of base rate avoidance that made her fall for the Capgras delusion ought to make her think she's been transported to the rainforest or something. This happens in polythematic delusion patients, where anything at all can generate a new delusion, but not in those with monothematic delusions like Capgras. There must be some fundamental difference between how one draws inferences from mental states versus everything else.
This work also raises the question of whether one can consciously use System II Bayesian reasoning to argue oneself out of a delusion. It seems improbable, but I recently heard about an n=1 personal experiment by a rationalist with schizophrenia who successfully used Bayes to convince themselves that a delusion (or possibly hallucination; the story was unclear) was false. I don't have their permission to post their story here, but I hope they'll appear in the comments.
FOOTNOTES
1: I left out discussion of the Alien Hand Syndrome, even though it was in my sources, because I believe it's more complicated than a simple delusion. There's some evidence that the alien hand actually does move independently; for example it will sometimes attempt to thwart tasks that the patient performs voluntarily with their good hand. Some sort of "split brain" issues seem like a better explanation than simple Mind Projection.
2: The right dorsolateral prefrontal cortex also shows up in dream research, where it tends to be one of the parts of the brain shut down during dreaming. This provides a reasonable explanation of why we don't notice our dreams' implausibility while we're dreaming them - and Eliezer specifically mentions he can't use priors correctly in his dreams. It also highlights some interesting parallels between dreams and the monothematic delusions. For example, the typical "And then I saw my mother, but she was also somehow my fourth grade teacher at the same time" effect seems sort of like Capgras and Fregoli. Even more interestingly, the RDPC gets switched on during lucid dreaming, providing an explanation of why lucid dreamers are able to reason normally in dreams. Because lucid dreaming also involves a sudden "switching on" of "awareness", this makes the RDPC a good target area for consciousness research.