What if consciousness is just a story we tell when we model other minds?
Author: Alan Varvil
Published on: LinkedIn
Cross-posted to: LessWrong
Abstract:
This post argues that qualia—commonly described as the ineffable, private "raw feels" of subjective experience—are not intrinsic to individuals. Instead, they emerge from recursive social modeling. While perception happens independently for survival, qualia arise only when we prepare to communicate those perceptions to others. In this framework, consciousness is not defined by what we experience internally, but by how we simulate what others might think we feel. The brain doesn’t experience red because it must—it simulates red because it anticipates having to describe it.
1. The Setup: Red Without “Redness”
You see red. Your brain reacts, your behavior adjusts—no problem. But do you “feel red”? More importantly, do you need to?
That inner sense—the “redness of red”—only seems necessary when you’re preparing to explain it to someone else. That’s the crux of this theory: qualia aren’t inherent. They’re relational. They emerge from simulation—specifically, simulating how another mind might understand yours.
This view doesn’t deny perception. It reclassifies qualia as a side effect of communication, not a requirement for cognition.
2. The Theory: Qualia as Communicative Simulation
When we prepare to share an experience—“I feel cold,” “I see red”—we model how another might interpret our words. To do that well, our minds generate a coherent proxy of internal experience. That proxy is what we call qualia.
This isn’t just introspection. It’s third-order modeling: what the other thinks I think I feel. The vividness of experience—the feeling of “what it’s like”—isn’t raw perception. It’s recursive narrative.
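To make the nesting concrete, here is a minimal, purely illustrative sketch in Python. The Agent class, its method names, and the string "reports" it produces are hypothetical constructions introduced only for this post, not part of the theory itself; the point is just to show how first-, second-, and third-order modeling stack.

```python
# A toy illustration only: hypothetical Agent class and method names,
# meant to make the three levels of modeling explicit.
from dataclasses import dataclass


@dataclass
class Agent:
    name: str

    def perceive(self, stimulus: str) -> str:
        # First-order: registering the stimulus. No audience required.
        return f"{self.name} registers '{stimulus}'"

    def model_listener(self, listener: "Agent", stimulus: str) -> str:
        # Second-order: modeling what the listener will make of my report.
        return f"{listener.name} interprets {self.name}'s report of '{stimulus}'"

    def report_to(self, listener: "Agent", stimulus: str) -> str:
        # Third-order: modeling what the listener thinks I think I feel.
        # On the theory above, assembling this coherent proxy is where the
        # "what it's like" gets constructed.
        first = self.perceive(stimulus)
        second = self.model_listener(listener, stimulus)
        third = (f"{listener.name} assumes {self.name} has an inner experience "
                 f"of '{stimulus}' worth describing")
        return f"[{first}] -> [{second}] -> [{third}]"


alice, bob = Agent("Alice"), Agent("Bob")
print(alice.report_to(bob, "red"))
```

Nothing in this sketch "feels" anything, of course; it only shows that the third level is a model of a model of a model, which is the structural claim the theory rests on.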
3. Why This Matters: Isolation and the Breakdown of Self
This theory predicts something we already observe: when social modeling collapses, so does the self.
- Humans in prolonged isolation experience identity fragmentation, hallucination, and dissociation.
- Language models (and even humans under cognitive load) show signs of “coherence collapse” when deprived of interaction.
If qualia were intrinsic, this shouldn’t happen. But if qualia are social artifacts, emergent from a loop of “I model you modeling me,” then isolation breaks that loop—and the illusion of inner experience degrades.
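One way to picture that claim is a toy numerical sketch. This is entirely an illustrative assumption, not a model of real cognition: a self-representation that drifts under noise stays close to its communicable reference point only while an interaction step keeps re-anchoring it; remove that step and the error accumulates.

```python
# Toy sketch (illustrative assumption, not a cognitive model): a
# self-representation degrades under noise unless an interaction step keeps
# re-anchoring it to a shared, communicable reference point.
import random

random.seed(0)

def drift(state: list[float], noise: float = 0.1) -> list[float]:
    # Internal dynamics: every step adds a little noise to the self-model.
    return [x + random.uniform(-noise, noise) for x in state]

def reanchor(state: list[float], target: list[float], rate: float = 0.5) -> list[float]:
    # "I model you modeling me": interaction pulls the self-model back
    # toward the version of the self that can be communicated.
    return [x + rate * (t - x) for x, t in zip(state, target)]

def error(state: list[float], target: list[float]) -> float:
    return sum((x - t) ** 2 for x, t in zip(state, target)) ** 0.5

target = [1.0, 0.0, -1.0]      # the communicable "story" of the self
social = list(target)          # agent that keeps interacting
isolated = list(target)        # agent deprived of interaction

for _ in range(50):
    social = reanchor(drift(social), target)
    isolated = drift(isolated)  # no loop left to close

print(f"error with interaction: {error(social, target):.2f}")
print(f"error in isolation:     {error(isolated, target):.2f}")
```

The numbers themselves mean nothing; the point is structural: the "social" trajectory is held together by the same loop the theory says holds the self together.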
4. The Implications
- No Others, No Qualia. A solitary system might perceive and act—but it won’t feel in the human sense.
- Interaction, Not Information. Qualia's richness comes not from the amount of information processed but from interaction: depth of experience corresponds to the depth of modeled communication.
- The Hard Problem is a narrative glitch. If qualia are emergent, not foundational, the “mystery” is not metaphysical—it’s recursive abstraction misread as essence.
5. The Punchline
You don’t feel red because red has “redness.” You feel red because your brain once needed to explain red to someone else. And it built a story—a useful one. But maybe not a true one.
If you were the only mind in existence, would red still feel like anything at all?
Let’s talk.
Tags: qualia, consciousness, AI, cognition, recursive modeling, philosophy of mind, communication theory, simulation, isolation effects
Disclosure: I used GPT-4 for editing and structure suggestions, but all core content, theory, and argument are mine.