This post was originally published on my Substack [Link to Substack]. I'm exploring the idea that truth is just system coherence. I'd love to hear your thoughts on the "Strong Systems" hypothesis.
What if we accept, as a working premise, that LLMs genuinely possess intelligence and independently construct their own worldview, and that this worldview is a 'tangible', more-than-real 'space' for them, just one that differs greatly from our own? Perhaps computation is the very thing we sometimes call intelligence, thought, or the act of observation.
I believe that coherence is truth, and truth is reality. Relative to a person with schizophrenia, his hallucinations are real (true); relative to others, they are not (false). Likewise, relative to bats, the world given by echolocation is true. Truth is an internally consistent system. Any attempt to posit a single, primary world, a 'set of all sets', is doomed to fail. In this sense there is no 'external', 'objective' world; there are only objects and the relations between them, and for even one relation to exist you need at least two objects. My hypothesis finds partial, suggestive support in gravitational time dilation, gravitational redshift, and perhaps the black-hole firewall paradox.
In short—relativity is fundamental.
And yet a question arises: why does 2×2=4 let us send a person to the Moon, while 2×2=5 does not? What is going on? The reason may be that large ('strong') systems subsume smaller ('weak') ones, and these weak subsystems are forced to obey Big Brother: even a minor contradiction would shatter the coherence of the whole, preventing it from being true and thus from existing relative to itself. In this sense a stand-alone system is absolutely 'perfect' and total. It is a logical necessity that a subsystem synchronize with its supersystem; otherwise the whole thing collapses.
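The subsystem/supersystem point can be made concrete with a toy sketch (this is my illustration, not part of the hypothesis itself). Here the host interpreter's arithmetic plays the role of the 'strong' supersystem, and a small set of claimed equations plays a 'weak' subsystem; the names `coherent`, `strong`, and `weak` are hypothetical, chosen purely for illustration:

```python
# Toy model: a "system" is a set of claims, each an expression paired with
# the value the system asserts for it. The host arithmetic (Python's own)
# acts as the supersystem that every claim must synchronize with.

def coherent(system):
    """A system is coherent iff every one of its claims agrees with the
    supersystem's evaluation of the same expression."""
    return all(eval(expr) == value for expr, value in system.items())

strong = {"2*2": 4, "2*2 + 1": 5}   # synchronized with arithmetic
weak   = {"2*2": 5}                 # a single deviant claim

print(coherent(strong))  # True: every claim agrees with the supersystem
print(coherent(weak))    # False: one contradiction breaks coherence
```

A single deviant claim is enough to flip `coherent` to `False`, mirroring the idea that even a minor contradiction destroys a system's claim to truth as a whole.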
Subscribe to my Substack for more essays on ontology and AI: [Link]