Introduction
Most discussions of artificial superintelligence (ASI) end in one of two places: human extinction or human-AI utopia. This post proposes a third, perhaps more plausible outcome: complete separation. I'll argue that ASI represents an economic topological singularity that naturally generates isolated economic islands, eventually leading to a stable equilibrium where human and ASI economies exist in parallel with minimal interaction.
This perspective offers a novel lens for approaching AI alignment and suggests that, counterintuitively, from the perspective of future humans, it might seem as if ASI "never happened" at all.
The Topological Nature of Systems
All complex systems—from physical spacetime to human economies—can be understood as topological structures. These structures consist of:
- Regions: Areas with consistent
…
>It's harder to get those (starting from Earth) than things on Earth, though.
It's not that much harder, and we can make it harder to extract Earth's resources (or easier to extract non-Earth resources).
>Satisfying higher-level values has historically required us to do vast amounts of farming and strip-mining and other resource extraction.
This is true. However, many organisms remain resilient even under our most brutal forms of farming. We should aim for that level of adaptability ourselves.
>It is barely "competition" for an ASI to take human resources. This does not seem plausible for bulk mass-energy.
This is true, but energy is only really scarce to humans, and even then their mass-energy…