Beyond Singularity
Comments

Should we aim for flourishing over mere survival? The Better Futures series.
Beyond Singularity · 1mo

I agree that we should shift our focus from pure survival to prosperity. But I disagree with the dichotomy the author seems to propose. Survival and prosperity are not mutually exclusive, because long-term prosperity is impossible under a high risk of extinction.

Perhaps a more productive formulation would be the following: “When choosing between two strategies, both of which increase the chances of survival, we should prioritize the one that increases them slightly less but delivers a far greater leap in the potential for prosperity.”

However, I believe that the strongest scenarios are those that eliminate the need for such a compromise altogether. These are options that simultaneously increase survival and ensure prosperity, creating a synergistic effect. It is precisely the search for such scenarios that we should focus on.

In fact, I am working on developing one such idea: a model of society that reduces the risks of AI and destructive competition while providing meaning in a world of post-labor abundance, and that remains open and inclusive. This is precisely in the spirit of the “viatopias” the author talks about.

If this idea of the synergy of survival and prosperity resonates with you, I would be happy to discuss it further.

LessWrong has been acquired by EA
Beyond Singularity · 5mo

Haha, brilliant! The loot box mechanic is inspired! Finally, a way to gamify intellectual progress. Question: can we trade duplicate +100 upvotes on the community market?

LessWrong has been acquired by EA
Beyond Singularity · 5mo

Finally, the singularity is near... the singularity of LessWrong posts being evaluated by engagement metrics and monetization potential! Called it: always knew the 'Game Culture Civilization' model would eventually be implemented by a major publisher. Looking forward to seeing how EA balances the status economy with microtransactions for faster 'Pure Game' progression. This acquisition has real synergy!

Good Research Takes are Not Sufficient for Good Strategic Takes
Beyond Singularity · 5mo

Excellent points on the distinct skillset needed for strategy, Neel. Tackling the strategic layer, especially concerning societal dynamics under ASI influence where feedback is poor, is indeed critical and distinct from technical research.

Applying strategic thinking beyond purely technical alignment, I focused on how societal structure itself impacts the risks and stability of long-term human-ASI coexistence. My attempt to design a societal framework aimed at mitigating those risks resulted in the model described in my post, Proposal for a Post-Labor Societal Structure to Mitigate ASI Risks: The 'Game Culture Civilization' (GCC) Model.

Whether the strategic choices and reasoning within that model hold up to scrutiny is exactly the kind of difficult evaluation your post calls for. Feedback focused on the strategic aspects (the assumptions, the proposed mechanisms for altering incentives, the potential second-order effects, etc.), as distinct from just the technical feasibility, would be very welcome and relevant to this discussion on evaluating strategic takes.

On (Not) Feeling the AGI
Beyond Singularity · 5mo

One of the fundamental shifts that still seems missing in the thinking of Altman, Thompson, and many others discussing AGI is the shift from technological thinking to civilizational thinking.

They're reasoning in the paradigm of "products": something that can diffuse, commoditize, slot into platform dynamics, maybe with some monetization tricks, like smartphones or transistors. But AGI is not a product. It's the point after which the game itself changes.

By definition, AGI brings general-purpose cognitive ability. That makes the usual strategic questions — like "what’s more valuable, the model or the user base?" — feel almost beside the point. The higher-order question becomes: who sets the rules of the game?

This is not a shift in tools; it’s a shift in the structure of goals, norms, and meaning.

If you don’t feel the AGI — maybe it’s because you’re not yet thinking at the right level of abstraction.

Four Futures For Cognitive Labor
Beyond Singularity · 5mo

Great set of analogies, especially the framing along the axes of tool vs. replacement and demand elasticity. That second axis is often overlooked in discussions of AI and labor, and it really does flip the sign of the expected outcomes.

One angle I’d add: sometimes automation doesn’t just replace the labor behind a product—it eliminates the need for the product itself. In the ice trade example, the key shift wasn’t just labor being replaced by refrigerators, but that the very dependence on shipped ice vanished. There was no new, scaled-up "modern ice industry"—the entire category dissolved.

That could happen with cognitive labor, too—not because LLMs write better essays, but because essays stop being a relevant format. If an AI can directly answer your question, why read 5,000 words of analysis?

Same with code: the endgame may not be "AI writes code for us," but rather that code as a product becomes obsolete—replaced by visual interfaces, autonomous agents, or auto-composed systems. A shift not just in who does the work, but in what the work even is.

So maybe the central question isn’t whether there will still be demand for writers or coders—but whether writing or coding will still be meaningful interfaces for engaging with ideas or systems. And if not—what replaces them?

Proposal for a Post-Labor Societal Structure to Mitigate ASI Risks: The 'Game Culture Civilization' (GCC) Model · 5mo