Pain is the unit of Effort

Note 3: Just because you achieved your goal through hard work and dedication doesn't mean it was worth it in the end. This is what they don't tell you in self-help books.

Are we in an AI overhang?

So the God of Straight Lines dissolves into a puff of smoke at just the right time to bring about AI doom? Seems awfully convenient.

Nuclear war is unlikely to cause human extinction

Not even the right order of magnitude. The Yellowstone magma chamber is about 5 km beneath the surface; if you had a nuke large enough to set off a supervolcano, you wouldn't need to set off a supervolcano. Not to mention Yellowstone isn't ready to blow anyway.

Nuclear war is unlikely to cause human extinction

My mainline expectation is that in a nuclear war scenario, chemical, biological, and conventional weapon effects would be dwarfed by the effects from nuclear weapons.

I would classify biological weapons as more dangerous than nuclear, but that's a different topic. Besides, biological and nuclear warfare don't mix well - without commercial air travel and trade, biological agents don't spread well.

What hard science fiction stories also got the social sciences right?

There is no perfect match with Bostrom's vulnerabilities, because the book assumed there was a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the "nukes" are cheap, they would be destroying potentially useful resources.

Not relevant, if you succeed in hiding you simply fall off the vulnerability landscape. We only need to consider what happens when you've been exposed. Also, whose resources? It's a cosmic commons, so who cares if it gets destroyed.

The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason.

That's just a Type-1 vulnerable world. No need for the author's contrived argumentation.

There is a selfish reason to pull the trigger; any alien civilization is a potential extinction threat.

Not really: cleaning up extinction threats is a public good, and public goods generally fall prey to the Tragedy of the Commons. Even if you made the numbers work out somehow - which is very difficult and requires conditions the author has explicitly ruled out (like the impossibility of colonizing other stars or of sending out spam messages) - it would still not be an example of Moloch. It would be an example of pan-galactic coordination, albeit a perverted one.

What hard science fiction stories also got the social sciences right?

Very much disagree. My sense is that the book series is pretty meagre on presenting "thoughtful hard science" as well as game theory and human sociology.

To pick the most obvious example - the title of the trilogy* - the three-body problem is misrepresented in the books as "it's hard to find a general analytic solution" rather than "the end state is extremely sensitive to changes in the initial conditions", and the characters (both humans and Trisolarians) spend eons trying to solve the problem mathematically.

But even if an exact solution were found - and exact solutions do exist for some chaotic systems, like the logistic map - it would have been useless, since the initial conditions cannot be known perfectly. This isn't a minor nitpick like the myriad other scientific problems with the Trisolarian system, which can more easily be forgiven as artistic license; it's missing what chaotic systems are about. Why invoke the three-body problem at all, other than as attire?

*not technically the title of the book series, but frequently referred to as such
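The point about sensitivity to initial conditions can be seen directly in the logistic map mentioned above. A minimal Python sketch (illustrative numbers, not from the books): even with the exact update rule in hand, an initial measurement error of one part in a billion grows until long-term prediction is worthless.

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n) is chaotic at r = 4:
# nearby trajectories diverge roughly exponentially, so knowing the
# exact dynamics does not help if the initial condition is imperfect.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two initial conditions differing by one part in a billion.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)

# Early on the trajectories are indistinguishable...
print(abs(a[5] - b[5]))    # still tiny
# ...but after ~50 iterations the tiny error has grown to order 1.
print(abs(a[50] - b[50]))  # large: prediction has broken down
```

This is why an exact solution would not have rescued the Trisolarians: the divergence rate, not the lack of a closed form, is what makes the system unpredictable.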

What hard science fiction stories also got the social sciences right?

Where exactly do you see Moloch in the books? If anything, it's quite the opposite: the mature civilizations of the universe have somehow coordinated around cleansing the cosmos of nascent civilizations, without a clear coordination mechanism. Or perhaps it's a Type-1 vulnerable world, but that doesn't fit well with the author's argumentation. I'm not sure, and I'm not sure the author knows either.

I'm still a little puzzled by all the praise for the deep game-theoretic insights the book series supposedly contains, though. Maybe game theory as attire?

Covid 10/1: The Long Haul

You're also exposed to all sorts of risks if you're "below some wealth threshold where they cannot act as they would reasonably like to because their alternative is homelessness, malnourishment, etc." even before Corona came around. The situation hasn't changed all that much. 

But, as Elon Musk famously said: "If you don't make stuff, there is no stuff".

The Goddess of Everything Else

Spreading across the stars without number sounds more like a "KILL CONSUME MULTIPLY CONQUER" thing than an "Everything Else" thing. I must be missing the point here.




ETA: Is the point that over time Man evolved to be what he is today, we have a conception of right and wrong, and we're the first link in the chain that actually cares about making sure our morals propagate forward as we evolve? So now the force of evolution has been co-opted into spreading human morality?

No. I recommend reading Meditations on Moloch first, then everything becomes clear.
