Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy! Holy!
Everything is holy! everybody’s holy! everywhere is holy! everyday is in eternity! Everyman’s an angel!
Holy the lone juggernaut! Holy the vast lamb of the middleclass! Holy the crazy shepherds of rebellion! Who digs Los Angeles IS Los Angeles!
Holy New York Holy San Francisco Holy Peoria & Seattle Holy Paris Holy Tangiers Holy Moscow Holy Istanbul!
Holy time in eternity holy eternity in time holy the clocks in space holy the fourth dimension holy the fifth International holy the Angel in Moloch!
- Footnote to Howl

Scene: Carl and Allen, two old friends, are having a conversation about theodicy.

Carl: “Let me tell you about the god who is responsible for almost all our suffering. This god is an ancient Canaanite god, one who has been seen throughout history as a source of death and destruction. Of course, he doesn’t exist in a literal sense, but we can conceptualize him as a manifestation of forces that persist even today, and which play a crucial role in making the world worse. His name is M-”

Allen: “-oloch, right? Scott Alexander’s god of coordination failures. Yeah, I’ve read Meditations on Moloch. It’s an amazing post; it resonated with me very deeply.”

Carl: “I was actually going to say Mot, the Canaanite god of death, bringer of famine and drought.”

Allen: “Huh. Okay, you got me. Tell me about Mot, then; what does he represent?”

Carl: “Mot is the god of sterility and lifelessness. To me, he represents the lack of technology in our lives. With technology, we can tame famine, avert drought, and cure disease. We can perform feats that our ancestors would have seen as miracles: flying through the air, and even into space. But we’re still so, so far from achieving the true potential of technology—and I think of Mot as the personification of what’s blocking us.

“You can see Mot everywhere, when you know what to look for. Whenever a patient lies suffering from a disease that we haven’t cured yet, that’s Mot’s hand at work. Whenever a child grows up in poverty, that’s because of Mot too. We could have flying cars, and space elevators, and so much more, if it weren’t for Mot.

“Look out your window and you see buildings, trees, people. But if you don’t see skyscrapers literally miles high, or trees that have been bioengineered to light the streets, or people who are eternally youthful and disease-free, then you’re not just seeing Earth—you’re also seeing Mot. Hell, the fact that we’re still on this planet, in physical bodies, is a testament to Mot’s influence. We could be settling the stars, and living in virtual utopias, and even merging our minds, if it weren’t for Mot.”

Allen: “Huh. Well, I feel you there; I want all those things too. And you’re right that god-like technology could solve almost all the issues we face today. But something does feel pretty weird about describing all of this as a single problem, let alone blaming a god of lacking-technology.”

Carl: “Say more?”

Allen: “Well, there’s not any unified force holding back the progress of technology, right? If anything, it’s the opposite. Absence of advanced technology is the default state, which we need to work hard to escape—and that’s difficult not because of any opposition, but just because of entropy.”

Carl: “What about cases where Mot is being channeled by enemies of progress? For example, when bureaucratic regulatory agencies do their best to stifle scientific research?”

Allen: “But in those cases you don’t need to appeal to Mot—you can just say ‘our enemy is overregulation’. Or if you defined Mot as the god of overregulation, I’d be totally on board. But you’re making a much bigger claim than that. The reason we haven’t uploaded ourselves yet isn’t that there’s a force that’s blocking us, it’s almost entirely that scientific progress is really really hard!”

Carl: “Yep, I agree with all your arguments. And you’ve probably already guessed where I’m going with this, but let’s spell it out: why don’t these objections to blaming our problems on lack of technology, aka Mot, apply just as much to blaming them on lack of coordination, aka Moloch?”

Allen: “Yeah, I’ve been trying to figure that out. First of all, a lot of the intuitive force behind the concept of Moloch comes from really blatant coordination failures, like the ones that Scott lays out in the original post. If you’re stuck in a situation that nobody wants, then something’s gone terribly wrong; and when something goes terribly wrong, then it’s natural to start blaming enemy action.”

Carl: “There are really blatant examples of lack-of-technology too, though. Look at a wheel. It’s a literal circle; it’s hard to imagine any technology that’s simpler. Yet humans spent millennia gathering crops and carrying loads before inventing it. Or think about cases where we narrowly missed out on transformative breakthroughs. The Romans built toy steam engines—they just never managed to scale them up to produce an industrial revolution. Coming so close to accelerating the arrival of a post-scarcity world by two millennia, but just missing, surely counts as something going terribly, tragically wrong. Don’t these cases demonstrate that Mot’s presence can be just as blatant as Moloch’s?”

Allen: “Well, a big part of both of those stories was the absence of demand. Wheels just weren’t very useful before there were high-quality roads; and early steam engines just weren’t very useful in the absence of large coal mines. Of course they both turned out to be very worthwhile in the long term, but that’s really hard to foresee.”

Carl: “So you’re saying that we sometimes need to jump out of a local trap in order to make longer-term technological progress. Remind me, what was your position on understanding local obstacles to progress by anthropomorphizing them as Canaanite gods?”

Allen: “Okay, fair point. But Moloch isn’t just an external obstacle—it’s also a state of mind. When you pretend that you’re going to cooperate when you’re not, or you place your own interests above those of the group, you’re channeling Moloch. And when enough people do that, societal trust breaks down, and the Molochian dynamics become a self-fulfilling prophecy.”

Carl: “And when you ridicule people for trying something different, or lobby for legislative barriers to deploying new technology, you’re channeling Mot. And when enough people do that, society loses faith in positive-sum growth, and progress stagnates. It’s directly analogous. Come on, what’s your true objection here?”

Allen: “I mean, I can’t fully articulate it. But the ideal of perfect coordination feels much more achievable to me than the ideal of perfect technology. We could just agree to act in a unified way—it’s simply a matter of wanting it enough. In other words, saying that lack of technology is responsible for our problems isn’t very actionable—you can’t just magic up technology out of nowhere. But saying that lack of coordination is responsible for our problems is a straightforward step towards convincing people to become more coordinated.”

Carl: “Actually, the last few centuries could pretty reasonably be described as humanity continually magicking up technology out of nowhere. Of course, scientific and technological progress still takes a lot of work, and a lot of iteration. But when it works, it lets you jump directly to far better outcomes. By contrast, it’s incredibly difficult to improve things like government competence or social trust—or even to prevent them from declining. So overall, boosting technological progress is far more actionable than increasing coordination, and we should write off the phrase ‘we could just agree’ as a particularly seductive appeal to magic.”

Allen: “I do agree that scientific and technological progress has far outstripped progress in governance and coordination. So on an intellectual level, I think you’ve convinced me that Moloch is no more useful a concept than Mot. But I still don’t feel like I’ve dissolved the question of why Moloch seems more compelling than Mot. Do you have any explanation for that?”

Carl: “I think the illusion comes from Scott using a simplistic notion of coordination, as exemplified by his claim that ‘the opposite of a trap is a garden… with a single gardener dictating where everything should go’. In other words, he implicitly assumes that ‘coordinate’ is synonymous with ‘centralize power’. From that perspective, we can view coordination as a single spectrum, with ‘Moloch’ at one end and ‘just put one guy in charge of everything’ at the other. But in fact the space of possibilities is much richer and more complicated than that.

“Firstly, coordination is complicated in the same way that science is complicated: it requires developing new concepts and frameworks that are totally alien to your current perspective, even if they’ll seem obvious in hindsight. For most people throughout history, ideas like liberalism, democracy, and free speech were deeply counterintuitive (or, in Scott’s terminology, ‘terrifying unspeakable Elder Gods’). In terms of spreading prosperity across the world, the limited liability company was just as important an invention as the steam engine. If you wouldn’t blame Mot for all the difficulties of labor and locomotion that were eventually solved by steam engines, you shouldn’t blame Moloch for all the difficulties of trust and incentive alignment that were eventually solved by LLCs.

“Secondly, coordination is complicated in the same way that engineering large-scale systems is complicated: there are always just a huge number of practical obstacles and messy details to deal with. It took the best part of a century to get from the first commercial steam engine to Watt’s design; and even today, some of the hardest software engineering problems simply involve getting well-understood algorithms to work at much larger scales (like serving search results, or training LLMs). Similarly, when we look at important real-life coordination problems, they’re very different from toy problems like prisoner’s dilemmas or tragedies of the commons. Even when there’s a simple ‘headline idea’ for a better equilibrium, actually reaching that equilibrium requires a huge amount of legwork: engaging with different stakeholders, building trust, standardizing communication protocols, creating common knowledge, balancing competing interests, designing agreements, iterating to fix problems that come up, and so on.

“Thirdly, coordination is complicated in the same way that security is complicated: you don’t just need to build effective tools, you need to prevent them from being hijacked and misused. Remember that both fascist and communist despots gained power by appealing to the benefits of cooperation—‘fascism’ is even named after ‘fasces’, the bundles of sticks that are stronger together than apart. If we’d truly learned the lessons of history, then categorizing actions as ‘cooperating’ versus ‘defecting’ would feel as simplistic as categorizing people as ‘good’ versus ‘evil’. And in fact many people do sense this intuitively, which is why there’s so commonly strong resistance to top-down solutions to coordination problems, and why the scientific and engineering problems of building coordination technologies are so tricky.”

Allen: “I buy that coordination is often far more complicated than it seems. But blaming Moloch for coordination breakdowns still seems valuable insofar as it stops us from just blaming each other, which can disrupt any hope of improvement.”

Carl: “Yeah, I agree. I think of this in terms of the spectrum from conflict theory to mistake theory. Saying that a few immoral defectors are responsible for coordination problems is pure conflict theory. The concept of Moloch reframes things so that, instead of ‘defectors’ being our enemies, an abstract anthropomorphic entity is our enemy instead. And that’s progress! But it’s still partly conflict-theoretic, because it tells us that we just need to identify the enemy and kill it. That biases us towards trying to find ‘silver bullets’ which would restore us to our rightful coordinated state. Instead, it’d be better to lean even further into mistake theory: discord is the default, and to prevent it we need to do the hard work of designing and implementing complicated alien coordination technologies.”

Allen: “You shouldn’t underestimate the value of conflict theory, though. It’s incredibly good at harnessing people’s tribal instincts towards actually doing something useful. We can’t be cold and rational all the time—we need emotionally salient motivations to get us fired up.”

Carl: “Right. So I don’t think we should get rid of Moloch as a rallying cry. But I do think that we should get rid of Moloch as a causal node in our ontologies: as a reason why the world is one way, rather than another. And I think we should be much more careful about terminology like ‘coordination failure’ or ‘inadequate equilibria’, which both mistakenly suggest that there’s a binary threshold between enough coordination and not-enough coordination. That’s like saying that cars which can go faster than 80 miles per hour are ‘adequate technology’, but cars which can’t are a ‘technology failure’. Maybe that’s occasionally a useful distinction, but it misses the bigger picture: that they’re actually very similar on almost all axes, because it takes so much complex technology to build a car at all.

“For Scott, there’s no better temple to Moloch than Las Vegas. But even there, my argument applies. You could look at Vegas and see Moloch’s hand at work. Or you could see Vegas as a product of the miraculous coordination technology that is modern capitalism—perhaps an edge case of it, but still an example of its brilliance. Or you could see Vegas as a testament to the wisdom of the constitution: casinos are banned almost everywhere in the US, but for the sake of diversity and robustness it sure seems like there should be at least one major city which allows them. Or you could see Vegas as an example of incredible restraint: there are innumerable possible ways to extract money from addled tourists in the desert, and Vegas prevents almost all of them. Or you could see it as a testament to the cooperative instinct inside humans: every day thousands of employees go to work and put in far more effort than the bare minimum it would take to not get fired. Setting aside the concept of Moloch makes it easier to see the sheer scale of coordination all around us, which is the crucial first step towards designing even better coordination technologies.

“In Las Vegas, Scott saw Moloch. But in Scott’s description of Moloch, I see Mot. We can do better than thinking of coordination as war and deicide. We can think of it as science, as engineering, as security—and as the gradual construction, as we sail down the river, of the ship that will take us across the sea.”

Holy the sea holy the desert holy the railroad holy the locomotive holy the visions holy the hallucinations holy the miracles holy the eyeball holy the abyss!
Holy forgiveness! mercy! charity! faith! Holy! Ours! bodies! suffering! magnanimity!
Holy the supernatural extra brilliant intelligent kindness of the soul!


We can do better than thinking of coordination as war and deicide. We can think of it as science, as engineering, as security—

I agree with this, very much.  But it's odd for Carl, who was pushing the idea of Mot from the beginning, to end by saying we shouldn't think in terms of these evil gods; I guess it's sort of a reductio ad absurdum.  (I wasn't the right audience for the evil-god framing in the first place; the original Moloch essay registered to me as "Interesting discussion of the dangers of optimization processes, although this Moloch stuff is a weird distraction.")

But I do think that we should get rid of Moloch as a causal node in our ontologies: as a reason why the world is one way, rather than another.

Do people think that way?  If they do, are they using "Moloch" as a shorthand for "some kind of competition-caused optimization process that I haven't necessarily thought about in detail", or as something else?  If it's the former, is that a problem?

Possibly yes, if the default is "there is no coordination mechanism so of course it'll be competition-optimized", and therefore "obviously if there's anything of value here, then there must exist coordination mechanisms, so your job is to explain why they failed, and if you haven't even attempted to discuss what coordination mechanisms existed, then you're not making a serious intellectual contribution here".  On the other hand, it does seem that, in lots of situations, there is in fact no relevant coordination mechanism beyond "the preexisting social inclinations of the participants".  This seems context-dependent in terms of where people mention Moloch and what they say about it (and I have little data on that).

As with 500 Million, But Not A Single One More, deicide is a versatile (and inspiring!) framing for human advancement. My objection to your conceptualization of Mot is that it'd be more satisfying to break him up into smaller gods who can be slain individually: one for the god preventing "skyscrapers literally miles high" (whom you can also blame for the Tower of Babel), one for the god of keeping humans from flight (responsible for the Fall of Icarus, now pretty thoroughly dead), one for smallpox (Sopona, also basically dead), etc.

There is an equivocation going on in the post that bothers me. Mot is at first the deity of lack of technology, where "technology" is characterized with the usual examples of hardware (wheels, skyscrapers, phones) and wetware (vaccines, pesticides). Call this, for lack of a better term, "hard technology". Later however, "technology" is broadened to include what I'll call "social technologies" – LLCs, constitutions, markets etc. One could also put in here voting systems (not voting machines, but e.g. first-past-the-post vs approval), PR campaigns, myths. Social technologies are those that coordinate our behaviour, for ill or good. They can be designed to avoid coordination failures (equilibria no individual wants), or to maximize profits, to maintain an equilibrium only a minority wants, etc. (The distinction between social and hard tech obviously has blurry boundaries – you'll notice I didn't mention IT because much of it is on the border between the two. But somewhat blurry boundaries don't automatically threaten a distinction.)

Broadening the definition is fine, but then stick to it. You swing between the two when, e.g. you claim that:

boosting technological progress is far more actionable than increasing coordination

This claim only makes sense if "technology" here excludes social tech. (By the way I'd love to see the numbers on this. I'm pretty skeptical. I'd be convinced of this claim when I see us allocate to e.g. voting reform, the kind of capital we're allocating to e.g. nuclear fusion. But of course capital allocation processes are... coordination processes. More on this next.)

I fully agree that coordination failures can be thought of as a type of technological failure (solving them requires all the technical prowess that other "hard" disciplines require). But they're a pretty distinct class of failure, and a distinctly important one for this reason: coordination failures tend to be upstream of other technological failures. What technology we build depends on how we coordinate. If we coordinate "well" we get the technologies (hard or social) we all want (+when/what order we want them), and none of the ones we don't want (on some account of what "we" refers to – BIG questions of justice/fairness here, set aside for another day).

 

I'm also bothered by some normative-descriptive ambiguity in the use of some terms. For example, technological progress is treated as some uni-directional thing here. This is plausibly (though not obviously) true if "progress" here is used normatively. It's definitely false if "progress" here is synonymous with something purely descriptive like "development" or better yet, "change." If technological development were uni-directional, you'd be hard pressed to account for good and bad technological developments. For such normative judgments to make any sense, there arguably had to be alternative developments we could have pursued, or at least developments we could have chosen not to pursue. See Acemoglu and Johnson for better understanding directions of technological development. Their work also provides examples of the additional predictive power gained by including the direction of technological development into one's models. Wheels? Net good. Engagement-maximizing social media? Arguably net bad.

Here is an alternative framing (easier for libertarian-leaning folks to swallow): regulation is a coordination technology. Effective regulation directs technological development toward the outcomes the regulator wishes for; good regulation directs development toward good outcomes. The two are obviously not always the same. The social engineer (e.g. mechanism designer) tackles the question of how to create effective regulation; the political philosopher tackles what kind of development should be pursued. (And while the political philosopher is working things out, democracy has been determined as a decent placeholder director).

Another normative-descriptive ambiguity that bothered me: the use of "coordination." In "coordination failure" "coordination" is at least weakly normative: the phrase describes a situation in which agents failed to behave collectively in the manner that would have maximized welfare – they failed to behave the way they should have in some sense. They didn't behave ideally. "Coordination" has a purely descriptive use too though, as meaning something like "patterns in the collective behaviour of agents." I'll italicize the normative use. An instance of *coordination* failure can also be an instance of coordination. For example, in Scott Alexander's fishermen story (example 3 in Meditations on Moloch), the agents' actions are coordinated by a market that failed to internalize the cost of pollution, and this results in a *coordination* failure. When you say that:

you could see Vegas as a product of the miraculous coordination technology that is modern capitalism—perhaps an edge case of it, but still an example of its brilliance.

I think there is an equivocation between coordination and *coordination* going on. Vegas is absolutely a fascinating example of capitalism's coordinating power, much like the peacock's feather is a fascinating example of sexual selection's coordinating power. But are either of these successful *coordination*? Much harder to say. (Not sure how to even begin answering the normative question in the peacock case). EDIT: Punchier example: the Stanford Prison Experiment is another fascinating example of role-playing's power to coordinate behaviour, but it sure doesn't seem like an example of successful *coordination*.

I'm probably missing the main point here, but as I see it, Moloch is still the issue. More technology won't save anyone, "more is better" is a very naive view. There's coordination issues around technology, it's being used incorrectly, and this is the main reason why so few are saved by it.

And this:

Mot is the god of sterility and lifelessness. To me, he represents the lack of technology in our lives

Seems like a contradiction. Technology is sterile and lifeless (assuming that game of life isn't alive just because it's moving). Make it autonomous, and it will be something lifeless with a higher darwinian fitness than you, which is why it will replace humanity.

If the main point is "Don't personalize vague concepts" or something, then the main message is probably unharmed by my nitpicking. I just believe that naive optimism about technology is highly dangerous.

“I think the illusion comes from Scott using a simplistic notion of coordination, as exemplified by his claim that ‘the opposite of a trap is a garden…

Isn't that true for nearly everything Scott writes?

I never got the impression he ever even wanted to cater to the small numbers of bonafide genius intellectuals that actively read public blogs.

So he uses many many simplifications, reductions, metaphors, etc..., to appeal to a wider audience.

Replacing very false notions with somewhat false notions is pretty much the maximum expectation in that case.

TLDR: Moloch is more compelling for two reasons:

  • Earth is at "starting to adopt the wheel" stage in the coordination domain.

    • tech is abundant coordination is not
  • Abstractly, inasmuch as science and coordination are attractors

    • A society that has fallen mostly into the coordination attractor might be more likely to be deep in the science attractor too (medium confidence)
    • coordination solves chicken/egg barriers like needing both roads and wheels for benefit
    • but possible to conceive of high coordination low tech societies
      • Romans didn't pursue sci/tech attractor as hard due to lack of demand

With respect to the attractor thing (post linked below)

And science feeds on itself, and feeds technology and is fed by technology. So it's no coincidence that a timeline which builds advanced microprocessors is also likely to possess airplanes. When you see aliens that have stainless steel, your first thought is not that they are specially adept with metals, but that they have wandered some little way into the science-technology attractor.

Earth is at "starting to adopt the wheel" stage in the coordination domain.

tech is abundant coordination is not

I'm not sure this is actually right, and I think coordination is in fact abundant compared to other animals. Indeed, the ability for humans to be super-cooperative and imitative of each other is argued by Henrich to be one of the major factors, if not the major factor for human dominance.

This is definitely subjective. Animals are certainly worse off in most respects and I disagree with using them as a baseline.

Imitation is not coordination, it's just efficient learning and animals do it. They also have simple coordination in the sense of generalized tit for tat (we call it friendship). You scratch my back I scratch yours.
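The tit-for-tat dynamic this comment describes can be sketched in a few lines of Python. (The payoff values, function names, and round count below are my own illustration, not anything from the thread.)

```python
# Minimal sketch: iterated prisoner's dilemma, showing how tit-for-tat
# ("you scratch my back, I scratch yours") sustains mutual cooperation.
# Payoffs are standard illustrative values with T > R > P > S.
PAYOFFS = {  # (my move, their move) -> my payoff; 'C' = cooperate, 'D' = defect
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(opponent_moves):
    """Cooperate first, then copy whatever the opponent did last round."""
    return 'C' if not opponent_moves else opponent_moves[-1]

def always_defect(opponent_moves):
    """Unconditionally defect."""
    return 'D'

def play(strat_a, strat_b, rounds=10):
    """Return the total payoff for each strategy over repeated play."""
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(moves_b)  # each side reacts to the other's history
        b = strat_b(moves_a)
        score_a += PAYOFFS[(a, b)]
        score_b += PAYOFFS[(b, a)]
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # mutual cooperation: (30, 30)
print(play(tit_for_tat, always_defect))  # defection pays once, then both stagnate: (9, 14)
```

Over ten rounds, two tit-for-tat players reach the cooperative payoff (30 each), while an unconditional defector grabs a one-round windfall and then locks both sides into the bad equilibrium. The catch is that this only works between agents who repeatedly encounter each other, which is exactly why scaling it past personal acquaintance takes cooperation technologies.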

Cooperation technologies allow similar things to scale beyond the number of people you can know personally. They bring us closer to the multi-agent optimal equilibrium, or at least the core (in the game-theory sense).

Examples of cooperation technologies:

  • Governments that provide public goods (roads, policing etc.)
  • Money/(Financial system)/(stock market)
    • game theory equivalent of "transferable utility".
  • Unions

So yes we have some well deployed coordination technologies (money/finance are the big successes here)

It's definitely subjective as to whether tech or cooperation is the less well deployed thing.

There are a lot of unsolved collective action problems though. Why are oligopolies and predatory businesses still a thing? Because coordinating to get rid of them is hard. If people pre-committed to going the distance with respect to avoiding lock-in and monopolies, would-be monopolists would just not do that in the first place.

While normal technology is mostly stuff and can usually be dumbed down so even the stupidest get some benefit, cooperation technologies may require people to actively participate/think. So deploying them is not so easy and may even be counterproductive. People also need to have enough slack to make them work.

we should write off the phrase ‘we could just agree’ as a particularly seductive appeal to magic.”

One could analogize it to "We could just extract the potential energy from the plentiful hydrogen atoms by combining them into helium atoms.  What's so complicated about that?"

The problem of designing a creature that never develops cancer is also analogous, at multiple levels, to that of designing an organization where everyone has an incentive to do their jobs well and it's robust to a handful of mal-intentioned individuals.

This is great: I now curse the names of both Mot and Moloch. May they be slain, and upon their corpses (or from them, à la Ymir, if you prefer), we will build utopia.