Currently rereading the series, so I thought I'd point out something. Thanks for the reading suggestions in the text :).

At the beginning of Dune, we are told that the Great Schools came into existence after the Butlerian Jihad to replace the functions served by machines. While these are claimed to be 'human' functions, they bear a remarkable similarity to the functions of machines, and we can roughly categorize them as follows:

Navigators: Mathematics, pure abstraction.

Mentats: Computation, simulation and rejection of inconsistent alternatives. Logic. Bayesian reasoning.

Suks: Moral apathy.

Bene Gesserit: Command architecture.

My main problem is with the suggestion that the Dune universe is internally aware of Goodhart's Law and attempts to rectify it, and that it does so by rejecting automation.

The existence of the Bene Gesserit and the ubiquitous spread of religion can be taken as counterexamples to this point. We know that their main tools are politics and religion, two fields of rhetoric that can be seen as essentially manipulative in nature; their design is to enforce behaviors favorable to the Bene Gesserit through sex, mind control (the Voice), and indoctrination. At numerous points in the Dune universe, the Bene Gesserit 'hack' human cognition with these tools, turning the people around them into pliable robots who do not think for themselves. We also know that there are multiple avenues by which this is done: the Orange Catholic Bible, the Missionaria Protectiva, the Zensunni and Zenshiite teachings, and the super-religion developed and realized by the Atreides. So to a large degree we can say that automation exists (in the sense of unthinking work); it is just that humans are doing it, not machines. To reiterate, it is unthinkingly bureaucratic because the laws are not open to interpretation; people have simply shifted from processing signals mechanically to processing them as religious fundamentals.

Of course, the Bene Gesserit also make an important distinction: those who are not Bene Gesserit are not 'human'. They have not been tested in the manner of the Gom Jabbar. So the BG have no qualms about how they treat the 'subhumans' around them, since those people are not the ultimate beneficiaries of their long-term vision of human survival. Ordinary humans are tools, but complex ones. Machines.

We may also take a moment to look at the Gom Jabbar as an example of an unthinking measurement of a signal that probably doesn't say all that much about humanity, and really only something about pain thresholds and early childhood training. We know that the Bene Gesserit disdain love and follow their codified beliefs so strictly that many would not consider rejecting their programming. So here they are put in the position of imposing a bad test which 'could' be wrong, but which is immutably automated by doctrine. The Fremen have the same custom. The Bene Gesserit, despite being 'human', are not immune to political programming via their Mother Superior.

With this in mind, I'd say that 'if' the Butlerian Jihad was aimed at ending automation, it totally failed. In fact, it failed so miserably that all it really did was make humans more like the machines they were trying to overcome. The ultimate realization of this is Muad'Dib's Imperium, where a man trapped in a deterministic course by prescience holds the Imperium under an extended theo-bureaucratic hegemony whose citizens are programmed by religious law. Arguably the point of Paul's story is that he finds a middle path between rejecting a rational objective, accepting it, and following a personal, irrational desire (irrational in the sense that it betrays the objective upheld by House Atreides, including the unwed Leto). He stays with the woman he loves, when what might have been better for humanity is a genuine marriage to Irulan Corrino.

That is possibly the decidedly human thing the Butlerian Jihad was fought for: a thinking being with a subjective rationale.


So what else might have happened in the Butlerian Jihad?

If we ignore the prequels, we have this account of the Jihad from Leto II's firsthand (ancestral) memories:

"We must reject the machines-that-think. Humans must set their own guidelines. This is not something machines can do. Reasoning depends upon programming, not on hardware, and we are the ultimate program!"


1. We are the ultimate program

From this we could say that the intention was never to remove programming, and thus automation; it was to follow the best program. So who had the best program? We can assume that the best program is the most reasonable and rational one.


2. Reasoning depends upon programming, not on hardware. This is not something machines can do.

Advances in computing can roughly be divided between hardware and software. The point made by this jihadi is that advances kept being made in machine hardware, so that machines became faster and more efficient, but at some point humans failed to create truly reasoning machines. At least, the jihadis believed so. The thinking machines they created were imperfect in some way, whether lacking emotion, creativity, consciousness, or even so much as the ability to choose, or to tell the difference between a turtle and a rifle. Humanity proved itself incapable of creating a machine as sophisticated as a human being.


3. We might guess at a reason for this by borrowing from other aspects of the Dune universe:

“Without change something sleeps inside us, and seldom awakens. The sleeper must awaken.”

Throughout the Dune series we are met with the idea of reaching our full potential through the application of stressors. It's basically a multi-book manifesto on post-traumatic growth. Whether it's the Fremen being shaped by Arrakis or the Atreides being shaped by the kindly tyranny of Leto II, the idea is that comfort, a lack of problems, leads downward into eternal stagnation. I would suggest that this is what happened in the Butlerian Jihad: humans reached a point where machines were as smart as they were going to get. Humans were not smart enough to make machines smarter, and humans weren't getting any smarter themselves. Because they had reached their technological apex, they lived decadent lives with few stressors. In a utopian society, they were essentially hamstrung, incapable of speeding up space flight beyond pre-spice levels. Much as at the end of Leto II's reign, their pent-up wanderlust (which we can assume is a real thing in the Dune universe) reached a tipping point. We don't know whether the spice had been discovered at this point, but it is probable, given the nature of the Dune universe, that in such a condition humanity looked for ways beyond machines, such as the spice, to break through the barriers imposed by technological dependency. Thus came the first scattering, which created the Corrino Imperium.

I basically believe this because it explains one of the most perplexing and otherwise inconsistent passages in the series:

"They were all caught up in the need of their race to renew its scattered inheritance, to cross and mingle and infuse their bloodlines in a great new pooling of genes. And the race knew only one sure way for this - the ancient way, the tried and certain way that rolled over everything in its path: jihad."

One might accept that there was resistance from machines directed by humans who opposed the Jihad, leading to a war (remember, machines do not have volition), or, as in the prequels, from machines operating according to the programming of the power-hungry.

It's almost like an anti-singularity: rather than machines outpacing human thinking, humans reached a point where they could not compute a way forward that included the existence of machines. So they removed that constant from the equation and formulated a new one. Much as with spice, they had developed a dangerous dependency that prevented them from expanding. The Jihad was an erasure of that dependency, just as the Atreides Jihad eventually erased the dependency on spice.

If there is any lesson in that for budding rationalists, I'd say it is this: it is dangerous to have a single source of prosperity, be it a machine or a drug. It creates a fragile system with an upper limit on progress. Instead, we need an options mindset that recognizes that rationality works best when paired with numerous choices. Accept that AI is not the sole means of automation available to us, nor is it inherently a bad option; there are simply other means we must consider in parallel, or the way before us may narrow into a single, terrible way forward: backward.