The more time that passes, the likelier it becomes that transhumanism and Singularity futurism will eventually find political expression. It's also likely that the various forms of rationalistic utilitarian altruism existing in certain corners of the Internet will eventually give rise to a distinctive ideology that will take its place in the spectrum of political views that count. It is even possible that some intersection of these two currents - the futurological rationalism on display at this site - will give rise to a politically minded movement or organization. This post, the earlier "Altruist Support" sequence by Giles, a few others show that there's some desire to do this. However, as things stand, this desire is still too weak and formless for anyone to actually do anything, and if anyone did become worked-up and fanatical enough to organize seriously, the result would most likely be an irrelevant farce, a psychodrama only meaningful to half a dozen people.
The current post combines: complete blindness with respect to what's involved in acquiring power at a national or international level; no sense of how embattled and precarious is the situation of futurist causes...
If it were that simple to take over the world, someone would have already done it. Whether this should update you in the direction of things not being so simple or in the direction of other conspiracies already controlling the world has been left as an exercise to the reader.
Conditional on a Conspiracy existing, the probability that they'd reveal themselves to an unknown person asking via e-mail has to be pretty low. What you obviously should have done instead is to brainstorm for five minutes on how you would really recruit new members if you were the Conspiracy, or alternately on what courses of action you could take to benefit the Conspiracy if it existed. But, like I said, it's too late now; instead, you've signaled that you're clever enough to come up with an idea but not disciplined enough to think it through properly, and that's precisely the type of member a Bayesian Conspiracy would wish to avoid.
This needs a safety hatch.
It is a recurring pattern in history for determined, well-intentioned people to seize power and then do damage. Certainly we're different because we're rational, but they were different because they were ${virtueTheyValueMost}. See also The Outside View and The Sorting Hat's Warning.
A conspiracy of rationalists is even more disturbing because of how closely it resembles an AI. As individuals, we balance moral logic based on our admittedly underspecified terminal values against moral intuition. But our intuitions do not match, nor do we communicate them easily. So collectively moral logic dominates. Pure moral logic without really good terminal values... we've been over this.
Don't worry. This is exactly what the Contrarian Conspiracy was designed to prevent.
Everything is going according to plan.
Imagine, also, how many lives are lost every day due to governmental negligence, and war, and poverty, and hunger
I was watching a hockey game with my ex-girlfriend when a fight broke out (on the ice, not between us). "That shouldn't be allowed!" she said. "It isn't," I responded. "It's a five minute penalty." "But the referees are just watching them fight. They should stop them from fighting!" "That's not an action. They can move their bodies and arms, and step between them, or pull them from behind. But 'making them stop' isn't something that a person can just decide to do. If they step between them now, someone could get hurt."
"Ending negligence" unfortunately isn't an action, unlike, say, typing. It's more like "stopping fighting".
Wow, this post shot LW's "politics is a mind-killer" policy in the head and jumped up and down on its corpse. That said, I'm at a loss about how I feel. This seems to me at once dangerously naive and blissfully idealistic. I do feel, though, that having a government/system like this in place would increase the chances of a positive singularity by a good margin, and that's nothing to scoff at.
The term “world domination” is, unfortunately, rather socially charged, bringing to mind an image of the archetypal mad scientist with marching robot armies.
Or a pair of laboratory mice, whose genes have been spliced.
Not feasible. Let's aim for a more modest goal, say, better PR and functional communities.
Moreover, not this community's comparative advantage. Why do we think we'd be any better than anyone else at running the world? And why wouldn't we be subject to free-riders, power-seekers, and rationalists-of-fortune if we started winning?
In addition to everything that's already been said: when the median rationalist is still struggling to get a date, the idea of winning popularity contests and infiltrating the domain of charismatic, glad-handing networkers is preposterous.
I submit that the primary mid-term goal of the Bayesian Conspiracy, at this stage of its existence, is and/or ought to be nothing less than world domination.
Before the rotten fruit begins to fly, let me make a brief clarification.
Is it odd that I laughed out loud at the idea that this should even be controversial?
I suppose the biggest question is, is all this realistic? Or is just an idealist's dream?
While beautifully written; it does sound all an idealist's dream. Or at least you have said very little to suggest otherwise.
More downvotes would send you to negative karma if there is such a place, and that's a harsh punishment for someone so eloquent. In sparing you a downvote, I encourage you to figure out what went wrong with this post and learn from it.
If there are three things I've found in my little time here, it is that the community strongly admires in posts...
That's quite all right; I'm sure the naivete blossoming forth from the OP makes that an easy mistake to make. :P
I'm well aware of the Discussion Section... which only compounds my error. Yes, this should have been posted there. Losing some eighty karma (by the way, apparently negative karma does not exist per se, but perhaps it does de facto) is as good a wake-up call as any for the sin of overconfidence.
I would have traded my karma simply for the advice you've given here. Thank you. And thank you for the compliment on my writing style; it's nice to see not everything about this experience was negative. I assure you that I will not be leaving any time soon. When I first saw that this post was getting a negative response, I made a split-second decision: should I flee, or should I learn? I chose to learn.
I agree that more rationality in politics would be a good thing, but I think this post is making too big a deal out of it. Eliezer said essentially the same thing, "rationalists ought to be more active in politics", much more succinctly here.
Are you a rationalist who feels like you could go into politics? Well then, go into politics if you think that's where your comparative advantage lies. See if you can get your local friends to support you. Getting the support of fellow rationalists is good, but the main thing is getting emotional support fr...
It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination (in the sense of world states that you talk about), and he seems to be structuring his conspiracy accordingly.
It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that the one will make correct decisions.
Really? How would one demonstrate this? What does it mean for a definition to be "correct"? If something is true by definition, is it really demonstrable?
...we have a moral obligation to work o
I read it and I thought it was amazingly similar to a lot of the thoughts and feelings I've had going through my head recently. Maybe this is just the emotion and folly of youth, but I feel like the world as a whole is very apathetic towards the suffering that exists outside of the bubble of the First World that LW exists in. How can you honestly choose cryonics over the utility of an organization built to protect human life until the singularity, along with Eliezer's group which works to ensure a positive singularity?
I recently saw a movie about governmen...
What if we had the infiltration of non-rationalists who, through their cunning, gained control, and we ended up with what we've got now? lol Or what if the road to being a rationalist is paved by the mistakes of the non-rationalists, e.g. the original thought to live forever may not have come from a rationalist, but can only be pursued by a rationalist? lolx2 Or worse, what if the richness of life is felt, or only made real, by the exquisite angst caused by the tension between rationality and mendacity?
Suppose that there were to exist such an entity as the Bayesian Conspiracy.
I speak not of the social group of that name, the banner under which rationalists meet at various conventions – though I do not intend to disparage that group! Indeed, it is my fervent hope that they may in due time grow into the entity which I am setting out to describe. No, I speak of something more like the “shadowy group of scientists” which Yudkowsky describes, tongue (one might assume) firmly in cheek. I speak of such an organization which has been described in Yudkowsky's various fictional works, the secret and sacred cabal of mathematicians and empiricists who seek unwaveringly for truth... but set in the modern-day world, perhaps merely the seed of such a school, an organization which can survive and thrive in the midst of, yet isolated from, our worldwide sociopolitical mess. I ask you, if such an organization existed, right now, what would – indeed, what should – be its primary mid-term (say, 50-100 yrs.) goal?
I submit that the primary mid-term goal of the Bayesian Conspiracy, at this stage of its existence, is and/or ought to be nothing less than world domination.
Before the rotten fruit begins to fly, let me make a brief clarification.
The term “world domination” is, unfortunately, rather socially charged, bringing to mind an image of the archetypal mad scientist with marching robot armies. That's not what I'm talking about. My usage of the phrase is intended to evoke something slightly less dramatic, and far less sinister. “World domination”, to me, actually describes rather a loosely packed set of possible world-states. One example would be the one I term “One World Government”, wherein the Conspiracy (either openly or in secret) is in charge of all nations via an explicit central meta-government. Another would be a simple infiltration of the world's extant political systems, followed by policy-making and cooperation which would ensure the general welfare of the world's entire population – control de facto, but without changing too much outwardly. The common thread is simply that the Conspiracy becomes the only major influence in world politics.
(Forgive my less-than-rigorous definition, but a thorough examination of the exact definition of the word “influence” is far, far outside the scope of this article.)
So there is my claim. Let me tell you why I believe this is the morally correct course of action.
Let us examine, for a moment, the numerous major good works which are currently being openly done by rationalists, or by those who may not self-identify as rationalists but whose dogmas and goals accord with ours. We have the Singularity Institute, which is concerned with ensuring that our technological, transhumanistic advent happens smoothly and with a minimum of carnage. We have various institutions worldwide advocating and practicing cryonics, which offers a non-zero probability of recovery from death. We have various institutions also who are working on life extension technologies and procedures, which offer to one day remove the threat of death entirely from our world.
All good things, I say. I also say: too slow!
Imagine what more could be accomplished if the United States, for example, granted to the Life Extension Foundation or to Alcor the amount of money and social prominence currently reserved for military purposes. Imagine what would happen if every scientist around the world were perhaps able to contribute under a unified institution, working on this vitally important problem of overcoming death, with all the money and time the world's governments could offer at their disposal.
Imagine, also, how many lives are lost every day due to governmental negligence, and war, and poverty, and hunger. What does it profit the world, if we offer to freeze the heads of those who can afford it, while all around us there are people who can't even afford their bread and water?
I have what is, perhaps, to some who are particularly invested, an appalling and frightening proposition: for the moment, we should devote fewer of our resources to cryonics and life extension, and focus on saving the lives of those to whom these technologies are currently beyond even a fevered dream. This means holding the reins of the world, that we might fix the problems inherent in our society. Only when significant steps have been taken in the direction of saving life can we turn our focus toward extending life.
What should the Bayesian Conspiracy do, once it comes to power? It should stop war. It should depose murderous despots, and feed the hungry and wretched who suffered under them. Again: before we work on extending the lives of the healthy and affluent beyond what we've so far achieved, we should, for example, bring the average life expectancy in Africa above the 50-year mark, where it currently sits (according to a 2006 study in the BMJ). This is what will bring about the maximum level of happiness in the world; not cryonics for those who can afford it.
Does this mean that we should stop researching these anti-death technologies? No! Of course not! Consider: even if cryonics drops to, say, priority 3 or 4 under this system, once the Conspiracy comes to power, that will still be far more support than it's currently receiving from world governments. The work will end up progressing at a far faster rate than it currently does.
Some of you may have qualms about this plan of action. You may ask, what about individual choice? What about the peoples' right to choose who leads them? Well, for those of us who live in the United States, at least, this is already a bit of a naïve question: due to color politics, you already do not have much of a choice in who leads you. But that's a matter for another time. Even if you think that dictatorship – even benevolent, rationalist dictatorship – would be inherently morally worse than even the flawed democratic system we enjoy here – a notion that may not even necessarily be the case! – do not worry: there's no reason why world domination need entail dictatorships. In countries where there are democratic systems in place, we will work within the system, placing Conspirators into positions where they can convince the people, via legitimate means, to give them public office. Once we have attained a sufficient level of power over this democratic system, we will effect change, and thence the work will go forth until this victory of rationalist dogma covers all the earth. When there are dictators, they will be removed and replaced with democratic systems... under the initial control of Conspirators, of course, and ideally under their continued control as time passes – but legitimately obtained control.
It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that the one will make correct decisions. Therefore, the people who make decisions that affect large numbers of people ought to be those who have the highest level of rationality. In this way we can seek to avoid the many, many, many pitfalls of politics, including the inefficiency which Yudkowsky has again and again railed against. If all the politicians are on the same side, who's to argue?
In fact, even if two rationalists disagree on a particular point (which they shouldn't, but hey, even the best rationalists aren't perfect yet), they'll be able to operate more efficiently than two non-rationalists in the same position. Is the disagreement able to be settled by experiment? If it's important, throw funds at a lab to conduct such an experiment! After all, we're in charge of the money and the scientists. Is it not? Find a compromise that has the maximum expected utility for the constituents. We can do that with a high degree of accuracy; we have access to the pollsters and sociologists, and know about reliable versus unreliable polling methods!
What about non-rationalist aspiring politicians? Well, under an ideal Conspiracy takeover, there would be no such thing. Lessons on politics would include rationality as a basis; graduation from law school would entail induction into the Conspiracy, and access to the truths had therein.
I suppose the biggest question is, is all this realistic? Or is just an idealist's dream? Well, there's a non-zero probability that the Conspiracy already exists, in which case, I hope that they will consider my proposal... or, even better, I hope that I've correctly deduced and adequately explained the master plan. If the Conspiracy does not currently exist, then if my position is correct, we have a moral obligation to work our hardest on this project.
“But I don't want to be a politician,” you exclaim! “I have no skill with people, and I'd much rather tinker with the Collatz Conjecture at my desk for a few years!” I'm inclined to say that that's just too bad; sacrifices must be made for the common good, and after all, it's often said that anyone who actually wants a political office is by that fact unfit for the position. But in all realism, I'm quite sure that there will be enough room in the Conspiracy for non-politicians. We're all scientists and mathematicians at heart, anyway.
So! Here is our order of business. We must draw up a charter for the Bayesian Conspiracy. We must invent a testing system able to keep a distinction between those who are and are not ready for the Truths the Conspiracy will hold. We must find our strongest Rationalists – via a testing procedure we have not yet come up with – and put them in charge, and subordinate ourselves to them (not blindly, of course! The strength of community, even rationalist community, is in debate!). We must establish schools and structured lesson plans for the purpose of training fresh students; we must also take advantage of those systems which are already in place, and utilize them for (or turn them to) our purposes. I expect to have the infrastructure set up in no more than five years.
At that point, our real work will begin.