Suppose that there were to exist such an entity as the Bayesian Conspiracy.

I speak not of the social group of that name, the banner under which rationalists meet at various conventions – though I do not intend to disparage that group! Indeed, it is my fervent hope that they may in due time grow into the entity which I am setting out to describe. No, I speak of something more like the “shadowy group of scientists” which Yudkowsky describes, tongue (one might assume) firmly in cheek. I speak of such an organization which has been described in Yudkowsky's various fictional works, the secret and sacred cabal of mathematicians and empiricists who seek unwaveringly for truth... but set in the modern-day world, perhaps merely the seed of such a school, an organization which can survive and thrive in the midst of, yet isolated from, our worldwide sociopolitical mess. I ask you, if such an organization existed, right now, what would – indeed, what should – be its primary mid-term (say, 50-100 yrs.) goal?

I submit that the primary mid-term goal of the Bayesian Conspiracy, at this stage of its existence, is and/or ought to be nothing less than world domination.

Before the rotten fruit begins to fly, let me make a brief clarification.

The term “world domination” is, unfortunately, rather socially charged, bringing to mind an image of the archetypal mad scientist with marching robot armies. That's not what I'm talking about. My usage of the phrase is intended to evoke something slightly less dramatic, and far less sinister. “World domination”, to me, actually describes rather a loosely packed set of possible world-states. One example would be the one I term “One World Government”, wherein the Conspiracy (either openly or in secret) is in charge of all nations via an explicit central meta-government. Another would be a simple infiltration of the world's extant political systems, followed by policy-making and cooperation which would ensure the general welfare of the world's entire population – control de facto, but without changing too much outwardly. The common thread is simply that the Conspiracy becomes the only major influence in world politics.

(Forgive my less-than-rigorous definition, but a thorough examination of the exact definition of the word “influence” is far, far outside the scope of this article.)

So there is my claim. Let me tell you why I believe this is the morally correct course of action.

Let us examine, for a moment, the numerous major good works currently being openly done by rationalists, or by those who may not self-identify as rationalists but whose dogmas and goals accord with ours. We have the Singularity Institute, which is concerned with ensuring that our technological, transhumanistic advent happens smoothly and with a minimum of carnage. We have various institutions worldwide advocating and practicing cryonics, which offers a non-zero probability of recovery from death. We also have various institutions working on life extension technologies and procedures, which offer to one day remove the threat of death entirely from our world.

All good things, I say. I also say: too slow!

Imagine what more could be accomplished if the United States, for example, granted to the Life Extension Foundation or to Alcor the amount of money and social prominence currently reserved for military purposes. Imagine what would happen if every scientist around the world were perhaps able to contribute under a unified institution, working on this vitally important problem of overcoming death, with all the money and time the world's governments could offer at their disposal.

Imagine, also, how many lives are lost every day due to governmental negligence, and war, and poverty, and hunger. What does it profit the world, if we offer to freeze the heads of those who can afford it, while all around us there are people who can't even afford their bread and water?

I have what is, perhaps, to some who are particularly invested, an appalling and frightening proposition: for the moment, we should devote fewer of our resources to cryonics and life extension, and focus on saving the lives of those to whom these technologies are currently beyond even a fevered dream. This means holding the reins of the world, that we might fix the problems inherent in our society. Only when significant steps have been taken in the direction of saving life can we turn our focus toward extending life.

What should the Bayesian Conspiracy do, once it comes to power? It should stop war. It should depose murderous despots, and feed the hungry and wretched who have suffered under them. Again: before we work on extending the lives of the healthy and affluent beyond what we've so far achieved, we should, for example, bring the average life expectancy in Africa above the 50-year mark, where it currently sits (according to a 2006 study in the BMJ). This is what will bring about the maximum level of happiness in the world; not cryonics for those who can afford it.

Does this mean that we should stop researching these anti-death technologies? No! Of course not! Consider: even if cryonics drops to, say, priority 3 or 4 under this system, once the Conspiracy comes to power, that will still be far more support than it's currently receiving from world governments. The work will end up progressing at a far faster rate than it currently does.

Some of you may have qualms about this plan of action. You may ask, what about individual choice? What about the people's right to choose who leads them? Well, for those of us who live in the United States, at least, this is already a bit of a naïve question: due to color politics, you already do not have much of a choice in who leads you. But that's a matter for another time. Even if you think that dictatorship – even benevolent, rationalist dictatorship – would be inherently morally worse than even the flawed democratic system we enjoy here – a notion that may not even necessarily be the case! – do not worry: there's no reason why world domination need entail dictatorships. In countries where there are democratic systems in place, we will work within the system, placing Conspirators into positions where they can convince the people, via legitimate means, to give them public office. Once we have attained a sufficient level of power over this democratic system, we will effect change, and thence the work will go forth until this victory of rationalist dogma covers all the earth. Where there are dictators, they will be removed and replaced with democratic systems... under the initial control of Conspirators, of course, and ideally under their continued control as time passes – but legitimately obtained control.

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that one will make correct decisions. Therefore, the people who make decisions that affect large numbers of people ought to be those who have the highest level of rationality. In this way we can seek to avoid the many, many, many pitfalls of politics, including the inefficiency which Yudkowsky has again and again railed against. If all the politicians are on the same side, who's to argue?

In fact, even if two rationalists disagree on a particular point (which they shouldn't, but hey, even the best rationalists aren't perfect yet), they'll be able to operate more efficiently than two non-rationalists in the same position. Can the disagreement be settled by experiment? If it's important, throw funds at a lab to conduct such an experiment! After all, we're in charge of the money and the scientists. Can it not? Then find a compromise that has the maximum expected utility for the constituents. We can do that with a high degree of accuracy; we have access to the pollsters and sociologists, and know about reliable versus unreliable polling methods!

What about non-rationalist aspiring politicians? Well, under an ideal Conspiracy takeover, there would be no such thing. Lessons on politics would include rationality as a basis; graduation from law school would entail induction into the Conspiracy, and access to the truths had therein.

I suppose the biggest question is, is all this realistic? Or is it just an idealist's dream? Well, there's a non-zero probability that the Conspiracy already exists, in which case, I hope that they will consider my proposal... or, even better, I hope that I've correctly deduced and adequately explained the master plan. If the Conspiracy does not currently exist, then if my position is correct, we have a moral obligation to work our hardest on this project.

“But I don't want to be a politician,” you exclaim! “I have no skill with people, and I'd much rather tinker with the Collatz Conjecture at my desk for a few years!” I'm inclined to say that that's just too bad; sacrifices must be made for the common good, and after all, it's often said that anyone who actually wants a political office is by that fact unfit for the position. But in all realism, I'm quite sure that there will be enough room in the Conspiracy for non-politicians. We're all scientists and mathematicians at heart, anyway.

So! Here is our order of business. We must draw up a charter for the Bayesian Conspiracy. We must invent a testing system able to distinguish those who are ready for the Truths the Conspiracy will hold from those who are not. We must find our strongest Rationalists – via a testing procedure we have not yet come up with – and put them in charge, and subordinate ourselves to them (not blindly, of course! The strength of community, even rationalist community, is in debate!). We must establish schools and structured lesson plans for the purpose of training fresh students; we must also take advantage of those systems which are already in place, and utilize them for (or turn them to) our purposes. I expect to have the infrastructure set up in no more than five years.

At that point, our real work will begin.

80 comments

The more time that passes, the likelier it becomes that transhumanism and Singularity futurism will eventually find political expression. It's also likely that the various forms of rationalistic utilitarian altruism existing in certain corners of the Internet will eventually give rise to a distinctive ideology that will take its place in the spectrum of political views that count. It is even possible that some intersection of these two currents - the futurological rationalism on display at this site - will give rise to a politically minded movement or organization. This post, the earlier "Altruist Support" sequence by Giles, and a few others show that there's some desire to do this. However, as things stand, this desire is still too weak and formless for anyone to actually do anything, and if anyone did become worked-up and fanatical enough to organize seriously, the result would most likely be an irrelevant farce, a psychodrama only meaningful to half a dozen people.

The current post combines: complete blindness with respect to what's involved in acquiring power at a national or international level; no sense of how embattled and precarious is the situation of futurist causes...

I'm being pulled off to bed, but from my skimming this looks like a very, very helpful critique. Thank you for posting it; I'll peruse it as soon as I'm able. One note: I did note after posting this, but too late to make a meaningful change, that "we should support cryonics less" is rather a ridiculous notion, considering the people I'm talking to are probably not the same people who are working hardest on cryonics. So: oops.
What does this mean, exactly? It's something that without thinking about it I seem to intuitively understand, but my thinking falls apart when I try to examine it closely. It's like zooming in on a picture and finding that no further pixels are part of the data.
Originally I wrote "It is inevitable that" there will be a politics of the Singularity. But it's possible (e.g. AI hard takeoff) that a singularity could happen before the political culture digests the concept. So there are two ways in which more time equals more futurism in politics. First, the further into the human future we go, the more futurist becomes the general sensibility. Second, the longer the posthuman future holds off, the more time there is for this evolution of human culture to occur.
It's interesting reading this old comment in light of Effective Altruism.

If it were that simple to take over the world, someone would have already done it. Whether this should update you in the direction of things not being so simple or in the direction of other conspiracies already controlling the world has been left as an exercise to the reader.

If there is such a Conspiracy, you've already failed the most obvious membership test.

  • First Rule: p(A|X) = p(X|A)p(A) / (p(X|A)p(A) + p(X|~A)p(~A))
  • Second Rule: You DO NOT talk about Bayes Club.
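For concreteness, the First Rule is easy to turn into working code. A minimal sketch (the function name is my own invention; the example numbers are the standard mammography teaching example from Yudkowsky's "Intuitive Explanation of Bayes' Theorem", not anything from this thread):

```python
def posterior(p_x_given_a, p_a, p_x_given_not_a):
    """Bayes' theorem: p(A|X) = p(X|A)p(A) / (p(X|A)p(A) + p(X|~A)p(~A))."""
    numerator = p_x_given_a * p_a
    denominator = numerator + p_x_given_not_a * (1 - p_a)
    return numerator / denominator

# 1% prevalence, 80% sensitivity, 9.6% false-positive rate:
print(round(posterior(0.8, 0.01, 0.096), 3))  # 0.078
```

Note how small the posterior stays despite the impressive-sounding 80% sensitivity; presumably this is just the sort of result the Second Rule exists to keep quiet.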
Ha! Yes, I had this thought as well. I actually messaged Yudkowsky, warning him that I was considering posting this, on the off chance that a) the Conspiracy existed, b) he was among their ranks, and c) he wanted me to not stir up this possibility. I waited for a response for a period of time consistent with my estimation of the probability of the Conspiracy existing in an incarnation that would meaningfully object.

Conditional on a Conspiracy existing, the probability that they'd reveal themselves to an unknown person asking via e-mail has to be pretty low. What you obviously should have done instead is to brainstorm for five minutes on how you would really recruit new members if you were the Conspiracy, or alternately on what courses of action you could take to benefit the Conspiracy if it existed. But, like I said, it's too late now – instead, you've signaled that you're clever enough to come up with an idea but not disciplined enough to think it through properly, and that's precisely the type of member a Bayesian Conspiracy would wish to avoid.

Indeed, to the extent that members of the Conspiracy reason similarly, they do not need to communicate at all.
Your chastisement is well taken. Thank you.

This needs a safety hatch.

It is a recurring pattern in history for determined, well-intentioned people to seize power and then do damage. Certainly we're different because we're rational, but they were different because they were ${virtueTheyValueMost}. See also The Outside View and The Sorting Hat's Warning.

A conspiracy of rationalists is even more disturbing because of how closely it resembles an AI. As individuals, we balance moral logic based on our admittedly underspecified terminal values against moral intuition. But our intuitions do not match, nor do we communicate them easily. So collectively moral logic dominates. Pure moral logic without really good terminal values... we've been over this.


Don't worry. This is exactly what the Contrarian Conspiracy was designed to prevent.

Everything is going according to plan.

Huh. An interesting point, and one that I should have considered. So what would you suggest as a safety hatch?
I don't know, but I'll throw some ideas up. These aren't all the possibilities, and probably don't include the best possibility. Each step must be moral taken in isolation: no it'll-be-worth-it-in-ten-years reasoning, since that can go especially horribly wrong. Work honestly within the existing systems; this allows existing safeguards to apply (on the other hand, it assumes it's possible to get anything done within existing systems by being honest). Establish some mechanism to keep moral intuition in the loop: secret-ballot, mandatory does-this-feel-right votes. Divide into several conspiracies which are forbidden to discuss issues with each other, preventing groupthink. Have an oversight conspiracy, with the power to shut us down if they believe we've gone evil.

Imagine, also, how many lives are lost every day due to governmental negligence, and war, and poverty, and hunger

I was watching a hockey game with my ex-girlfriend when a fight broke out (on the ice, not between us). "That shouldn't be allowed!" she said. "It isn't," I responded. "It's a five minute penalty." "But the referees are just watching them fight. They should stop them from fighting!" "That's not an action. They can move their bodies and arms, and step between them, or pull them from behind. But 'making them stop' isn't something that a person can just decide to do. If they step between them now, someone could get hurt."

"Ending negligence" unfortunately isn't an action, unlike, say, typing. It's more like "stopping fighting".

That's quite true. But I have a hunch (warning: bare assertion) that much governmental negligence is due to a) self-interest and b) corruption (see: corrupt African dictatorships).
Somehow I missed seeing your comment (I think), and said what amounts to basically the same thing a few hours later elsewhere. The way I put it was more hopeless and forgiving, though, implying that a lot of corruption is inevitable and we should judge actual governments against the ideal government, which would also have a lot of negligence, just less.

(Warning: political comment ahead.) I had an insight recently about why I approved of the conclusions of certain conservative or libertarian arguments less often than one would think given my agreement with the premises. (I'm not giving the percentages or my aggregate leanings here; I think it works regardless of what they are.) Namely, I realized that many valid anti-government arguments are mostly anti-bureaucracy arguments. Bureaucracy is still a cost of privatization, just less of one, and it is roughly inversely proportional to the number of businesses that would fill the economic function if the government didn't. So my intuitions (far view, compartmentalizations) were correct this time, and accounted for the hidden cost of the options that lessened or minimized bureaucracy. Baselines are very important, and it's also important to note victories of the compartmentalization heuristic, for those like me who are inclined the other way.

Now I will indulge in a few words about the role of fighting in professional hockey. It would be easy for me to say that all of the anti-fighting arguments I've heard are either foolish, naive, dismissive of obvious unintended consequences, contemptuous towards evidence, deontological, and/or unaware of human nature. Some genuinely militate against fighting, but are weak, so I don't believe I'm seeing arguments as soldiers too much. However, one connotation of the above hypothetical statement would be false, for I have generated an argument from the wreckage of what I have heard, in an attempt to have sound conclusions. This doesn't happen very often, so it's worth noticing and mentioning.
Not to go too far off-topic here, but it would be trivial for the league to prevent fighting; just impose real penalties, like ejection from the game and/or suspension from future games. That's how most other professional sports work, and, not surprisingly, there aren't typically fights during the game in those sports (even in physically aggressive ones like football and basketball.) I don't see why one would expect the implementation of such a rule in hockey to result in anything different. Whether or not you think that ice hockey without fighting would have a "dramatically slash[ed]" entertainment value, is, I suppose, a matter of opinion.
I didn't say that fighting is entertaining, but that fighting maintains safety, and many unrelated safety measures would reduce entertainment. Less fighting is probably a means, yes? The end is well-being and safety? It's how most hockey leagues and tournaments work, allowing for even better comparisons.
Got it, I think I misunderstood your position about fighting and safety. I get your point now. Thanks!
It's not often that a link is appropriate and on topic at LW. So let's savor the moment! There was no norm to stop the conflict with a one-on-one fight ended by referees after the parties were tired, nor a secondary one for the conflict to be between all members on the court paired off 5v5, so it went straight to a bench-clearing brawl. Assuming nuclear arsenals were universal and impossible to disarm, I would be wary of extremist conventional arms control.
Peter Wildeford:
Don't underestimate the concept of people just not thinking through their actions. People who are guilty of negligence are the ones who simply failed to properly secure their beliefs, not the ones who deliberately decided they benefited from killing others.

Wow, this post shot LW's "politics is the mind-killer" policy in the head and jumped up and down on its corpse. That said, I'm at a loss about how I feel. This seems to me at once dangerously naive and blissfully idealistic. I do feel, though, that having a government/system like this in place would increase the chances of a positive singularity by a good margin, and that's nothing to scoff at.

The term “world domination” is, unfortunately, rather socially charged, bringing to mind an image of the archetypal mad scientist with marching robot armies.

Or a pair of laboratory mice, whose genes have been spliced.


Not feasible. Let's aim for a more modest goal, say, better PR and functional communities.

Moreover, not this community's comparative advantage. Why do we think we'd be any better than anyone else at running the world? And why wouldn't we be subject to free-riders, power-seekers, and rationalists-of-fortune if we started winning?

We think we'd be better at running the world because we think rationalists should be better at pretty much everything that benefits from knowing the truth. If we didn't believe that we wouldn't be (aspiring) rationalists. And just because we couldn't do it perfectly doesn't mean we're not better than the alternatives.
Overconfidence seems like a poor qualification.
And yet confidence seems a good one. The question is how much is too much, which can really only be verified after the fact.
I wonder how well a group whose members didn't study how to think and instead devoted themselves to not letting emotions interfere with their decisions would do. All its work would be advances, I think - there would be no analog to the "valley of rationality" in which people lost touch with their intuitions and made poor decisions.
I dispute your claim. In fact, I would assert the exact opposite: that attempting to remove emotions from decisionmaking is what causes the "valley of rationality." Furthermore, I suspect it is a necessary transitional phase, comparable in its horrific necessity to the process of re-breaking a mangled shinbone so that it can heal straight.
I'm well disposed towards your viewpoint on that. I disagree with the implication of this. I think the main causes are misusing tools like Bayesian updating and considering what a rationalist would do, and trying to do that. Insofar as poorly calibrated emotions are part of the problem, one must subtract the problems that would have been caused by them under non-aspiring rationalist conditions from those under aspiring-rationalist conditions to calculate what the aspiring rationalism is responsible for. I don't think this usually leaves much left over, positive or negative.
Functional communities would be nice. I'm not so sure that better PR is the way to go. Why not no PR? Why not subtle induction via existing infrastructure? Let the people who most deserve to be here be the ones who will find us. Let us not go out with blaring trumpet, but with fishing lure.
What specific concerns make you disagree with its feasibility?
We have neither the numbers, the organizational skill, nor the social skills to be good at this. There is a joke that organizing libertarians is like herding cats, and the same principle seems to be partly true here for the same reason: LW draws a lot of smart, contrarian people. Unless there is a technological way to conquer the world, say the Singularity, but that demands an entirely different organizational strategy, namely channeling all efforts into FAI.

In addition to everything that's already been said: when the median rationalist is still struggling to get a date, the idea of winning popularity contests and infiltrating the domain of charismatic, glad-handing networkers is preposterous.

First, [citation needed]. Second, if it's true, perhaps one should look at oneself and ask why.
First, it really isn't. I'm making a generalization about a group I'm familiar with. Second, I don't struggle to get dates.
More importantly, if works such as "The Thick Of It" are to be believed, politicians actually don't get much tail at all, on average, what with being married to the job.

I submit that the primary mid-term goal of the Bayesian Conspiracy, at this stage of its existence, is and/or ought to be nothing less than world domination.

Before the rotten fruit begins to fly, let me make a brief clarification.

Is it odd that I laughed out loud at the idea that this should even be controversial?

I suppose the biggest question is, is all this realistic? Or is it just an idealist's dream?

While beautifully written, it does sound like an idealist's dream. Or at least you have said very little to suggest otherwise.

More downvotes would send you to negative karma if there is such a place, and that's a harsh punishment for someone so eloquent. In sparing you a downvote, I encourage you to figure out what went wrong with this post and learn from it.

If there are three things I've found in my little time here, it is that the community strongly admires in posts...

..... I will meditate on this constructive criticism. Thank you very much; I think this is the most useful response I've seen.
Peter Wildeford:
I feel sorry you had to learn this by being taken for every karma point you own. I strongly suggest you make use of the Discussion Section for your future posts; that's a great place to learn what does work and what doesn't. My first two posts got downvoted, but I didn't lose out because votes are only -1 karma there. Read the LW About Page if you haven't already.

And remember that karma is not the end-all be-all of LW. I think you benefitted a lot by trading your karma for knowledge of how the LW community works. Karma itself is not a terminal value, but a means to fitting in with LW, which is also not a terminal value, but a means to furthering your rationality, which is also likely not a terminal value, but a means to getting better at satisfying your goals.

To further clarify my criticism, just to make sure your karma freefall was worth it: this post would have benefitted by, among other things, being fifty times more practical – what do you think is the first step toward gaining control of all the world's institutions? If you don't even know that, why are you writing about world domination? Maybe you can talk a lot more about how to sell rationality to the public so that they react to the conspiracy favorably rather than negatively, for instance.

I think you have a gift for writing in a very eloquent, enjoyable manner, so I would hate for you to leave just because of one fiasco. I implore you to reflect, refocus, and give it another shot.
Peter Wildeford:
By the way, sorry that this comment treats you like you're new to LW -- I can see from going through your comment and post history that you're not. My mistake.

That's quite all right; I'm sure the naivete blossoming forth from the OP makes that an easy mistake to make. :P

I'm well aware of the Discussion Section... which only compounds my error. Yes, this should have been posted there. Losing some eighty Karma (by the way, apparently negative Karma does not exist per se, but perhaps it does de facto...) is as good a wakeup call as any for the sin of overconfidence.

I would have traded my karma simply for the advice you've given here. Thank you. And thank you for the compliment on my writing style; nice to see not everything about this experience was negative. I assure you that I will not be leaving any time soon. When I first saw that this post was getting a negative response, I made a split-second decision: should I flee, or should I learn? I chose to learn.

I think even though the karma counter never goes below zero, downvotes still count, and it won't go above zero until you get enough upvotes to cancel them out.
I can confirm that hypothesis; I'm still at zero, even though the grandfather to this post has received 4 points, given after I lost all my karma. Actually, this is a bit of an annoyance; I have no way to gauge how far I have to go to get into the positives...
As long as you didn't delete any other comments/posts, you can figure out what your karma is by adding up everything else.
I am still relatively new to LW, though – or else I'm just not very good at picking up on social values – so I'll ask this question of you: what stigma would be attached to my decision to delete this post? I don't want to do it just to get my Karma back; I'm willing to accept the consequences of my mistake. On the pro side, this would no longer come up under my posts, and so people who have not already seen it would fail to judge me by it. This is only a positive because I have in fact learned much from the response, and plan to act upon those lessons. On the con side, it might be viewed as... I almost want to say cowardly? Failing to take responsibility for my actions? Running away? I'm not sure, though, what the implications of that action would be to the rest of the community, so I need an outside opinion. EDIT: I recognize that it is good to acknowledge that I have made stupid decisions for bad reasons. I do not know if it is a virtue to keep your mistakes around and visible.
Good, because it wouldn't do that.
Oh, good. :3 I was worried that doing so would give that false implication.
If this potential confusion is your real reason and not a convenient rationalization, I would suggest an EDIT along the lines of "the responses convinced me that this position was not a good one to hold, and I no longer think that a Bayesian conspiracy is a good idea outside of the HPMoR fanfiction". If you still hold that it is, then bear it like the rationalist you aspire to be, since you presumably examined this model of action with utmost care, to avoid any biases. EDIT: I certainly do not plan to delete my discussion post with negative karma, though I did retract (not delete) one rather poorly thought out comment previously.
Ha! Now I feel like a noob. How do I edit a top-level post? :3
If you click on your nick, you will see it among your other posts, and you can edit it there, I suppose.
Thank you.
Peter Wildeford:
I'm newer than you and have not yet braved into the "Main" section, so I don't really know. I didn't know deleting a post could get you the karma back, that seems like a bad policy and counterproductive to what karma is supposed to do. Still, I think you've "learned your lesson", so to speak, so I personally wouldn't mind at all.
Apparently it can't, which is a good thing, upon reflection.
Your list is good. I would also add that references to relevant studies are valued. The OP was novel enough. On the A-F scale I give it a B in the novelty category. No mercy, a cold-blooded judgement: neither a B- nor a B+.
Peter Wildeford:
I would put references to relevant studies under "specifics", but it is definitely something I should have highlighted.
I downvoted the OP. A major turn-off for me was the amount of rhetorical flourish. While well-written posts should include some embellishment for clarity and engagement, when there's this much of it, the alarm bells go off...what is this person trying to convince me of by means other than reasoned argument? See also: the dark arts.
Peter Wildeford:
Maybe, but I think that's just because the post was also low on specifics. If Arandur brought the flourish and the specifics, I think it would be great, and would balance out the other stuff that can appear boring, dry, and overly technical. Though it could just be a difference in preferences.

I agree that more rationality in politics would be a good thing, but I think this post is making too big of a deal out of it. Eliezer said essentially the same thing, "rationalists ought to be more active in politics", much more succinctly here.

Are you a rationalist who feels like you could go into politics? Well then, go into politics if you think that's where your comparative advantage lies. See if you can get your local friends to support you. Getting the support of fellow rationalists is good, but the main thing is getting emotional support fr...

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination (in the sense of world states that you talk about), and he seems to be structuring his conspiracy accordingly.

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that one will make correct decisions.

Really? How would one demonstrate this? What does it mean for a decision to be "correct"? If something is true by definition, is it really demonstrable?

we have a moral obligation to work our hardest on this project

...
I hadn't considered that, but now I see it clearly. How interesting. Ha! If that would work, maybe it'd be a good idea. But no, pointing out a moral obligation is not the same as guilting. Guilting would be me messaging you, saying "See that poor starving African woman? If you had listened to my plan, she'd be happier." But I won't be doing that.

I read it and I thought it was amazingly similar to a lot of the thoughts and feelings I've had going through my head recently. Maybe this is just the emotion and folly of youth, but I feel like the world as a whole is very apathetic towards the suffering that exists outside of the bubble of the First World that LW exists in. How can you honestly choose cryonics over the utility of an organization built to protect human life until the singularity, along with Eliezer's group which works to ensure a positive singularity?

I recently saw a movie about governmen...

Why was this downvoted?

What if we had the infiltration of non-rationalists who, through their cunning, gained control, and we ended up with what we've got now? lol Or what if the road to being a rationalist is paved by the mistakes of the non-rationalists, e.g. the original thought to live forever may not have come from a rationalist, but can only be determined by a rationalist? lolx2 Or worse, the richness of life is felt, or only made real, by the exquisite angst that is caused by the tension between rationality and mendacity?

(glances left and right)


Bayesian Conspiracy @ Burning Man 2011, a social group? Ha.

I do apologize if I've given offense; not having had the opportunity yet to attend, I used the broadest term I could conjure while maintaining applicability.