I'm pretty new here and would like to learn more about Roko's Basilisk and the drama that happened here. Could someone of you explain this?


There is a tag for that, although not all related articles are tagged. There is an article on RationalWiki that made the whole thing popular, and there is also a mention on Wikipedia, because the RationalWiki admin also happens to be a Wikipedia admin.

Long story short, a Less Wrong user called Roko once wrote a comment containing a thought experiment about an artificial intelligence that would incentivize its own creation by torturing everyone who did not contribute to creating it. (So once people started building this monstrosity, it would be a Prisoner's Dilemma kind of situation where everyone has a selfish motive to contribute to the project, even if they wish it had never existed.) Some people were triggered, Eliezer Yudkowsky deleted the comment, and of course people then talked about it more; then Eliezer nuked the entire discussion, saying that publicly brainstorming about machines that would torture people is a really stupid idea regardless of whether the plan is feasible, and banned further discussion of similar topics, just in case.

Then our friends at RationalWiki noticed this, and rationally concluded that because this comment was posted on Less Wrong, it must represent what all Less Wrong users actually believe, and because Eliezer deleted it, it must be Eliezer's darkest secret. And therefore, it was their civic duty to document it for posterity, make sure everyone knows about it, and make sure everyone knows this is what Less Wrong is actually about -- especially Wikipedia.

Because RationalWiki ranked quite high in Google results at the time, everyone who googled Less Wrong found this, and every journalist who wrote an article mentioning Less Wrong or the rationalist community made sure to mention it too. Some people call this the Streisand effect; some call it citogenesis.

Since then, about once a year someone comes and asks people to explain what Roko's Basilisk is all about. In 2022, it happens to be you. Congratulations! As you can clearly see, we talk about this topic all day, every day, because we care deeply. But we also never talk about it, because it is our most deeply held secret. Sorry if this is confusing.

EDIT: If you click on the tag, and then click "Read More", there is actually a long explanation of the idea.

EDIT2: I just learned that there is a subreddit where someone new asks about Roko's Basilisk about once a week. (And XiXiDu is its moderator -- why am I not surprised?) Seems like there is a huge basilisk fan community out there.

And what are/were the problems with RationalWiki? I am also pretty new here and was aware of the Basilisk controversy, but I don't know about that seemingly related problem... Probably I read about RationalWiki when reading about the Basilisk and just forgot, because I don't know what it is. If it can be summarised briefly, please do, as I am now quite curious. If not, please don't bother -- it is not worth the effort!

Long story short, RationalWiki tries to promote rationality (more precisely, mainstream science) and oppose pseudoscience and religious fanaticism, which is nice. But they are also politically woke left, and it clouds their judgment -- they sometimes treat "left-wing" and "science" as synonyms, and "politically incorrect" and "pseudoscience" as synonyms too. (It makes sense historically; they started as a counterpart to Conservapedia.) For example, among their 10 longest pages, 2 are currently about "gamergate", which is completely unrelated to science or pseudoscience; it's just a thing people like to have a political opinion about. So, kinda, they started as "against pseudoscience" but ended up as "against anything we don't like".

RationalWiki only has a few editors, and some of them have too much free time, so if one of them has a personal grudge against something, that kinda becomes the official position of RationalWiki. The specific editor who has a problem with Less Wrong and the rationalist community in general is David Gerard; he is an admin at both RationalWiki and Wikipedia, and he often abuses his position: he writes something negative about someone on RW, and when a journalist quotes him (a few years ago, RW was famous and often quoted by journalists), he then quotes that on Wikipedia (so he is effectively quoting his own words, but they appear to come from an independent source). After a long time, it seems some other Wikipedia admins finally started getting tired of this, and he was recently banned [https://en.wikipedia.org/wiki/Wikipedia:Administrators%27_noticeboard/IncidentArchive1061#David_Gerard_and_Scott_Siskind] from attacking Scott Alexander (the author of Astral Codex Ten) on Wikipedia.
Long story short, lesswrong tries to promote rationality (more precisely, mainstream science), and oppose pseudoscience and religious fanaticism, which is nice. But they are also politically libertarian, and it clouds their judgment.
Well, Eliezer is. I couldn't find a more recent survey, but here [https://www.lesswrong.com/posts/E6He9bTPnafc2N9HD/2016-lesswrong-diaspora-survey-analysis-part-four-politics] is a 2016 analysis of the politics of the LW community, and if we can trust the answers, there are 10× as many Democrats as Libertarians among the American members. And yes, it would be nice if RationalWiki made it clear that this is their true objection. ;)
And is "rationalwiki is woke" a hard empirical fact... or a "vibe"?
Not sure what would be the most convincing evidence in this case. As far as I know, they do not have a page saying "yes, we are woke", or anything like that. But if you read the content, there are many articles that are purely about politics, unrelated to the "science - pseudoscience" dimension. And I don't mean articles like "Donald Trump", because those are relevant: as a president he could increase or decrease government spending on science, or promote some pseudoscientific opinion on TV, etc. But why is it necessary to have a page on, e.g., "men's movement" (keywords: "non-existent problem", "ridiculously absurd", "neo-reactionary", "overtly conspiratorial view", "reactionary" -- and that was just the short summary at the top of the article) or "gamergate"? How is this, like, related to science?

The level of charity is exactly what you would expect from "a snarky point of view", i.e. don't let reality stand in the way of a good jab at the enemy. If you characterize men's rights advocacy as, e.g., "bros before hoes", you make it clear that their arguments are going to get a fair treatment, right? Ok, let's look at the specific ideas: MRAs complain about the draft, but this "borders on red herring" because almost no one is doing it these days. (Meanwhile, in the actual universe, quite a lot of 18-year-old kids are being drafted on both the Russian and Ukrainian sides these days; you can watch them dying on Reddit.) According to RationalWiki, even the argument that men were conscripted during the two World Wars is not valid, because... wait, I am not making this up... disabled men were exempt. (Checkmate, misogynist!)

With gamergate, I will skip the object-level claims for the sake of brevity, and focus on the process. If you look at the Wikipedia article on gamergate, it clearly depicts the whole affair as utterly negative and without any merit whatsoever. Yet there was one editor, I think his name was Ryulong or something like that, who was kicked out from Wikipedia...
So, a vibe. If you can assess them as woke based on a vibe, they can assess you as libertarian based on a vibe. If the sole purpose of lesswrong is to prevent an AI apocalypse, why are there so many articles about government regulation (bad) and stock trading (good)? So they are not failing at what they say they are doing... they are failing at what you think they should be doing.
Thanks for the answer!
If you ask that question here, you will get an answer that assumes rationality is being done entirely correctly here, and is therefore being done wrongly at rationalwiki, inasmuch as rationalwiki does anything differently. Of course, rationalwiki has its own objections to lesswrong [https://rationalwiki.org/wiki/LessWrong] and Yudkowsky [https://rationalwiki.org/wiki/Eliezer_Yudkowsky]. It may be possible to settle the issue, but first people would need to get out of the mindset that says "my tribe is right because it's my tribe".
Thanks for the answer. I went and quickly read the two posts you linked, and... they are informative, and I do agree with some of the critiques, but they are far from objective. I don't like the tone and style -- they would be alright for a personal blog, but not for a wiki, and much less for a rational wiki. It felt really weird and very untrustworthy.
It is. Rationality is their flag, not their method.

This community has a virtue of taking weird ideas seriously. Roko came up with a weird idea which, the more seriously you took it, the more horrifying it became. This was deemed an info hazard, and censored in some way, I don't know how. But the people who didn't take it seriously in the first place weren't horrified by the idea and thus were confused about why it should have been censored, and thus boosted the Streisand effect.

the more seriously you took it, the more horrifying it became. 

Eh. Up to a point. And then if you take it more seriously than that, it becomes less horrifying again.

Arguments for why it's scary are the decision-theory equivalent of someone describing how scary knives are, and how to make your own sharp knives, but never mentioning any knife safety tips.

"Sharp knives," in this metaphor, is the recognition that other people might try to manipulate us, and the decision theory of why they'd do it and how it would work. "Knife safety" is our own ability to...

I don't know how. 

If you don't know, why try to answer?

In general, your post is pretty misleading. It was not censored because the idea itself horrified people. 

Either the idea was wrong, in which case preventing people from reading a wrong idea is net beneficial because it steers them toward better ideas; or the idea was right, which suggests it's dangerous. EY censored it because he believed that in neither case would it be valuable to have the post on LessWrong, and maybe out of a general precautionary principle. You don't need to be horrified by things to use the precautionary principle.

I recall that among the reasons given was that it had triggered severe reactions in some members of the community. "Horrified" would be a mild way to describe the reaction that was claimed at the time. There was discussion about whether the idea itself could be self-fulfilling and thus inherently dangerous (in addition to the pain it caused some people just thinking about it), but that didn't last long. I don't think simply being wrong would have been enough to justify censoring it.
The precautionary principle mattered. Jessica wrote a post [https://www.lesswrong.com/posts/pQGFeKvjydztpgnsY/occupational-infohazards] about how the people at MIRI at the time thought about keeping potentially dangerous information secret. It was pretty extreme, and their insistence on keeping secret information that Jessica would have considered fairly trivial drove her into schizophrenia. It was not an intellectual atmosphere of treating ideas as dangerous because they horrify people. MIRI had complex ideas about secrecy that they took very seriously, and if you ignore those and instead treat the motivation as "people did something because they were horrified", you project decision heuristics onto EY that he didn't use.
Fair enough. I'm not part of the Bay Area rationalist community, and I suspect there was a lot of stuff going on that didn't appear in public posts or discussion on the topic. People (including Eliezer and others) are complicated, and there are both private and public reasons for actions, as well as reasons that aren't easily understood, even by the actors themselves. BTW, none of this explains why lincolnquirk's comment was strong-downvoted. Even if it's incorrect (though to me it seems more incomplete than wrong), it's not harmful or wasteful.

I don't think that someone who by their own admission doesn't think that they have a good understanding should offer up their explanation on a rumor in a case like this.