Pictured Left: Enter launch codes to destroy the EA Forum
Pictured Right: Enter launch codes to destroy LessWrong

Petrov Day

Today we celebrate not destroying the world. We do so today because 38 years ago, Stanislav Petrov made a decision that averted tremendous calamity. It's possible that an all-out nuclear exchange between the US and USSR would not have actually destroyed the world, but there are few things with an equal chance of doing so.

As a lieutenant colonel in the Soviet Air Defence Forces, Petrov manned the system built to detect whether the US had launched nuclear weapons at the Soviet Union. On September 26th, 1983, the system reported five incoming missiles. Petrov's job was to report this as an attack to his superiors, who would launch a retaliatory nuclear strike. But instead, contrary to the evidence the systems were giving him, he called it in as a false alarm, for he did not wish to instigate nuclear armageddon.

For more information, see the 1983 Soviet nuclear false alarm incident.

Petrov is not alone in having made decisions that averted destruction–presidents, generals, commanders of nuclear submarines, and similar also made brave and fortunate calls–but Petrov's story is salient, so today we celebrate him and all those who chose equally well. 

As the world progresses, likely many more people will face decisions like Petrov's. Let's celebrate in advance that they'll make good decisions! And if we expect to face decisions ourselves, let us resolve to decide wisely!

Mutually Assured Destruction (??)

The Petrov Day tradition is to celebrate Petrov's decisions and also to practice not destroying things, even when it's tempting.

In both 2019 and 2020, LessWrong placed a large red button on the frontpage and distributed "launch codes" to a few hundred "trustworthy" people. A launch would bring down the frontpage for the duration of Petrov Day, denying hundreds to thousands of people access to LessWrong. In 2019, all was fine. In 2020...let's just say some bad decisions were made.

Yet having a button on your own page that brings down your own site doesn't make much sense! Why would you have nukes pointed at yourself? It's also not very analogous to the cold war nuclear scenario between major world powers.

For those reasons, in 2021, LessWrong is teaming up with the Effective Altruism Forum to play a little game of mutual destruction. Two buttons, two sets of codes, and two sets of hopefully trustworthy users.

If LessWrong has chosen its launch code recipients poorly, the EA Forum will go down, and vice versa. Either site going down means hundreds to thousands of people being denied access to important resources: the destruction of significant real value. What's more, it will damage trust between the two sites ("I guess your most trusted users couldn't be trusted not to take down our site") and within each site ("I guess the admins couldn't find a hundred people who could be trusted").

For exact rules of the game, see the final section below.

Last year it emerged that there was ambiguity about how serious the Petrov Day exercise was. I'll be as clear as I can via text: there is real value on the line here, and this is a real trust-building exercise that is not undertaken lightly by either LessWrong or the EA Forum. Both sites have chosen recipients who we believe we can trust not to destroy each other's communal resources.

How Do I Celebrate?

If you were one of the two hundred people to receive launch codes for LessWrong or the EA Forum, celebrate by doing nothing!

Other ways of celebrating:

  • You can discuss Petrov Day and threats to humanity with your friends.
  • You can hold a quiet, dignified ceremony with candles and the beautiful booklets created by Jim Babcock.
  • And you can also play on hard mode: "During said ceremony, unveil a large red button. If anybody presses the button, the ceremony is over. Go home. Do not speak."
    • This has been a common practice at Petrov Day celebrations in Oxford, Berkeley, New York, and other rationalist communities. It is often done with pairs of celebrations, each of whose red buttons (when pressed) brings an end to the partner celebration.

Rules of the Exercise

The following email was sent last night to 100 users from LessWrong and 100 from the EA Forum:

Dear {{Username}},

I am inviting you to participate in an exercise to determine whether LessWrong can find 100 site members that it can trust under genuine stakes. 

I would describe the relationship between LessWrong and the EA Forum as vastly better than the historical relationship between the US and the former Soviet Union. Our two sites cater to slightly different audiences with slightly different content, but ultimately both have shared values: an interest in understanding the world and improving it.

This year on Petrov Day, to prove the goodwill and trust between us, each site is sending out “Nuclear Launch Codes” to 100 members. You have been selected. 

If your launch codes are entered into the launch console on LessWrong, they will cause the EA Forum homepage to go down for the duration of Petrov Day.  For the rest of the day, thousands of people will have a hard time using the site, some posts and comments will go unwritten, and I’ll have failed in my mission to find 100 people I could trust not to take down our friendly compatriots.

If a code is entered into the launch console, both the owner of the code and the LessWrong account used to submit it will be publicly identified.

LessWrong and the EA Forum both have second-strike capability that will last one hour. In the event that missiles are launched at the LessWrong homepage, we ask that you very carefully consider whether or not it is correct to retaliate (but slips happen after all).

I hope you’ll help us all keep the EA Forum safe, and that they’ll do the same for us.

Your personalized launch code: {{Code}}

Thank you,

PS: The Petrov Day launch console will become visible on September 26th at 8:00 AM Pacific Daylight Time and will remain visible (assuming the site has not been attacked) for 24 hours.  The full Petrov Day announcement is here.

To all, I wish you a safe and stable Petrov Day.


Here is the mirror of this post on the EA Forum. You may wish to view it for the discussion there.


98 comments

I deliberately refrained from mentioning this in public yesterday out of respect for the spirit of the game, but I was disappointed that the SHA-256 hashes of the launch codes were publicly visible in the source code (such that someone could try to crack them offline) rather than being stored in the database (such that all guesses have to go online through the server).
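The offline-cracking concern can be illustrated with a toy sketch. The code below uses a hypothetical 4-character lowercase code purely for demonstration; the real codes' entropy makes this attack infeasible, but it shows why a leaked hash lets an attacker grind guesses without ever touching the server:

```python
import hashlib
import itertools
import string

# Suppose the SHA-256 hash of a (deliberately tiny, hypothetical)
# 4-character lowercase code leaked in client-side source.
leaked_hash = hashlib.sha256(b"code").hexdigest()

def crack(leaked, length=4):
    """Brute-force the preimage entirely offline, at full local speed.

    No server round-trips, so no rate limiting or logging can intervene.
    """
    for candidate in itertools.product(string.ascii_lowercase, repeat=length):
        guess = "".join(candidate)
        if hashlib.sha256(guess.encode()).hexdigest() == leaked:
            return guess
    return None  # not found in this tiny search space

# 26^4 = 456,976 candidates: well under a second on any modern machine.
```

Storing the hash server-side forces every guess through an online check instead, which is the distinction the comment is drawing.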

Okay, yes, I understand that this wasn't a "serious" vulnerability because the launch codes were sufficiently high-entropy (actually random, not chosen by a human) that no one is going to be able to crack the hash. (I didn't make the list this year, but my launch code from last year was 16 alphanumeric characters, which is log2(62^16) ≈ 95 bits of entropy, which I think works out to an expectation of about 7 billion years at 100 billion hashes per second? Oh, except knock off a couple orders of magnitude, because there were a hundred codes.)
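The back-of-the-envelope numbers above can be checked in a few lines (assuming a 62-character alphanumeric alphabet and the 100-billion-hashes-per-second rate the comment uses):

```python
import math

ALPHABET = 26 + 26 + 10   # lowercase + uppercase + digits = 62
LENGTH = 16               # characters per launch code

entropy_bits = LENGTH * math.log2(ALPHABET)   # ~95.3 bits

HASHES_PER_SECOND = 100e9
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Expected brute-force work is half the keyspace.
expected_years_one_code = (2 ** entropy_bits / 2) / HASHES_PER_SECOND / SECONDS_PER_YEAR

# With ~100 codes in play, any hit will do, so divide by 100:
# the "knock off a couple orders of magnitude" in the comment.
expected_years_any_code = expected_years_one_code / 100
```

This lands at roughly 7-8 billion years for a single code, matching the comment's estimate.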

Still, in principle, as a matter of security mindset, putting the hashes in the database lets you depend on one fewer assumption (about how much computing power the adversary has). It seems like the sort of thing to do as part of the spirit of the game, given that this ki... (read more)

I made this decision a while ago, mostly because it was something like an hour faster to get things going this way. I did the Fermi estimate, did some research into SHA-256 to make sure it was sufficiently preimage-resistant, and decided to go ahead. I stand by this decision, and think your comment is a straightforward misapplication of security mindset.

I think going through the motions of making sure the hashes weren't publicly available would have been just virtue signaling, and the real security risks almost certainly live somewhere else in our stack (some out-of-date library that has a security vulnerability, some OS-level issue, some configuration issue in our VPN, a social engineering attack on someone on our team or a past collaborator). There is no point in pursuing a security mindset if you are virtually certain that the thing you would be investing resources into would not be your weakest attack point. I know that LessWrong is not robust against a sophisticated, dedicated attacker. I think it would be dumb of me to look at an insulated part of our stack and harden that to withstand a sophisticated attacker when there are many other attack vectors that are much more fragile.

I sort of disagree. Not necessarily that it was the wrong choice to invest your security resources elsewhere--I think your threat model is approximately correct--but I disagree that it's wrong to invest in that part of your stack.

My argument here is that following best practices is a good principle, and that you can and should make exceptions sometimes, but Zack is right to point it out as a vulnerability. Security best practices exist to help you reduce attack surface without having to be aware of every attack vector. You might look at this instance and rightly think "okay but SHA-256 is very hard to crack with keys this long, and there is more low hanging fruit". But sometimes you're going to make a wrong assumption when evaluating things like this, and best practices help protect you from limitations of your ability to model this. Maybe your SHA-256 implementation has a known vulnerability that you didn't check, maybe your secret generation wasn't actually random, etc. I don't think any of these apply in this case, but I think sometimes you're likely to be mistaken about your assumptions. The question becomes a more general one about when it makes sense to follow best practices ... (read more)

I agree that this is a correct application of security mindset; exposures like these can compound with, for example, someone's automatic search of the 100 most common ways to screw up secure random number generation such as by using the current time as a seed. Deep security is about reducing the amount of thinking you have to do and your exposure to wrong models and stuff you didn't think of.
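The failure mode mentioned above, seeding a PRNG with the current time, can be sketched next to the safe alternative (variable names are illustrative):

```python
import random
import secrets
import time

CHARS = "abcdefghijklmnopqrstuvwxyz0123456789"

# Insecure: an attacker who knows roughly when a code was generated can
# enumerate candidate timestamps as seeds and reproduce the "random" code.
seed = int(time.time())
weak_code = "".join(random.Random(seed).choices(CHARS, k=16))

# The attack in one line: re-derive the code from the guessed seed.
recovered = "".join(random.Random(seed).choices(CHARS, k=16))
assert recovered == weak_code

# Secure: the secrets module draws from the OS CSPRNG; no seed to guess.
strong_code = secrets.token_hex(8)  # 16 hex characters
```

This is the sense in which best practices reduce exposure to models you didn't check: `secrets` removes the seed-guessing attack class entirely.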

Furthermore, it is also not inconceivable to me that an adversary might be able to use the hash itself without cracking it. For example, the SHA-256 hash of some information is commonly used to prove that someone has that information without revealing it. So an adversary, using the hash, could credibly lie that he already possesses a launch code and, in a counterfactual world where no one but this adversary had found out about the client side leaking the hash, use this lie to acquire an actual code with some social engineering.


"Attention Lesswrong! With trickery I have acquired a launch code capable of destroying your site. As proof here is the sha256 hash of it: <hash>.

This is not a trick, I will leave plenty of time for you to check with your EA buddies that the hash is valid before you need to meet my demands.

I demand that a launch code capable of destroying the EA Forum be sent to me by <time>, or I will nuke this site; to this I have precommitted. I won't reveal what I plan to do with the launch code you send me, but by basic game theory your interest is in sending it to me, as your site's destruction is not certain that way.

I can't prove it to you, but IRL I have precommitted to nuking the site if my demands are not met, and also to sending no further messages, to prevent useless debating.

I hope you will make the correct choice!"
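The hash-as-proof scheme this hypothetical attacker exploits is a standard commitment pattern, sketched below (the code string is a placeholder, not a real launch code):

```python
import hashlib

def commit(code):
    """SHA-256 commitment: lets a holder prove knowledge of `code` later
    without revealing it now."""
    return hashlib.sha256(code.encode()).hexdigest()

def verify(revealed, commitment):
    """Check a revealed code against a previously published commitment."""
    return commit(revealed) == commitment

# The legitimate holder publishes the commitment and can later reveal...
real_code = "hypothetical-launch-code"
proof = commit(real_code)
assert verify(real_code, proof)

# ...but an adversary who merely scraped `proof` from client-side source
# can post it as "evidence" of possession without knowing the code at all,
# which is exactly the social-engineering angle described above.
```

The attack works because the commitment only proves that *someone* once knew the code, not that the person posting the hash does.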

I do not know what the difference here is. Presumably one implies the other?
Epistemic note: I'm not Taleuntum. I think the difference is this: today it may be best to invest elsewhere, but not forever. Security against a dedicated adversary is not always and forever impossible; it is the result of continuous investment that is (a) eventually achieved and (b) requires continuing work going forward (bugs are discovered in resources once thought secure and need patching, malware and threats change, etc.). In other words, just because it's not your top priority doesn't mean it shouldn't be improved a little bit now and then; the difference between 0% and 5% of invested effort compounds over time.

going through the motions of making sure the hashes weren't publicly available would have been just virtue signaling

Yes. That's what I meant by "the sort of thing to do as part of the spirit of the game": in an actual nuclear (or AI) application, you'd want to pick the straightforwardly best design, not the design which was "something like an hour faster to get things going this way", right?

So as part of the wargame ritual, maybe you should expect people to leave annoying nitpicky comments in the genre of "Your hashes are visible", even if you don't think there's any real risk?

Does that seem weird? For more context on why I'm thinking this way, I thought last year's phishing attack provided us with a very valuable and educational "red team" service that it'd be fun to see continued in some form. ("Coordinate to not destroy the world" is an appropriate premise for an existential-risk-reduction community ritual, but so is intelligent adversaries making that difficult.) I'm not personally vicious enough to try to get the site nuked, but putting on a white hat and thinking about how it could be done feels on-theme.

your comment is a straightforward misapplication of security mindset

... (read more)
I think your conclusion is reasonable: the investment of effort in security improvements is not justified by the risk of exploitation here. But I want to pull out a tiny part of your post and suggest refining it: "There is no point in pursuing a security mindset if you are virtually certain that the thing you would be investing resources into would not be your weakest attack point."

Different attackers will target different points depending on their capability and what they care about, and which attacker comes after you depends on their motivations. Your weakest point may carry lower real risk than others simply because the type of attackers who would exploit it don't care about you. Organisations regularly invest resources not necessarily into the weakest attack point, but into what their assessment says is the most cost-effective way to reduce overall risk. This plays into defence in depth, where multiple layers of security features can provide better risk reduction, especially where the weakest attack points are expensive or impossible to address.

This may seem like an inconsequential point, as it doesn't change your conclusions, but I do see people focussing on weak attack points without considering whether their money is well spent. To me, a better framing would be: you shouldn't invest resources into measures where there are alternatives that are more effective at reducing risk.
Yep, this seems right to me. In this case, the set of attackers is also quite narrow, since the codes are only relevant for 24 hours, and then only in a somewhat bounded way that's easy to revert.

I had one of the EA Forum's launch codes, but I decided to permanently delete it as an arms-reduction measure. I no longer have access to my launch code, though I admit that I cannot convincingly demonstrate this.

Attention LessWrong - I do not have any sort of power as I do not have a code. I also do not know anybody who has the code.

I would like to say, though, that I had a very good apple pie last night.

That’s about it. Have a great Petrov day :)

I am opposed to the implementation of this exercise, I believe its basic concept seriously undercuts the moral lesson we should take from Petrov Day.

The best way to not blow ourselves up is to not make nuclear weapons. On a day dedicated to not blowing ourselves up, LW has decided to manufacture a bunch of completely unneeded nuclear weapons, hand them out to many people, and then hope really hard that no one uses them. This is like a recovering addict carrying drugs on his person in order to make a point about resisting temptation: he is at best bragging and at worst courting disaster so boldly that one should wonder if he really wants to avoid self-destruction. This makes a good allegory for the senseless near-apocalypse of the Cold War, but deliberately creating a senseless risk does not seem like an appropriate way of celebrating the time we narrowly avoided triggering a senseless risk.

My perspective is that the ritual has more than one dimension: I claim that this is low-risk training for future events, rather than only a celebration of a past event. Many senseless risks remain (including nuclear weapons), and we have no control whatsoever over whether they persist. Petrov's story is a founding story because rationality, as we use the term, means making good decisions even when we did not create the risks, even when the situation is fundamentally stupid.

If we never even attempt to simulate these situations, then I believe we're not giving the problem its due.

I think to the extent that the Petrov Day game is training anything, it's training the opposite of what we should want. In the game, all the social pressure is unanimously and strongly opposed to pressing the button (sometimes to the extent of ostracizing people and threatening their careers). But in real life, if everyone were unanimously opposed to pressing the button, the button would never have been constructed in the first place. The real Petrov was not rewarded for his actions but demoted and sidelined. In the real situation that's supposedly being trained for, the social pressure will be ambivalent at best, but more likely telling you that you should press the button, and only your own moral compass and fear of death would be telling you not to.

Svyatoslav Usachev:
A thousand times this! I haven't seen anyone pointing out what's wrong with this ritual more clearly. Exactly, we turn the celebration of individual courage into a celebration of unity/conformity, what an irony.

So it looks like we survived? (Yay)

Today I learned that loss aversion is very weird.

I had a code for this last year. I thought this game was cool, and I faithfully refrained from using it. Then today I saw this post, checked my email, and when I didn't find any codes I noticed that I felt sad. I'm not deeply sad, to be clear, but I'm definitely a little disappointed that I don't have a string of numbers with exactly one use which I have next to zero intention of ever exercising, that only two hundred people get, and which would be useless after a day. On reflection, this disappointment is obviously silly, and in particular the way this initially parsed as a thing being taken away from me is very silly.

So yeah, that's how my day is going so far. To those who got codes, thank you for not temporarily destroying a small part of the internet!

Attention LessWrong - I am a chosen user of EA Forum and I have the codes needed to destroy LessWrong. I hereby make a no first use pledge and I will not enter my codes for any reason, even if asked to do so. I also hereby pledge to second strike - if the EA Forum is taken down, I will retaliate.

Regarding your second strike pledge: it would of course be wildly disingenuous to remember Petrov's action, which was not jumping to retaliation, by doing the opposite and jumping to retaliation.

I believe you know this, and would guess that if in fact one of the sites went down, you'd do nothing, and instead later post about your moral choice not to retaliate.

(I'd also guess, if you choose to respond to this comment, it'd be to reiterate the pledge to retaliate, as you've done elsewhere. This does make sense--threats must be unequivocal to be believed, even if follow through is illogical.)

Furthermore, the way for someone to test your intention to follow through or not... is to push the red button. By posting this here and on the EA Forum, you may have actually increased the motivation to push the button, so that the pusher can see what you do in response.

Neel Nanda:
Mutual Assured Destruction just isn't the same when you can see for sure whether you were nuked
Peter Wildeford:
I can see whether the site is down or not. Seems pretty clear.
Forged Invariant:
Just be aware that other users have already noticed messages which could be deliberate false alarms: https://www.lesswrong.com/posts/EW8yZYcu3Kff2qShS/petrov-day-2021-mutually-assured-destruction?commentId=JbsutYRotfPDLNskK
Peter Wildeford:
I will be on the lookout for false alarms.
I don't think you'll be able to retaliate if the site is down.
In the message sent to holders of launch codes that's repeated in this post, it says:
The site will remain up for one hour with a message that a missile is incoming (based on what I described here: https://www.lesswrong.com/posts/EW8yZYcu3Kff2qShS/petrov-day-2021-mutually-assured-destruction?commentId=JbsutYRotfPDLNskK), and that message could be a false alarm.
Hmm, actually, it's not clear to me whether the site will go down immediately (with the button intact) or after an hour.
[comment deleted]

For what it's worth, this game and the past reactions to losing it have burnt the last of my willingness to identify as a LW rationalist. Calling a website going down for a bit "destruction of real value" is technically true, but connotationally just so over the top. A website going down is just not that big a deal. I'm sorry, but it's not. Go outside or something. It will make you feel good, I promise.

Then getting upset at other people when they don't take a strange ritual as seriously as you do? As you've decided to, seemingly arbitrarily? When you've deliberately given them the means to upset you? It's tantamount to emotional blackmail. It's just obnoxious and strange behaviour.

As a trust building exercise, this reduces my confidence in the average lesswronger's ability to have perspective about how important things are, and to be responsible for their own emotional wellbeing.

I understand why this was downvoted and I think it is harsh, but I also think it might be good if people take the sentiment seriously rather than bury+ignore it.

If I received a code, I would do nothing, because it's clear by now that pressing the button would seriously upset some people. (And the consequences seem potentially more significant this year than last.) And I think the parent commenter undervalues the efforts the pro-taking-it-seriously people made to keep their emotions in check and explain why they take the ritual seriously and would like others to do so too.

But I share the instinctive reaction that the whole thing is a bit overblown and pompous, and even on reflection I think it's at least reasonable to hold that it was obnoxious to throw unconsenting people into a situation that looked like a game, where the stakes appeared (and IMO were) very low, only to reveal after the fact that playing the game -- by taking an action explicitly enabled by the people who run and probably care most about the site -- had apparently caused non-trivial distress to others and significant reputational harm to the player.

You make good points. I, for one, strong-downvoted OP because “emotional blackmail” seems not at all accurate, and the criticism itself was shaded “go outside, nerd”, when I would have been more interested in OP’s actual arguments.

Emotional blackmail would be if Ruby emailed me and said “TurnTrout, unless you participate in this ritual, I will be upset at you.” In this situation, if I do nothing, nothing happens to me, whereas Ruby may feel differently about me if I choose to participate in the game by entering launch codes.

It’s like if I built a sand castle, put some light explosives inside, and handed 100 people detonators. If someone blows it up, I could be mad at them. Sure, that might be foreseeable, and probably “my fault” in a sense.

But it seems unnatural to describe this kind of situation as "tantamount to emotional blackmail."

I agree that "emotional blackmail" is inaccurate, but this exercise is pulling readers' emotional strings in a bad way. The label was wrong, but the overall criticism has merit. Would relabeling it "gratuitous drama" be a good steelmanning?
"Gratuitous drama" sounds more plausible and appropriate, sure. "Is"? But to me it just feels like an interesting yearly event, with some real thought put into it. I certainly appreciate it. If you claim it's "pulling strings", I think that you should explain why, or link to an explanation, or at least acknowledge that you don't have time to explain why you feel that way. If not, these simple "is" statements work to establish (the perception of) social agreement around the "fact" that "this exercise is pulling reader's emotional strings in a bad way", without that point actually having been established.
"Pulling strings" by exaggerating the importance of the stakes, by forcing some members to participate in a game where there is nothing to win personnally and a lot to lose (maybe not this year, but I remember previous year's organisers suggesting to ban the culprit from some rationalist circles) and having all readership witness the totally artificially created drama. To me too, but my 'interesting' would be something like "I'm glad it exists even if it's flawed". The most important problem for me is that in its current shape it does not allow to draw useful conclusions from the outcome (thanks https://www.lesswrong.com/posts/EW8yZYcu3Kff2qShS/?commentId=C97ngHSu6iHmdCjPc [https://www.lesswrong.com/posts/EW8yZYcu3Kff2qShS/?commentId=C97ngHSu6iHmdCjPc] for clarifying that point for me)

I don't want to have a ton of meta discussion on the day of the experiment, but I am pretty interested in ideas from people on how to reduce the bad parts of the social ritual. I think the benefits of doing a thing like this are pretty high, and I am pretty excited about the benefits of the trust exercise, but also don't want to needlessly distress people. So if people have any ideas on wording or additional text we could add to the announcements or emails, I think that would be a productive use of time.

The obvious thing is to ask people to consent before entering the game? It's weird to get an email, out of the blue, with launch codes, telling you that you are now part of this game. While an email that spells out some of the explicit norms, and asks people to opt-in, seems great.

A light-touch intervention could just be giving people a link to click to get the launch codes, that shows some text spelling out norms like this, and ask people to only click the link if they actually want to participate.

EDIT: To be clear, I am participating in this, and would have opted-in - I just think it's a really bad norm to not ask for consent first, when we're putting people in a situation with real risks and social consequences, and with wildly differing perceptions of the depth of meaning in this event.

Well, I agree that in general you should ask consent before pulling people into any game, but I suppose that part of the purpose was precisely to see how people react to random responsibilities (which can definitely happen in real life). I mean, the Soviets probably didn't bother to get Petrov's consent before putting him in the control room. And all that's required in our case is basically "please do nothing"... I didn't receive the email, but I don't think I would be upset by one message just asking to be ignored (an email asking me to actively do something to prevent destruction would have been a different kettle of fish).

Full disclosure: I clicked the button. Actually, I misclicked the button while hovering over it. I suppose that's the reason why GitHub and similar services are very careful to hide the "delete repository" button behind long page scrolls and an additional "are you absolutely sure?" popup. For the next Petrov Day, I think we should at least add the blocking popup instead of just having an "Are you sure?" title over the button. Being tricked into pushing the button is one thing, but it should not be possible to push the button purely by accident.
Neel Nanda:
The problem is that people are entered into a situation where they don't necessarily understand the context and cultural expectations other people may have, could very reasonably misunderstand things, but are exposed to real and meaningful social risks if they do misunderstand. Framings like "sometimes you get random responsibilities" only make sense given a mutual understanding that the situation is taken seriously, which empirically was obviously not universal here.
I agree, but the people who actually received the codes are supposed to be carefully selected LW users, not totally random people. I would be quite impressed to learn that someone among those 100 users didn't actually understand the context (on the other hand, I do expect random LW users who didn't get codes to press the Red Button for the lulz without necessarily knowing the context, and I agree they shouldn't be blamed for this). That said, adding more things clarifying the context is probably good. Petrov himself surely didn't have the context problem.

I was one of the 270 last year and am one of the 100 this year; I did not understand the context last year. Empirically, neither did Chris last year. Multiple people on the EA Forum have commented about not understanding the context.

Ok, then I publicly declare to be quite impressed. (I'll treat this as further evidence that inferential distances tend to be longer than expected)
Neel Nanda:
If it helps, here's a comment I wrote last year trying to narrate my internal experience of reading the email: https://www.lesswrong.com/posts/K7jrkyKArvxJ224GD/on-destroying-the-world?commentId=3fj4xxM2B324WRcqQ (I then read the 2019 threads and eventually twigged how seriously people took it, but that was strongly not my prior; it wouldn't even have occurred to me to ask the question "do people take this more seriously than a game?")

I don't know if this would defeat part of the purpose, but what about making it opt-in over a long time period, e.g. giving people all year to put themselves on the list of people who might be chosen to receive codes? 

Other than that, I think it's mostly a question of (to the extent possible without undermining what you're trying to do) making it pretty clear to the recipients that people take this seriously and would genuinely like them to refrain from using the codes. As far as I can tell, that has already improved from last year. (It seems like there might have been some tonal ambiguity last year, with phrasing intended to be heightened but mostly serious coming across to some readers as playful and mostly joking.)

One way to make it seem more serious (to me) would be to make the effects bigger. E.g. taking down the frontpage (or the whole site?) for a whole week rather than just a day.
I wonder if you are anchoring at the wrong point of comparison here. The point is that it is technically true, as distinct from a button whose only function is to disable the button. Your post reads like you worry that we are all comparing this to actual nuclear destruction, which I agree would be deeply absurd. In my view, the stakes are being a bit of a dick. The standard is: can we all agree not to be a bit of a dick? It's a goofy sort of game, but we have it because of its similarity to the nuclear case: the winning move is not to play (https://www.youtube.com/watch?v=NHWjlCaIrQo).
Matt Goldenberg · 1y · 5 points
It took me a while to grasp how people in the rationalist community see LW, but after grokking it I understand the exercise better.
This is an interesting comment! There are a number of things that could be said in response to this, but perhaps the best place to start is with this part: I would like to register that this description, as written, could equally be applied to any norm or set of norms, including such basic ones as making and keeping promises (!!).

Now, perhaps your intent is to imply that e.g. the act of making and following through on commitments (and expecting others to do likewise) is a "strange ritual" by which humans "seeming arbitrarily" decide to "give [others] the means to upset [them]"; such an interpretation would at the very least be consistent with your rhetoric and tone. But if this is your position, I submit that you are in the extreme minority, and that your position requires more (and better!) defending before you are licensed to behave in a way that supposes it as the default.

Conversely, if your intent was not to imply that this (rather enormous) class of universal human practices is "obnoxious and strange behavior", then perhaps it would behoove you to explain the seeming inconsistency between that and what you wrote. If there is more nuance to your position than I am perceiving, I would love to know about it!

--------------------------------------------------------------------------------

Unfortunately, however, in this case I suspect that things are in fact as they first appear—that your comment constitutes little more than a naked attempt at a put-down, and that there is no further nuance to be found. This impression is strengthened by lines such as the following which attempt to convey ingroup membership while simultaneously signaling disdain and disappointment (but which are unfortunately undercut by the fact that the second-most recent comment on your account is upwards of 4 years old).

You're right, I haven't been active in a long time. I'm mostly a lurker on this site. That's always been partly the case, but as I mentioned, it was the last of my willingness to identify as a LWer that was burnt, not the entire thing. I was already hanging by a thread.

My last comment was a while ago, but my first comment is from 8 years ago. I've been a Lesswronger for a long time. HPMOR and the sequences were pretty profound influences on my development. I bought the sequence compendiums. I still go to local LW meetups regularly, because I have a lot of friends there.

So, you can dismiss me as some random who has just come here to hate if you want to, I guess, but I don't think that makes much sense. The fact that I was a bit obnoxious with my criticism definitely makes that tempting, though. You can tell I'm here in bad faith from all the downvotes, right?

I think the audience seeing this comment is heavily self-selected to care about the Petrov Day celebration and think it's good and important. The core LWers present risk severely underestimating how off-putting this stuff is. How many people would be interested in participating in this community, constructively, if the vibes were a little less weird? These people, unlike me, mostly don't care enough to rock up and criticize.

The reason I was rude is that I'm frustrated at feeling like I have to abandon my identification as an LW rat, because I just don't want to be associated with it anymore. I got so much value from LessWrong, and it feels so unnecessary.

Thank you for clarifying. I think your stance is a reasonable one, and (although I maintain that your initial comment was a poor vehicle for conveying them) I am largely sympathetic to your frustrations. Knowing that your initial comment came from a place of frustration also helps to recontextualize it, which in turn helps to look past some of the rougher wording.

Having said that: while I can't claim to speak for the mods or the admins of LW, or what they want to accomplish with the site and larger community surrounding it, I think that I personally would like to offer some further pushback. In particular, I think that there is a tension between what you term "making the vibes a little less weird", and something that I might term "being able to visibly, publicly care about things most people haven't thought about".

There is an argument, perhaps, to be had about whether the Petrov Day game is something "worth" caring about, even for a group of people with a history of caring about strange things. As I wrote in a separate comment response, I don't necessarily have a strong opinion about this. I am less involved in the "rationalist community" than many members of this site; I have not ... (read more)

I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this.

In my opinion Chris Leong showed incredible patience in writing a thoughtful post in the face of people being upset at him for doing the wrong thing in a game he didn't ask to be involved in. If I'd been in his position I would have told the people who were upset at me that this was their own problem and they could quite frankly fuck off.

Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.

I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this. [...]

Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.

Cool. So, on the object level, there is a discussion to be had about this... but I want to point out the extent to which, if this was your concern, your initial comment entirely failed to convey it. Not to put too fine a point on it, but there is a stark difference between what you wrote here, and what you wrote here:

Calling a website going down for a bit "destruction of real value" is technically true, but connotationally just so over the top. A website going down is just not that big a deal. I'm sorry, but it's not. Go outside or something. It will make you feel good, I promise. [...]

As a trust building exercise, this reduces my confidence in the average lesswronger's ability to have perspective about how important things are, and to be respon

... (read more)

On to the object level:

I want to be clear that it's not having rituals and taking them seriously that I object to. It's sending the keys to people who may or may not care about that ritual, and then castigating them for not playing by rules that you've assigned them. They didn't ask for this. [...]

Nobody has any right to involve other people in a game like that without consulting them, given the emotional investment in this that people seem to have.

(with the disclaimer that—again—I am not strongly invested in the Petrov Day game as practiced, nor do I have a strong opinion on whether the mods are doing it right)

I think it is an entirely reasonable thing to do, if you are attempting to establish a high-trust community, to assume a certain level of "buy-in" among core members of said community. I think one of the things that having a high-trust community gives you, is precisely the ability to coordinate actions and activities in ways more subtle and less legible than "opt-in only" (and to be clear, I view this as a positive externality; I would like more communities to have this ability!). I think, to the extent that a community is not yet at the level where [it is common know... (read more)

Sure, I don't disagree.
Life is like that. You will be tested on things that you never prepared for and could never foresee, things that you must handle even if you can't. The tests will come without warning. There is no-one to complain to that it is not fair. There are no retakes. And everyone fails in the end. The Petrov Day button is a doddle in comparison.

This is just such a bizarre tack to take. You can go down the "toughen up" route if you want to, but it's then not looking good for the people who have strong emotional reactions to people not playing along with their little game. I'm really not sure what point you're trying to make here. It seems like this is a fully general argument for treating people however the hell you want. After all, it's not worse than the vagaries of life, right? Is this really the argument you're going with, that if something is a good simulation of life, we should just unilaterally inflict it on people?

The Petrov Day event is a trivial to nonexistent burden to place on those who received the launch code. They were told the background and the launch code and told what it would do if they used it. They were not even asked to do or not do anything in particular. Similar events have been run in the past, and those selected are likely to have been around long enough to have seen at least the last such event.

The obvious way to not participate is to ignore the whole matter.

I don't think there is any violation of consent here.

I think it's reasonable to take the position that there's no violation of consent, but it's unreasonable to then socially censure someone for participating in the wrong way.

I agree that life is like that. However, the game still violates consent, the same way as if I assaulted you on the street because I think it's good preparation for being assaulted "for real".

To me, this game falls in the same category as gift giving, surprise parties, pranks, rude/aggressive jokes etc. There needs to be a meta-level agreement that this kind of thing is ok, even though "being a surprise" is an essential part of the thing itself.

my willingness to identify as a LWer that was burnt [...] HPMOR and the sequences were pretty profound influences on my development [...] frustrated at feeling like I have to abandon my identification as an LW rat

I've struggled a lot with this, too. The thing I keep trying to remember is that identification with the social group "shouldn't matter": you can still cherish the knowledge you gained from the group's core texts, without having to be loyal to the group as a collective (as contrasted to your loyalty to individual friends).

I don't think I've been very successful; multiple years after my alienation from "the community", I'm still hanging around leaving bitter comments about how terrible they/we are. It's very dysfunctional! Why can't I just write it off as a loss and move on?

I guess the main difficulty is that, for humans, knowledge actually isn't easily separable from a community of other people with shared vocabulary. I can't just ignore what the central rat social hierarchy is doing, because the central hierarchy nodes exert a lot of control over everyone in the world who I can really talk to.

Ben Pace · 1y · 6 points
What's the value you get from it, and how does this once-a-year event affect the value you get from LW?
What? No, it can't be applied equally. The norm of "Keep your promises" serves the function of making it possible for people to plan around each other's behavior. (When I say, "I'll be there," you can confidently predict that I'll be there.) It's a very general and powerful social technology. The norm of "Take the Petrov Day game on our website very seriously" is a lot more arbitrary because it doesn't serve that general function.

If people had to proactively sign up for the game and sign a sworn statement saying that they promise not to press the button, then someone who subsequently pressed the button could be busted on the grounds of violating the more basic norm of breaking a promise that they proactively and voluntarily made—but that would be a different, and much less interesting, game. In the actual game, the mods unilaterally send out codes to users whom they predict will take the game seriously. If the mods guess wrong about that, many observers would say that's "on them."

I mean, to be fair, the ingroup is a massive disappointment that is genuinely worthy of disdain.
On a purely factual level: note that what I wrote was that the description as written could be applied equally to such things as "keeping promises". Once more, the quoted description:

This description indeed does nothing to distinguish between different norms like "keeping promises" or "taking the Petrov Day game seriously" (or, in fact, other, stupider norms such as "giving yourself electric shocks eight hours a day [https://slatestarcodex.com/2014/07/30/meditations-on-moloch/]"). Every norm involves expecting people to choose to do something they are not [counterfactually] constrained to do, meaning every norm involves "giving someone else the means to upset you". Every norm carries with it the expectation that people will comply with it without explicit enforcement, meaning every norm is a "ritual that needs to be taken seriously". And every norm starts out being "seemingly arbitrary", until and unless it is adopted more generally, at which point it becomes "the way things are".

In other words, the OP's description is a list of conditions that always returns true, regardless of input. And as an argument that does not produce differing outputs given different inputs provides no discriminatory power [https://www.lesswrong.com/posts/6s3xABaXKPdFwA3FS/what-is-evidence], such an argument should be highlighted as bad regardless of whether you happen to agree or disagree with its target in any particular case.

(This much you know, because this much you have argued yourself elsewhere [https://www.lesswrong.com/posts/WwTPSkNwC89g3Afnd/comment-section-from-05-19-2019?commentId=32GPaijsSwX2NSFJi]. If we are to speak of "disappointment", then I am indeed dismayed and disappointed that you in particular appear to be applying significantly less of a critical eye to comments with which you happen to share a premise, given what you've written in the past.)

--------------------------------------------------------------------------------

Regarding the (separate and additio
Okay, I think I'm reading a lot more into Sullyj3's use of the phrase "seemingly arbitrarily" than you are. Very specific things like "taking the Petrov Day game seriously" or "giving yourself electric shocks eight hours a day" are the kind of cognitive content that we expect to come with attached justifications [https://www.lesswrong.com/posts/HacgrDxJx3Xr7uwCR/arbitrary]: depending on the game and the community, I could see either of "Take this game seriously; it's a ritual" or "Don't take this game seriously; it's just a game" being the norm. On encountering a community with such a norm, I would expect to be able to ask why they do things that way and receive a non-circular answer other than an appeal to "the way things are" (here). In contrast, the very concept of a "promise" seems to have an inherent asymmetry to it; I'm not sure what making a "promise" would mean in a world where the norm is "You shouldn't keep promises." (This feels like the enactive [http://benjaminrosshoffman.com/actors-and-scribes-words-and-deeds/#Enactive_language] analogue of the asymmetry between truth and lies [https://www.lesswrong.com/posts/YptSN8riyXJjJ8Qp8/maybe-lying-can-t-exist], where you need a convention grounding the "true" meaning of a signal in order to even contemplate sending the signal "falsely".)
While I agree with both the letter and the sentiment, I'd temper it by adding that this year feels like a step in the right direction compared to last year, by introducing an 'opposing' group to mimic a MAD situation more closely. And I like that this exercise exists just for its uniqueness, and because I agree with the premise that existential risk preparedness is important.

I briefly saw a "Missile Incoming" message with a 60:00 timer (that wasn't updating) on the buttons on the front pages of both LW and the EA Forum, at around 12pm EST, on mobile. Both messages were gone when I refreshed. Was this a bug or were they testing the functionality, testing us or preparing to test us?

Tomás B. · 1y · 7 points
I suspect it was supposed to be a "false alarm".
Neel Nanda · 1y · 3 points
Same happened with me, I thought it was an issue with page loading (I was using a very slow browser, and it took a few seconds to correct)
Since the timer wasn't updating on either site, I assume they weren't testing us (yet).
Same thing happened to me. Might've been a bug with page loading? I've had similar things happen with other sites.
Ruby · 1y · Moderator Comment · 10 points

Typically the mods put out a Petrov Day Retrospective the day after; however, some of this year's organizers are now away at a conference, so we're going to wait a few days until they're back so they can have input before we publish.

Sorry for the delay. Stay tuned.

(And thanks to everyone who participated and/or weighed in on the discussion.)

I'm sad I didn't get a launch code. 


Reasons: power hungry 

An attempted non-mystical justification for Petrov day sensitivity, for those who think it's ridiculous:

If the LW home page were 'nuked', my day would be slightly worse: there would be interesting posts and comments I wouldn't as easily find out about (e.g. this one by Katja about literal and metaphorical fire alarms). So it makes sense for me to feel a bit bummed if it gets taken down. In addition, if someone else takes the page down, I should feel more bummed: not only did this slightly bad thing happen, but I just learned that someone will make my life ... (read more)

As a counter-point, my day was made significantly better by the front page being nuked in 2020 - it was exciting, novel, hilarious (by my lights - clearly not to some people), made some excellent points about phishing and security, and gave me opportunities to dissect why people oriented to this event differently from me. I expect my experience would have been less good last year had the phishing attempt not happened, and we all simply coordinated. More generally, when a website does something unusual and novel like this, I feel like the value of novelty and interestingness can outweigh the costs of a single day of disrupted use?

I'd further argue that the people highly invested in this seem much more invested in the abstract ideas of trust, community, shared ritual and cohesion, more so than the object level of the frontpage being down (besides, people can always use greaterwrong.com )

I note that the feelings you describe are the underlying assumption which makes the risk real: if no one thought the consequences of pushing the button was entertaining or a learning opportunity, then no one would push the button, and the tension goes away.

And you can also play on hard mode: "During said ceremony, unveil a large red button. If anybody presses the button, the ceremony is over. Go home. Do not speak."

   This has been a common practice at Petrov Day celebrations in Oxford, Boston, Berkeley, New York, and in other rationalist communities. It is often done with pairs of celebrations, each whose red button (when pressed) brings an end to the partner celebration.

This is the first time I've heard this idea, and I really like it. I would like to organize something like it locally for my friends next time; hopefully I'll remember.

Grammar: I think there's a few words missing in:

there is real value on the line here and this is a real trust-building exercise that [is not?] undertaken lightly by either LessWrong or the EA Forum.

Good catch, thanks!

I don't have codes. However, if I did have the codes, I would press the button, my objective being to raise the visibility of my post, which follows here.

I think the concept of such an experiment is great, however the exact implementation could have been much better - if the aim of the experiment is to understand anything meaningful about actual nuclear war.

I'd much rather the smart people on this site spend time designing a good experiment such that it would provide clear learnings depending on outcome. Like how good experiments are typically designed. Rather t... (read more)

I have pushed the button.

I do not have launch codes.

I did not attempt to launch nukes.

I pushed the button to see if there was any exploitable way to brute-force nuke codes or otherwise determine a valid code.

I got bored and gave up after seeing the launch code textbox had no ID.

I don't know what you should take away from this. Beware chaos, I guess, and those who think it would be funny to do damage or try to find flaws in a system. To many, taking down a website for a day would be funny. Don't even show buttons to people unless you trust them with the responsibility and know them unreasonably well.

Ok, that's satisfied my curiosity as to what happens if you push the button without codes, and so I am not going to push the button.

Social deduction games

  • with clear final objectives: Mafia, Tank Tactics, Neptune's Pride. These games have clear winning conditions, and thus final objectives for the players. The meta objectives are open ended, which gives the players a more open way to play the game. These games have very few rules and mechanics to limit how the game would be played.
  • with ambiguous final objectives: Petrov Day, Reddit's The Button. These games have no clear winning conditions, and thus the final objectives are open ended. They are the same as above, with few rules and open
... (read more)
Yoav Ravid · 1y · 4 points
For me Petrov Day is obviously in the first category, the objective is to not destroy value. But as another comment said, part of the coordination in Petrov Day is agreeing what the objective is.

I saw the button, moused over it and got the "Are you sure?" popup, then carefully refrained from pushing the button. (I do not have launch codes.) As Batman says after destroying a supposed magical idol that probably does not actually summon demons, some things you don't take chances with.

What is the purpose of showing the red button to those without launch codes?

It means that you don't need to have the other person's login credentials to launch the nukes (I don't want to encourage password theft, and also think that the case of someone sharing just their codes is more interesting than someone sharing full access to their account). It also creates common-knowledge of what is happening on the site, in a pretty clear and obvious way.

Rafael Harth · 1y · 2 points
Do you have statistics on how many people who do not have codes press the button anyway?

Why only 100 members? Shouldn't we want to be able to trust our entire community?  

Why not give codes to everyone or everyone who'd registered before X date with X karma to avoid trolls?

More trustworthy people is good but it doesn't follow that acting as if more people are trustworthy is good.

Personally, I want to be able to trust the community of people I work with. Would it be good to somehow be able to trust the entire community? Yeah, I guess. But unless X karma is very high, I strongly predict that codes would be entered. I do not think we can trust—or should realistically expect to be able to trust—so many people. Many people have posted over the years. Probably, some high karma long-time users are no longer aligned with community goals and would enjoy watching the post-launch drama.
I mean, yes, but don't be too sure that's due to value-drift on the users' part rather than "the community." [http://benjaminrosshoffman.com/construction-beacons/]

I expect the button can be pressed without launching nukes (though I didn't check, won't check, and encourage others not to check - even if admins confirm this to be the case) because launch codes are required. My question is, do you track how many times the button gets clicked without someone entering a code?

In past years we have kept track of who presses the button without entering the codes (or who entered wrong codes), and have published aggregate statistics about them.
I feel like that's part of the point. I'm not going to lie - the thought definitely crossed my mind to press the button and see if anything would happen even without launch codes. There's a sort of... allure about it. But knowing that potentially just pressing the button could bring down the EA Forum? That was enough to discourage me from trying it out. Our stakes are much much smaller, of course, but I still feel some weight of the responsibility.
Yoav Ravid · 1y · 2 points
Yes, it's definitely part of the point. That's why I expected they would track it.

Correction: The annual Petrov Day celebration in Boston has never used the button.


I'm glad one apparently can't take down the site merely by pressing the Big Red Button. I didn't press it, but I have tons of muscle memory on sites like reddit of opening a gazillion links in new tabs, so I wouldn't have been surprised if I'd clicked it before even noticing what it was about. Which is a far cry from intentionally pressing the button, of course, but still.
