LessWrong has been devoting, and plans to keep devoting, a significant amount of attention to the coronavirus. But is that what we should be focused on? Is it more important than things like existential risk reduction and malaria treatments?

5 Answers

habryka

Mar 19, 2020

750

My current sense is yes, though I really don't think it's obvious, and I think this is a pretty high-stakes call.

My sense is that work in a crisis generally has really high leverage, and I think there are reasonable arguments that this is the biggest global crisis since World War 2, at least in terms of how the world will be shaped by it and how much is at stake (you don't see a 30% drop in the stock market that often, and the number of people who will die is quite plausibly more than in WW1). I do indeed think that during World War 2 it would have been reasonable for many people on LessWrong to participate in the war effort, and think the same is true in this case. I do think that on an all-things-considered view this is likely going to be a much smaller deal than World War 2, but I think the basic argument is plausible enough that it seems worth betting quite a bit on.

I also think this topic is a much better fit for LessWrong than usual political and news-related topics, because we are ultimately dealing with a "Player vs. Environment" type of threat, not a "Player vs. Player" type of threat. In cases like this, our tools for epistemic rationality and general scientific inquiry are well positioned to shine, and there is less risk of us getting sucked into an adversarial epistemic environment, because the questions to be asked and answered are primarily about pretty stable ground truths.

The second argument is tractability. I think it's pretty key that people on LessWrong noticed that this was important much earlier than the vast majority of the world, and even the vast majority of the world's intellectual elite. I think this made LessWrong a natural Schelling point of attention, and I don't think it's obvious that a separate Schelling point would emerge if we were to deemphasize coronavirus-related topics on LessWrong. This makes me think that LessWrong has at least some responsibility not to suddenly deemphasize the topic and damage communication around it, at least not without creating a separate hub where discussion can coalesce instead.

The third argument is that, taking a global perspective, I think there is a good case that you should help out in crises like this, even if working on them is not directly related to your goals, because many other players in the world care a lot about it and will be deeply grateful for your assistance. From a perspective of cooperating with other powers in the world, it's good form to help out with this as much as possible, given the overwhelming importance other people place on it.

The fourth argument is just relevance to all of our wellbeing. I don't think we are at the stage where we can just rely on local governments or standard expert hierarchies to give us advice and tell us what to do. Most governments and municipalities have not yet announced safety measures that seem sufficient to me, so it's still up to individuals and small communities like LessWrong to figure out what the appropriate level of safety is, and I sadly expect this to be the case for a while.

I think this made LessWrong a natural Schelling point of attention

Outsiders are paying attention to our coverage of the coronavirus? To a significant degree?

Traffic is up 30% from last month (which is very significant given that usually most of LessWrong's traffic is driven by the large number of distributed links strewn all around the internet, and so is very stable and already quite high). 

We are also being linked from a lot of places: a lot of Facebook links and a lot of Twitter links. I am not sure how to translate Google Sheets visitor numbers into traffic, but we've pretty consistently had 70+ people concurrently on the LessWrong Coronavirus Link Database, which is more than I've seen on basically any other linked Google Doc, and LessWrong, which usually has around 120k unique users a month, typically gets around 40 concurrent users. So the additional traffic here seems pretty significant.
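To make the comparison concrete, here is a rough back-of-the-envelope sketch of what those concurrent-user numbers imply. It assumes concurrent users scale roughly linearly with monthly uniques, which is only an approximation, and it uses the approximate figures above:

```python
# Rough Fermi estimate: if ~40 concurrent users correspond to ~120k monthly
# uniques on LessWrong, what monthly-unique-equivalent do ~70 concurrent
# viewers on the link database imply?
# (Assumes concurrent users scale linearly with traffic -- a rough approximation.)

baseline_concurrent = 40            # typical concurrent users on LessWrong
baseline_monthly_uniques = 120_000  # typical monthly unique users on LessWrong
doc_concurrent = 70                 # concurrent viewers on the Coronavirus Link Database

uniques_per_concurrent = baseline_monthly_uniques / baseline_concurrent
implied_monthly_equivalent = doc_concurrent * uniques_per_concurrent

print(f"~{implied_monthly_equivalent:,.0f} monthly-unique-equivalents")
# -> ~210,000, i.e. the doc alone is drawing attention on the order of
#    1.5-2x LessWrong's usual monthly audience, if the linear assumption holds.
```

This is only meant to bound the order of magnitude; the real relationship between concurrent viewers and monthly uniques is messier than a linear scaling.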

I have also heard that at least one fairly high-up part of the UK government said they were actively following LessWrong to keep track of the coronavirus situation, though this is a second-degree rumor, so take it with a grain of salt.

My sense is also that there is a broader ecosystem of people around LessWrong and the rationality community (which includes the EA and SSC communities) who ended up putting a lot of their coronavirus-related thoughts here, and who are referring a lot to our content.

I don't think any of this indicates overwhelming amounts of attention on us (which I am mostly glad for), but I do think it indicates attention from a significant subset of smart and informed people that I care a lot about.

I do indeed think that during World War 2 it would have been reasonable for many people on LessWrong to participate in the war effort, and think the same is true in this case.

It feels to me like there are three reasons this could be the case:

  1. Counterfactual impact on the war; if the LWers of the time chose to act instead of not acting, they would shift the probabilities of who ends up winning / what collateral damage happens over the course of the resolution.
  2. Social obligation; if LW had conscientiously objected to doing its part, or thought other things were more important…
habryka (2 points, 4y)
Yep, I think these three perspectives roughly cover why I think it might have been a good idea. I also think that a good number of people we now think of as having had a large impact on x-risk, and who were kind of similar to rationalists (e.g. some of the Manhattan Project scientists), had that impact because they participated in that effort (and the follow-up Cold War period) for roughly the three reasons you cite.
Eli Tyre (2 points, 4y)
It seems important to note that, from my reading of The Making of the Atomic Bomb, the biggest motivator for most of the physicists was the fear that the Nazis would get to the bomb first. This is technically under Vaniver's first point above, but it has a different tenor: it wasn't a dispassionate assessment of counterfactual impact, it was visceral fear. (Relevant quote: …) I'm not sure what the relevance to the current coronavirus situation is.

I'm curious about the third argument: why do you think it is likely that significant players will notice LessWrong's contribution?

This is a good answer.

The fourth argument is just relevance to all of our wellbeing.

My intuition is that from here on out it's going to be hard to find steps we can take that will have even a moderate impact on our wellbeing.

1) We know that we need to avoid contact with others, so I assume we'll all be staying home. Given that we're at home, isolated from others, is there much left to do? Things that go beyond common sense and standard advice, like opening packages outside and disinfecting them?

2) Eventually we'll face the question of when it is safe to en…

habryka (5 points, 4y)
The value of people working on x-risk, from an x-risk perspective, is quite high. So while I sympathize with the conflict in broader terms, in this case, it just seems pretty obvious to me that I care quite a lot about protecting the people in this community, from both a personal and an altruistic perspective.
Kenny (1 point, 4y)
It's going to be hard now, but it was easy before now? I think the site regulars have a comparative advantage in thinking (and writing about those thoughts), and that we'll make (relatively) good judgements about how much attention we should be paying to the pandemic. I think it's just as likely that we will continue to help as much as we have already, which I think has been effective. A lot of this 'work' seems broadly useful too, beyond just the current crisis.

Richard_Ngo

Mar 19, 2020

370

I think LW has way too much coronavirus coverage. It was probably useful for us to marshal information when very few others were focusing on it. That was the "exam" component Raemon mentioned. Now, though, we're stuck in a memetic trap where this high-profile event will massively distract us from things that really matter. I think we should treat this similarly to Slate Star Codex's culture wars, because it seems to have a similar effect: recognise that our brains are built to overengage with this sort of topic, put it in an isolated thread, and quarantine it from the rest of the site as much as possible.

I agree. Although it's interesting, it feels like it's getting outsized attention because it's a "near" threat. I think this made more sense when the wider public wasn't taking COVID-19 seriously, but now that they are I don't think there's a lot of value in LW continuing to focus on it to the extent that it does. I'm not sure if this is an endorsed policy, though, or just my personal annoyance at all the COVID-19 stuff taking up space and attention away from what I still consider a bigger threat in expectation, unaligned AI.

I'm much more inclined to accept, uncritically, posts and questions from users on whatever topics they want to discuss.

I think the pandemic is likely to motivate a lot more non-pandemic work and activity here even if the pandemic continues to account for most of the total activity.

Raemon

Mar 18, 2020

330

Yes, though I think it's important to be asking this question, both now and every few weeks, to check "Hmm, do we actually have comparative advantage here? Have we picked all the low-hanging fruit?"

There are roughly three reasons I see to focus on this:

1. To make sure we're safe.

You can't research x-risk if you're dead, or if your life is disrupted. Right now a lot of stuff is up in the air. Having an accurate model of both the coronavirus itself and the possible downstream economic/political turmoil seems important, at least until we've narrowed down the scope of how bad things are. (Maybe in a month it turns out things aren't that bad, but I think the error bars are wide enough to justify investing another month of thinking and preparation.)

2. For standard EA reasons.

I read your initial question as mostly asking within this frame. Is Coronavirus important, neglected, and tractable? Do we have comparative advantage at it? 

I'm not sure about the answer to this question. On one hand, it's definitely not neglected. But it does seem important and tractable, and I think it is a quite achievable goal for LessWrong to be one of the best places on the internet to discuss it and get information.

My guess is that people who were working professionally on x-risk should most likely continue focusing on that, but I think for a lot of "freelance EA research" types, coronavirus is at least worth considering within the standard EA paradigm. 

But I would not be surprised if the answer was "no, when you factor in the non-neglectedness, the QALYs or other impact here isn't on par with usual EA effort."

3. Tight feedback loops.

This is my biggest crux (after ensuring personal safety). What seems very significant about coronavirus to me is that it gives us a situation where:

a) there is clearly value to marginal thought, from people who aren't necessarily specialists.

b) you will probably get an answer to "was I right?" on the timescale of months or a year, rather than the years or decades typical of most EA work.

I currently think it's worth marshalling LW and EA towards coronavirus, mostly as an Exam to see how competent we are, intellectually and logistically. It's a particularly good time to practice forecasting, research, first-principles thinking, fermi calculations, and collaboration. In the end, we'll a) have a clearer sense of our own capabilities, and b) find it easier than in usual times to signal our competence (assuming we turn out to be competent) to the rest of the world, possibly leveraging that into more people trusting us in more confusing domains.

On 1: How much time do people need to spend reading & arguing about coronavirus before they hit dramatically diminishing marginal returns? How many LW-ers have already reached that point?

On 3a: I'm pretty skeptical about marginal thought from people who aren't specialists actually doing anything - unless you're planning to organise tests or similar. What reason do you have to think LW posts will be useful?

On 3b: It feels like you could cross-apply this logic pretty straightforwardly to argue that LW should have a lot of political discussion…

Overall: my current estimate is that there's about one more month of useful high-focus COVID work to do. Meanwhile, we'll be shipping the "block covid content from frontpage" option within 24 hours, so people sick of COVID content can easily tune it out (assuming they're using LessWrong; hmm, this is a good reminder that we should probably check in with the GreaterWrong peeps about implementing the new tagging features).

Also, if you're using LessWrong and you'd rather not see the "Coronavirus" section at the top of the page, you can already turn it off in your recommendation settings.

...

Answering other comments in more detail:

I do think #1 probably has the least remaining value for people for whom it's a live option to "get supplies and hole up somewhere for months."

The two reasons that I think at least some more thought is worthwhile here are:

  • Some people can't actually hole up forever, and I think those people benefit from having good, up-to-date models that inform them of how risky things actually are.
  • Some people may be worried about economic/political turmoil. I am more worried about those discussions turning Mindkiller-y, and not quite sure what to do about that…

Even for people who are working professionally on x-risk, it's probably harder now, and likely to continue to be harder, to focus on x-risk anyway.

Chris_Leong

Mar 19, 2020

120

Before it was even clear it'd be this big a threat, I wrote: EA Should Wargame Coronavirus. Now I think there's an even stronger argument for it.

I think that this offers us valuable experience in dealing with one plausible source of existential risk. We don't want AI Safety people distracted from AI Safety, but at the same time I think the community will learn a lot by embarking on this project together.

lc

Mar 18, 2020

50

We're going to head into a great depression. This will genuinely, not bullshitting, for reals this time, have enormous ripple effects on all those topics, and LessWrong is right to emphasize it. I never thought I'd say this, but I'm glad we have stopped talking about malaria and x-risk and started talking about what's on CNN.

2 comments

Not really an answer, but a statement and a question: I imagine this is literally the least neglected issue in the world right now. How much does that affect the calculus? How much should we defer to people with more domain expertise?

We should defer to people with more domain expertise exactly as much as we would normally do (all else being equal).

Almost all of what's posted to and discussed on this site is 'non-original work' (or, at best, original derivative work). That's our comparative advantage! Interpreting and synthesizing others' work is what we do best, and this single issue immensely affects both every regular user and any potential visitor.

There's no reason why we can't continue to focus long-term on our current priorities, but the pandemic affects everyone's ability to do so, and I don't think any of us can completely ignore this crisis.