[ Question ]

Is microCOVID "badness" superlinear?

by Optimization Process · 1 min read · 13th Aug 2021 · 2 comments


That is: if I have a choice between 200 microCOVIDs now, or 100 now and 100 next month, does it matter which one I pick? (For bonus points: how big is the difference? And does the answer change if it's 20000 vs 10000/10000?)

(Alternatively: what keywords would I search to find studies on this? I'm sure data exists, but I'm coming up empty on Google.)

On the one hand, straightforwardly no: in my cohort of infinite parallel selves, every microCOVID we take on makes almost exactly one-in-a-million of us get sick (at least until a substantial fraction of us are sick, i.e. I've taken on tens of thousands of microCOVIDs), so every microCOVID is equally bad. (Something something axiom of independence, and we're all VNM-rational agents, right?)
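The "straightforwardly no" view is easy to sanity-check numerically. This sketch uses the standard one-in-a-million reading of a microCOVID and assumes independence between exposures; under those assumptions, 200 at once and 100 + 100 are mathematically identical:

```python
def infection_prob(microcovids: int) -> float:
    """Risk of at least one infection from that many independent 1e-6 exposures."""
    return 1 - (1 - 1e-6) ** microcovids

# 200 microCOVIDs now vs. 100 now and 100 next month (independent batches):
all_at_once = infection_prob(200)
split = 1 - (1 - infection_prob(100)) ** 2
print(all_at_once, split)  # identical, and both within a hair of 200e-6
```

At these scales the quadratic correction is on the order of 2 × 10⁻⁸, which is why adding microCOVIDs linearly works fine for everyday numbers.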

On the other hand, maybe yes, because of complicated biology reasons, where if you inhale 1M viruses over the course of a year, at some point you'll probably get COVID, but if you snort them all at once then you're, I dunno, effectively giving the infection a head start of several doubling times, and you're gonna get COVID real bad.
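The "head start" intuition can be quantified under a toy model: viral load grows exponentially from the initial inoculum until an immune response kicks in at a roughly fixed time. Both the doubling time and the fixed-response-time assumption here are invented for illustration, not real virology:

```python
import math

DOUBLING_TIME_HOURS = 6  # assumed; real figures vary widely

def head_start_hours(dose_ratio: float) -> float:
    """Hours of 'free' exponential growth a dose_ratio-times-larger
    inoculum buys, if the immune response arrives at a fixed time."""
    return math.log2(dose_ratio) * DOUBLING_TIME_HOURS

# 1M virions at once vs. the same total spread over 365 daily exposures:
print(head_start_hours(365))  # ~51 hours of extra doubling time
```

So in this cartoon, a 365× larger single dose buys the infection about log₂(365) ≈ 8.5 extra doublings, which is the mechanism by which one big exposure could be worse than the same total spread out.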


2 Answers

There is actually a sublinear aspect, which is easier to see (because the effect is larger) for large risks. Say you do two 100,000-microCOVID (i.e., 10% chance of catching COVID) events. Your chance of not catching COVID from one event is 90%, so your chance of not catching it from either event is 90% × 90% = 81%, and your chance of catching it from at least one of them is 19%. So 100,000 microCOVIDs + 100,000 microCOVIDs = 190,000 microCOVIDs. Adding microCOVIDs by the ordinary rules of arithmetic is a good approximation for relatively small numbers of microCOVIDs, but the true combination of risks is sublinear; it does not follow normal arithmetic.
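The arithmetic above in code (a sketch; independence between the events is assumed):

```python
def combine(*risks: float) -> float:
    """Probability of catching COVID from at least one of several
    independent events, each given as a probability in [0, 1]."""
    p_safe = 1.0
    for r in risks:
        p_safe *= 1 - r
    return 1 - p_safe

# Two 100,000-microCOVID (10%) events:
print(combine(0.10, 0.10))  # ~0.19, i.e. 190,000 microCOVIDs, not 200,000
```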

That said, in principle I think you are right that two covid-inhaling events close enough in time will add in a superlinear way for biology reasons. That could be, as you suggest, getting a worse case of covid. It could also be catching covid when neither event was bad enough on its own to do that. Let's say the threshold to be infected is 100 virus particles (I have no idea of the actual number), and each event gives you 75 virus particles. If they happen far enough apart, you don't get covid, but if they happen one right after the other, you do. Overall, I expect this effect to be small. Unless you are doing super risky things, or working in the covid ward of your local hospital, you probably aren't encountering two infected people right in a row.
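The threshold intuition as a toy model (the 100-particle threshold is the answer's made-up number, and the daily-halving clearance rate is an invented assumption):

```python
INFECTION_THRESHOLD = 100    # particles; made-up number, as in the answer
CLEARANCE_HALF_LIFE_DAYS = 1.0  # assumed: residual dose halves each day

def infected(dose1: float, dose2: float, days_apart: float) -> bool:
    """Do two sub-threshold exposures combine to cross the threshold,
    given that the first dose decays before the second arrives?"""
    residual = dose1 * 0.5 ** (days_apart / CLEARANCE_HALF_LIFE_DAYS)
    return residual + dose2 >= INFECTION_THRESHOLD

print(infected(75, 75, days_apart=0.1))  # back-to-back: True
print(infected(75, 75, days_apart=7))    # a week apart: False
```

The qualitative point survives any particular choice of constants: two exposures each below the threshold can cross it only if they land within a few clearance half-lives of each other.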

If the probabilities of catching COVID on two occasions are x and y, then the probability of catching it at least once is 1 − (1 − x)(1 − y), which equals x + y − xy. So if x and y are large enough for xy to be significant, splitting is better: even though catching it the second time would increase your viral load, it's not going to make the illness twice as bad as it already was.