Covid 5/13: Moving On

Honest question: Why are people not concerned about 1) long COVID and 2) variants?


Is there something (or some things) that I haven't read that other people have? I haven't been following closely...


My best guess is:

1) There's good reason to believe vaccines protect you from it (but I haven't personally seen that)

2) We'll hear about them if they start to be a problem


1/2) Enough people are getting vaccinated that rates of COVID and infectiousness are low, so it's becoming unlikely that you'd be exposed to a significant amount of it in the first place.



What does vaccine effectiveness as a function of time look like?

From that figure, it looks to me like roughly 0 protection until day 10 or 11, and then near perfect protection after that.  Surprisingly non-smooth!
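As a toy sketch of the step-like shape I'm describing (illustrative only; the actual curve comes from the trial figure, and the `onset_day` and `plateau` values here are placeholders):

```python
# Toy step-function model of vaccine effectiveness over time.
# Parameters are hypothetical, chosen to mimic "roughly 0 until
# day ~10, then near-perfect protection after that".

def toy_effectiveness(day, onset_day=10, plateau=0.95):
    """Roughly zero protection before onset_day, near-perfect after."""
    return 0.0 if day < onset_day else plateau

print(toy_effectiveness(5))   # 0.0
print(toy_effectiveness(14))  # 0.95
```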

How many micromorts do you get per UV-index-hour?

Oh yeah, sorry I was not clear about this...

I am actually trying to consider the effects via cancer risk in isolation, ignoring the potential benefits (which I think go beyond just Vitamin D... there's probably a lot of stuff happening that we don't understand... it certainly seems to have an effect on mood, for example).
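For what it's worth, the unit conversion itself is simple once you have the inputs. A minimal sketch, with clearly hypothetical numbers (the added-risk and exposure figures below are placeholders, not real epidemiology):

```python
# Toy conversion: micromorts per UV-index-hour.
# A micromort is a one-in-a-million chance of death.
# All numeric inputs below are HYPOTHETICAL placeholders.

MICROMORT = 1e-6

def micromorts_per_uv_index_hour(added_lifetime_death_risk, total_uv_index_hours):
    """Spread an added lifetime death risk evenly over total exposure."""
    return added_lifetime_death_risk / total_uv_index_hours / MICROMORT

# e.g. a hypothetical 0.1% added lifetime death risk spread over
# 10,000 lifetime UV-index-hours of exposure:
print(micromorts_per_uv_index_hour(0.001, 10_000))  # 0.1
```

The hard part is estimating the inputs (cancer-attributable death risk per unit of UV exposure), not the arithmetic.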

How many micromorts do you get per UV-index-hour?

Looks like just correlations, tho(?)
I basically wouldn't update on a single study that only looks at correlation.

AI x-risk reduction: why I chose academia over industry

You can try to partner with industry, and/or advocate for big government $$$.
I am generally more optimistic about toy problems than most people, I think, even for things like Debate.
Also, scaling laws can probably help here.

AI x-risk reduction: why I chose academia over industry

Um, sort of, modulo a type error... risk is risk. It doesn't mean the thing has happened (we need to start using some phrase like "x-event" or something for that, I think).

AI x-risk reduction: why I chose academia over industry

Yeah we've definitely discussed it!  Rereading what I wrote, I did not clearly communicate what I intended to... I wanted to say that "I think the average trend was for people to update in my direction".  I will edit it accordingly.

I think the strength of the "usual reasons" has a lot to do with personal fit and what kind of research one wants to do.  Personally, I basically didn't consider salary as a factor.

AI x-risk reduction: why I chose academia over industry

When you say academia looks like a clear win within 5-10 years, is that assuming "academia" means "starting a tenure-track job now?" If instead one is considering whether to begin a PhD program, for example, would you say that the clear win range is more like 10-15 years?


Also, how important is being at a top-20 institution? If the tenure track offer was instead from University of Nowhere, would you change your recommendation and say go to industry?

My cut-off was probably somewhere between top-50 and top-100, and I was prepared to go anywhere in the world.  If I couldn't make it into the top 100, I think I would definitely have reconsidered academia.  If you're ready to go anywhere, I think it makes it much easier to find somewhere with high EV (though you might have to move up the risk/reward curve a lot).

Would you agree that if the industry project you could work on is the one that will eventually build TAI (or be one of the leading builders, if there are multiple) then you have more influence from inside than from outside in academia?

Yes.  But of course it's hard to know if that's the case.  I also think TAI is a less important category for me than x-risk-inducing AI.

"Beliefs" vs. "Notions"

Thanks!  Quick question: how do you think these notions compare to factors in an undirected graphical model?  (This is the closest thing I know of to how I imagine "notions" being formalized).
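To spell out the comparison I have in mind: an undirected graphical model represents a joint distribution as a normalized product of non-negative factors over cliques,

$$p(x) = \frac{1}{Z} \prod_{c \in C} \phi_c(x_c), \qquad Z = \sum_{x} \prod_{c \in C} \phi_c(x_c),$$

where each factor $\phi_c$ scores local configurations without itself being a probability. That "local, unnormalized scoring" quality is what makes factors feel like a candidate formalization of "notions" to me.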

"Beliefs" vs. "Notions"

Cool!  Can you give a more specific link please?
