All of abukeki's Comments + Replies

Some abstract, non-technical reasons to be non-maximally-pessimistic about AI alignment

MIRI's 2017 fundraiser post included a strategic explanation I found very insightful: the concept of the "acute risk period".

I currently translate AGI-related texts to Russian. Is that useful?

Yes, but I think much more useful might be for someone to do this for Chinese.

First Strike and Second Strike

Those 3 new silo fields are the most visible, but I'd guess China is expanding the mobile arm of its land-based DF-41 force (TELs) by a similar amount; you just don't see that on satellite images. The infrastructure enabling Launch on Warning is also being implemented, which will make those silos much more survivable, though this of course also greatly increases the risk of accidental nuclear war. I'd argue that those silo fields are destabilizing, especially if China decides to deploy the majority of their land-based force that way, because even with a Launch ...

Postmodern Warfare

Can you give some examples of who in the "rationalist-adjacent spheres" are discussing it?

How to think about and deal with OpenAI

I'm aware. I'm just saying a new effort is still needed: judging from all his recent public comments on the topic, and from what he's trying to do with Neuralink etc., his thoughts on alignment/AI risk are still clearly very misguided, so someone really needs to reach out and set him straight.

How to think about and deal with OpenAI

Agree that we should reach out to him and that the community is connected enough to do so. If he's concerned about AI risk but is either misguided or doing harm (see e.g. here/here and here), then someone should just... talk to him about it? The richest man in the world can do a lot either way. (Especially someone as addicted to launching things as he is; who knows what detrimental thing he might do next if we're not more proactive.)

I get the impression the folks at FLI are closest to him so maybe they are the best ones to do that.

Ruby (8 points, 3 months ago): I believe people have spoken to him. For one thing, he was on a panel at EAG 2015.