This still feels more analogous to Chernobyl? "The other team is going to get cheap nuclear energy first if we don't, and we prefer a nuclear accident to losing, so we might as well push ahead."

You might argue that obviously it doesn't matter very much who gets nuclear energy first, so this wouldn't apply. I'd respond that the benefit:cost ratio here seems similar to the benefit:cost ratio for AI, where the benefit is "we build a singleton" and the cost is "misaligned AGI causes extinction". Surely it…

AI Alignment Open Thread August 2019

by habryka · 1 min read · 4th Aug 2019 · 96 comments



Crossposted from the AI Alignment Forum. May contain more technical jargon than usual.

This is an experiment in having an Open Thread dedicated to AI Alignment discussion, hopefully enabling researchers and upcoming researchers to ask small questions they are confused about, share very early-stage ideas, and have lower-key discussions.