Munk AI debate: confusions and possible cruxes
Abhinav Srivastava · 2y · 10

(If astronomers found a giant meteor projected to hit the earth in the year 2123, nobody would question the use of the term “existential threat”, right??)


I'm wondering whether you believe that AGI risk is actually equivalent to a giant meteor hitting the Earth, or whether that was just a throwaway analogy. This would help me get a better sense of where x-risk concern-havers (?) stand on the urgency of the risk. /gen

Thanks 
