What is the highest probability of extinction under which a wise man would be willing to proceed? As I see it, any probability greater than 1 in 10,000 over 100 years is absurd. Maybe 1 in 100,000 is more like it. One in a million would be reasonable, from the perspective of Pascal's Wager. (Not accounting for potential benefit on the upside.)
I'm interested in the numbers other folks would be willing to tolerate.
I teach freshman Rhetoric & Writing at uni. We focus on persuasion. May I use this essay as an assigned reading? It works well because it articulates a fine-grained persuasive strategy in a context the students are probably going to care about.
I am overhauling my whole curriculum this summer to make it immediately relevant to how students will understand and navigate the world they'll be graduating into in four years.
Thanks for a great article. I am so frustrated and dumbfounded by our (the U.S.A.'s) lack of federal response.
You told another commenter that you were looking into some work with lawmakers in the U.S. The sooner the better!
If I may take issue with your approach (and I'm sure you are far more tuned in to the situation than I am, so please tell me what I'm not seeing; I've only become aware of all this in the past 5 days), I wonder why you didn't mention China in the article. It seems to me that policymakers' attention might be piqued by appealing to a familiar threat, and they will have to factor China into their decision-making anyway. Do you agree with folks who see the U.S. confronted with a choice between slowing things down for AI safety and speeding things up to outpace China? I'm assuming the U.S. will nationalize these efforts at some point to provide security and CoC.
Thanks again, so much. Please keep going!
I suppose part of the strategy in approaching folks with this is knowing when and what to hold back, especially on an initial cold call.
Thank you again for your work. Thank you 100x.