¯\_(ツ)_/¯
Obviously just one example, but Schneier has generally been quite skeptical, and he blurbed the book.
but subconsciously I notice that MIRI was cleaning house before the book launch (e.g. taking down EY's light novel because it might look bad)
Do you have any other concrete example here besides the novel?
Maybe that's wrong; maybe the issue was lack of reach rather than exhausting the supply of persuadable people, and the book packaging plus timing will succeed massively. We'll see.
This is certainly the hope. Most people in the world have never read anything that anyone here has ever written on this subject.
FWIW, and obviously this is just one anecdote: a member of Congress who read an early copy, and really enjoyed it, said that Chapter 2 was his favorite chapter.
Huh, I thought I fixed this. Thanks for flagging; I'll make sure it's fixed now.
Also, oddly, the US version is on many of Amazon's international stores, including the German store ¯\_(ツ)_/¯
Schneier is also quite skeptical of the risk of extinction from AI. Here's a table o3 generated just now when I asked it for some examples.
| Date | Where he said it | What he said | Take-away |
|---|---|---|---|
| 1 June 2023 | Blog post “On the Catastrophic Risk of AI” (written two days after he signed the CAIS one-sentence “extinction risk” statement) | “I actually don’t think that AI poses a risk to human extinction. I think it poses a similar risk to pandemics and nuclear war — a risk worth taking seriously, but not something to panic over.” (schneier.com) | Explicitly rejects the “extinction” scenario, placing AI in the same (still-serious) bucket as pandemics or nukes. |
| 1 June 2023 | Same post, quoting his 2018 book *Click Here to Kill Everybody* | “I am less worried about AI; I regard fear of AI more as a mirror of our own society than as a harbinger of the future.” (schneier.com) | Long-standing view: most dangers come from how humans use technology we already have. |
| 9 Oct 2023 | Essay “AI Risks” (New York Times, reposted on his blog) | Warns against “doomsayers” who promote “Hollywood nightmare scenarios” and urges that we “not let apocalyptic prognostications overwhelm us.” (schneier.com) | Skeptical of the extinction narrative; argues policy attention should stay on present-day harms and power imbalances. |
LFG!
#7 Combined Print & E-Book Nonfiction
#8 Hardcover Nonfiction