Best resource to go from "typical smart tech-savvy person" to "person who gets AGI risk urgency"?
Answer by Lowell Dennings, Oct 16, 2022

ThomasW recommended [1] Unsolved Problems in ML Safety, [2] X-Risk Analysis for AI Research, and [3] Is Power-Seeking AI an Existential Risk? He said [3] works well for people who are highly open to weird ideas and motivated by x-risk; for those who are less open, [1] is a better fit, since it covers concrete research areas and carries ML credibility. He also said he wouldn't share Yudkowsky's writing with ML people, since openings built on x-risk and alarmism tend to put readers off. Personally, I like Is Power-Seeking AI an Existential Risk? for its writing style, and it's a fairly comprehensive introduction. There's also a bounty for AI Safety Public Materials.
