2021 MIRI Conversations

Nov 15, 2021 by Rob Bensinger

This sequence is a chronological series of chatroom conversation logs about artificial general intelligence. It covers a wide range of topics, beginning with conversations on alignment difficulty.

Short summaries of each post, and links to audio versions, are available here. There are also two related posts released shortly before this sequence:

  • Discussion with Eliezer Yudkowsky on AGI Interventions
  • Comments [by Nate Soares] on Joe Carlsmith's "Is power-seeking AI an existential risk?"

Rob Bensinger edited and posted this sequence, and Matthew Graves helped with much of the formatting.


Part One (Primarily Richard Ngo and Eliezer Yudkowsky)

  • Ngo and Yudkowsky on alignment difficulty (Eliezer Yudkowsky, Richard_Ngo)
  • Ngo and Yudkowsky on AI capability gains (Eliezer Yudkowsky, Richard_Ngo)
  • Yudkowsky and Christiano discuss "Takeoff Speeds" (Eliezer Yudkowsky)
  • Soares, Tallinn, and Yudkowsky discuss AGI cognition (So8res, Eliezer Yudkowsky, jaan)

Part Two (Primarily Paul Christiano and Eliezer Yudkowsky)

  • Christiano, Cotra, and Yudkowsky on AI progress (Eliezer Yudkowsky, Ajeya Cotra)
  • Biology-Inspired AGI Timelines: The Trick That Never Works (Eliezer Yudkowsky)
  • Reply to Eliezer on Biological Anchors (HoldenKarnofsky)
  • Shulman and Yudkowsky on AI progress (Eliezer Yudkowsky, CarlShulman)
  • More Christiano, Cotra, and Yudkowsky on AI progress (Eliezer Yudkowsky, Ajeya Cotra)
  • Conversation on technology forecasting and gradualism (Richard_Ngo, Eliezer Yudkowsky, Rohin Shah, Rob Bensinger)

Part Three (Varied Participants)

  • Ngo's view on alignment difficulty (Richard_Ngo, Eliezer Yudkowsky)
  • Ngo and Yudkowsky on scientific reasoning and pivotal acts (Eliezer Yudkowsky, Richard_Ngo)
  • Christiano and Yudkowsky on AI predictions and human intelligence (Eliezer Yudkowsky)
  • Shah and Yudkowsky on alignment failures (Rohin Shah, Eliezer Yudkowsky)
  • Late 2021 MIRI Conversations: AMA / Discussion (Rob Bensinger)