Letter to Heads of AI labs

by samuelshadrach
11th Oct 2025
3 min read

2 comments, sorted by top scoring

StanislavKrym:

  • P(Human extinction by 2030) = 25% * 30% = 7.5%
  • P(Permanent world dictatorship by winner by 2030) = 25% * 40% = 10%

Imagine that you received such a letter. Then asking you to quit the race would reduce P(your dictatorship) from 3.3% to 0. But how does it reduce P(extinction)? Maybe a better argument is that alignment is hard AND that alignment to anyone's dictatorship or another dystopian future (e.g. North Korea letting most of its population starve) is WAY harder?

samuelshadrach (edited):

An AI lab head could do a lot to prevent extinction if they did not run an AI lab. For starters, they could make it their (and their org's) full-time job to convince their entire network that extinction is coming. Then they could try convincing the public and running in a US election to get an AI pause.

But yes, I haven't spelled out a detailed alternative plan for what to do if you're a well-networked billionaire trying to fix AI risk, and it is worth doing so.

This might have relevant stuff: Support the movement against AI xrisk

Letter to Heads of AI labs

This letter is addressed

  • To Elon Musk
  • To Shane Legg, Demis Hassabis, Mustafa Suleyman
  • To Sam Altman, Greg Brockman
  • To Ilya Sutskever
  • To Dario and Daniela Amodei, Jack Clark, Jared Kaplan, Chris Olah, Tom Brown, Sam McCandlish
  • To anyone else who belongs on the above list but whom I may have missed

Summary

  • The probability of human extinction is likely way higher than the probability of the AI race being winnable and you being the one who wins it.
  • The probability of someone else building a world dictatorship you have no power in is greater than the probability of you building the world dictatorship. If you lose the race, I am not optimistic you will get a seat on the coalition that wins. ASI radically centralises power.
  • People around you are probably a terrible guide for how you should think and feel about these decisions.

Why you might want ASI

I understand there are many reasons you might care a lot about creating ASI.

  • You might want to create a future where resources are abundant
  • You might want to become immortal
  • You might want to become world dictator
  • Maybe something else

Probability of me becoming world dictator

I'm sure you also have other frameworks for approaching life decisions besides using probabilities like this. You can still spend 15 minutes of your time thinking through the conclusions from this framework, and then discard it.

If I were in your shoes as an AI lab head, as of 2025-10, here are the probabilities I would assign to the following outcomes by 2030.

  • P(ASI by 2030) = 25%
    • P(Human extinction by 2030) = 25% * 30% = 7.5%
    • P(Permanent world dictatorship by winner by 2030) = 25% * 40% = 10%
      • P(I become permanent world dictator) = 25% * 40% * 33% = 3.3%
      • P(Someone who is not me becomes permanent world dictator) = 6.7%
    • P(Unknown unknown outcome by 2030) = 25% * 30% = 7.5%
  • P(No ASI by 2030 but AI race is still on) = 50%
    • P(I become a trillionaire, and AI race is on) = 25%
  • P(No ASI by 2030 and AI hits a wall) = 25%
    • P(I become a centibillionaire, and AI hits a wall) = 15%

I would not personally accept this deal if I were in your shoes.

  • The probability of human extinction is likely way higher than the probability of the AI race being winnable and you being the one who wins it. 7.5% > 3.3%
  • The probability of someone else building a world dictatorship you have no power in is greater than the probability of you building the world dictatorship. If you lose the race, I am not optimistic you will get a seat on the coalition that wins. ASI radically centralises power. 6.7% > 3.3%
  • There is no use in being a trillionaire if the AI race doesn't stop; you will just end up facing the same Russian roulette again in the next 5 years after that.
  • My numbers are all non-robust guesswork. I highly recommend you plug in your own numbers. Don't just talk about plugging in the numbers; actually plug them in and see what it means (see the sketch after this list).
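
If it helps to actually plug numbers in, here is a minimal sketch of the arithmetic above in Python (my choice of language, not part of the original letter). It mirrors the list: an overall P(ASI by 2030) multiplied by conditional probabilities for each outcome. Every number is the same guess as in the list and is meant to be replaced with your own.

```python
# Minimal sketch of the probability tree above. All numbers are guesses;
# replace them with your own and re-run.

p_asi = 0.25                         # P(ASI by 2030)
p_no_asi_race_on = 0.50              # P(no ASI by 2030, race still on)
p_no_asi_wall = 0.25                 # P(no ASI by 2030, AI hits a wall)
assert abs(p_asi + p_no_asi_race_on + p_no_asi_wall - 1.0) < 1e-9

# Conditional on ASI by 2030:
p_extinction = p_asi * 0.30                        # 7.5%
p_dictatorship = p_asi * 0.40                      # 10%
p_me_dictator = p_dictatorship * 0.33              # ~3.3%
p_other_dictator = p_dictatorship - p_me_dictator  # ~6.7%
p_unknown = p_asi * 0.30                           # 7.5%

print(f"P(extinction)               = {p_extinction:.1%}")
print(f"P(I become dictator)        = {p_me_dictator:.1%}")
print(f"P(someone else becomes one) = {p_other_dictator:.1%}")
print(f"Extinction vs. my win:        {p_extinction:.1%} > {p_me_dictator:.1%}")
print(f"Someone else's win vs. mine:  {p_other_dictator:.1%} > {p_me_dictator:.1%}")
```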

Social reality does not matter

This is maybe obvious to you already

  • How much yield a nuclear bomb will output does not depend on the feelings of those deploying it. Whether you feel excited or anxious or bored or depressed, the yield is the same.
  • People around you are probably a terrible guide for how you should think and feel about these decisions.
  • The stakes are unprecedented by orders of magnitude.
    • Genghis Khan wiped out ~5% of the world's population in order to rule over ~25% of it. No leader in history has managed to build a world dictatorship, let alone an immortal one. If you build safe ASI, that person could be you.
    • Adolf Hitler caused ~11 million deaths via the Holocaust, and World War 2 caused around ~70 million deaths. If the race to ASI leads to human extinction, that is on the order of 100 world wars or 1000 Holocausts combined (a rough check of the arithmetic follows this list). If you build unsafe ASI, you might cause this.
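
A rough order-of-magnitude check of that comparison, assuming a world population of about 8 billion (my assumption, not a figure from the letter):

```python
# Order-of-magnitude check for "100 world wars or 1000 Holocausts".
world_population = 8_000_000_000   # assumed ~8 billion people
ww2_deaths = 70_000_000            # ~70 million, as stated above
holocaust_deaths = 11_000_000      # ~11 million, as stated above

print(world_population / ww2_deaths)        # ~114, i.e. on the order of 100 world wars
print(world_population / holocaust_deaths)  # ~727, i.e. on the order of 1000 Holocausts
```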

Force versus persuasion

I do want your lab to be forcibly shut down in order to prevent these outcomes, but before pushing for that, I wanted to make one last attempt to persuade you to voluntarily shut down your lab.