This post is a collection of key questions that feed into AI timelines and AI safety work where it seems like there is substantial interest or disagreement amongst the LessWrong community. You can make a prediction on a question by hovering over the widget and clicking. You can update your...
Ought and LessWrong are excited to launch an embedded interactive prediction feature. You can now embed binary questions into LessWrong posts and comments. Hover over the widget to see other people’s predictions, and click to add your own. Try it out: Elicit Prediction (elicit.org/binary/questions/qqEklFgQG). How to use this: Create a...
At Ought, we’ve started making bets on our continuous predictions. We’ve found it a fun way to hold ourselves accountable and to combat overconfidence. Here’s a thread for people to share continuous beliefs they’re willing to bet on, and make bets on other people’s beliefs. You can use Elicit to...
We made a colab notebook that lets you generate a bet from two people's Elicit distributions. You can edit the notebook to generate your bet (the changes won't be saved). Here's an example of a suggested bet between Ben Pace and SDM on AI timelines: Comparison of predictions: Snapshot link...
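The notebook's exact payout rule isn't reproduced here, but one common scheme for turning two probability distributions into a bet is a logarithmic-score bet: the loser pays the winner in proportion to the log ratio of the probabilities each party assigned to the realized outcome. A minimal sketch, where the function name and the discrete decade-bucket distributions are illustrative rather than taken from the notebook:

```python
import math

def log_score_bet(p_a, p_b, outcome):
    """Payoff that A receives from B under a logarithmic-score bet.

    p_a, p_b: dicts mapping each possible outcome to a probability.
    outcome: the outcome that actually occurred.
    A positive payoff means A's distribution assigned more probability
    to the realized outcome than B's did.
    """
    return math.log(p_a[outcome] / p_b[outcome])

# Hypothetical AGI-timeline distributions over decade buckets.
alice = {"2030s": 0.5, "2040s": 0.3, "2050s+": 0.2}
bob   = {"2030s": 0.2, "2040s": 0.3, "2050s+": 0.5}

# If AGI arrives in the 2030s, Alice (who assigned it more
# probability) receives a positive payoff from Bob.
payoff = log_score_bet(alice, bob, "2030s")
```

A nice property of this rule is that each party maximizes their expected payoff by reporting their honest distribution, which fits the thread's goal of combating overconfidence.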
This is a thread for displaying your probabilities of an existential catastrophe that causes extinction or the destruction of humanity’s long-term potential. Every answer to this post should be a forecast showing your probability of an existential catastrophe happening at any given time. For example, here is Michael Aird’s timeline:...
It’s been exciting to see people engage with the AI forecasting thread that Ben, Daniel, and I set up! The thread was inspired by Alex Irpan’s AGI timeline update, and our hypothesis that visualizing and comparing AGI timelines could generate better predictions. Ought has been working on the probability distribution...
This is a thread for displaying your timeline until human-level AGI. Every answer to this post should be a forecast; in this case, a forecast showing your AI timeline. For example, here are Alex Irpan’s AGI timelines. The green distribution is his prediction from 2015, and the orange distribution is...