How well can the people on this website predict an AGI apocalypse?

I get the impression that a lot of people here are very confident it will happen relatively soon, and I think I've figured out a way to test that. The spreadsheet for the data would look something like this. The questionnaire would be asked each year, and the average percentage for each question would be used.
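A minimal sketch of what the yearly aggregation might look like, assuming each respondent gives P(AGI apocalypse within N years) as a percentage for horizons N = 1..25 (the horizon count and the respondent data here are illustrative, not from the post):

```python
# Hypothetical aggregation of one year's questionnaire responses.
# Each respondent reports P(AGI apocalypse within N years), in percent,
# for horizons N = 1..25; the yearly spreadsheet row is the average
# percentage for each horizon across all respondents.

def average_responses(responses):
    """responses: list of dicts mapping horizon (years) -> probability (%)."""
    horizons = range(1, 26)
    return {n: sum(r[n] for r in responses) / len(responses) for n in horizons}

# Two made-up respondents: probabilities grow with the horizon, capped at 100%.
respondents = [
    {n: min(100, 2 * n) for n in range(1, 26)},   # more cautious respondent
    {n: min(100, 4 * n) for n in range(1, 26)},   # more pessimistic respondent
]

print(average_responses(respondents)[25])  # → 75.0 (average P(within 25 years))
```

Repeating this each year would give one averaged row per year, which is exactly the time series the post proposes to watch.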

The interesting thing is that I think I know what it will look like over time.

The smallest red box would be your prediction for the first year; it would probably be very low. The largest box (in this case) would be the probability that it happens within 25 years. As time passes and nothing happens, the graph would simply shift left.
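The "shift left" pattern can be sketched concretely. Assuming forecasters implicitly believe in a roughly fixed arrival date, then after a year passes with no AGI, this year's P(within N years) should equal last year's P(within N+1 years); the numbers below are illustrative only:

```python
# Hypothetical illustration of the "shift left" prediction pattern:
# if beliefs centre on a fixed arrival date, each elapsed uneventful year
# moves every probability one horizon closer.

def shift_left(curve):
    """curve[i] = P(within i+1 years), in percent. Drop the 1-year entry
    and repeat the final value for the longest horizon, modelling one
    elapsed year with no event."""
    return curve[1:] + [curve[-1]]

year_0 = [1, 3, 7, 15, 30, 50]   # made-up percentages for horizons 1..6
year_1 = shift_left(year_0)
print(year_1)  # → [3, 7, 15, 30, 50, 50]
```

If the observed yearly averages do *not* follow this shift, that itself would be informative: it would suggest people are re-anchoring their timelines each year rather than counting down to a fixed date.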

What do you think about this approach?

Regards, theflowerpot


2 comments:

I'm not sure what you'd learn, other than that breakthroughs are hard to predict.  The thing your predicted results don't include is when it actually happens, and you'd have to figure out which of the predictions turned out to be correct.

Personally, I don't expect to see a foom, I expect a gradual increase in power of multiple corporate- and government-affiliated AIs over the next ~30 years, with super-linear resources behind it causing sub-linear growth in power, but eventually crossing some threshold where control is no longer believably held by humans.  I predict we won't agree on what year this threshold is crossed, but we'll probably be able to identify a decade.

Not conclusive, but still worth doing in my view, given how easy it would be. Create the spreadsheet, make it public, and let's see how it goes.

I would add the actual year in which you think it will happen.