For example, you might think:

It's likely that AGI will be invented before 2050; however, if it isn't, then that must mean either that AGI is impossible, or that it requires much more advanced technology than I currently think it does, or else that there was some kind of large-scale civilizational collapse in the meantime.

For that matter, any non-exponential distribution has this property: the non-occurrence of the event by a certain time changes your expectation of it going forward. I'm curious whether people think this is the case for AGI, and if so, why. (Also curious if this question has been asked before.)
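
To make the memorylessness point concrete, here is a minimal numerical sketch (my own illustration; the distributions and parameters are assumptions, not anything from the question) comparing how the chance of "AGI within the next 10 years" updates after 30 years of non-occurrence under an exponential versus a log-normal timeline distribution:

```python
# Minimal sketch: memoryless (exponential) vs. non-memoryless (log-normal) timelines.
# All parameters are assumptions chosen only for illustration.
from scipy import stats

horizon, elapsed = 10, 30  # years

for name, dist in [
    ("exponential (memoryless)", stats.expon(scale=30)),              # assumed mean of 30 years
    ("log-normal (not memoryless)", stats.lognorm(s=1.0, scale=30)),  # assumed median of 30 years
]:
    p_now = dist.cdf(horizon)  # P(T <= 10) asked today
    # P(T <= elapsed + horizon | T > elapsed): the same question asked after 30 years of waiting
    p_later = (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / dist.sf(elapsed)
    print(f"{name}: now {p_now:.2f}, after {elapsed} years of waiting {p_later:.2f}")
```

Under the exponential the two numbers come out identical; under the log-normal they differ, which is exactly the kind of update the question is asking about.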

1 answer

p.b.

The distribution is bumpy because either "normal" Deep Learning progress will get us there, or there is a big roadblock ahead that will require a major scientific breakthrough.

The Deep Learning scenario creates a bump within the next two decades, I would say.

Whole brain simulation could create another bump, but I don't know where.

The "major scientific breakthrough" scenario doesn't create a bump. It could've happened yesterday. 

1 comment

It's hard to come up with a reasonable probability distribution for a one-off event; it's not clear what the reference class might be. But my guess is that it would be some form of power law, because power laws are universal and scale-independent. No idea about the power exponent, though.
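
A small numerical sketch of the power-law suggestion (my own illustration; the exponent and minimum scale are assumptions, since the comment doesn't give any):

```python
# Sketch of a Pareto (power-law) timeline: survival P(T > t) = (t_min / t)**alpha for t >= t_min.
# The exponent and minimum scale are assumed for illustration only.
from scipy import stats

alpha, t_min = 1.5, 5.0                    # assumed power exponent and minimum scale (years)
dist = stats.pareto(b=alpha, scale=t_min)

for elapsed in (10, 20, 40, 80):
    # Probability of AGI in the next decade, given none after `elapsed` years.
    p_next_decade = (dist.cdf(elapsed + 10) - dist.cdf(elapsed)) / dist.sf(elapsed)
    # For a Pareto tail, T | T > elapsed is again Pareto with minimum `elapsed`,
    # so the expected remaining wait has the closed form elapsed / (alpha - 1).
    expected_further_wait = elapsed / (alpha - 1)
    print(f"after {elapsed:2d} years: P(within next 10y) = {p_next_decade:.2f}, "
          f"expected further wait ≈ {expected_further_wait:.0f} years")
```

The closed-form remaining wait, elapsed / (alpha - 1), is the Lindy-style property of a power-law tail: the longer the event hasn't happened, the longer you should expect to keep waiting.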