
Event Horizon Thesis


The event horizon thesis states that, once smarter-than-human intelligence arises, the result will be alien and unpredictable in a way qualitatively different from the results of other technological advances. In this view, we cannot see beyond the singularity, just as we cannot see beyond a black hole's event horizon.

Eliezer Yudkowsky names this idea as one of the three major singularity schools, attributing it to Vernor Vinge, who describes the singularity as "a point where our models must be discarded and a new reality rules".

An argument in favor of such unpredictability goes as follows. Suppose you could always predict what a superintelligence would do. Then you, yourself, would be a superintelligence — but you are not a superintelligence.
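
A minimal way to put this argument in symbols (the notation, a policy $\pi_S$ for the superintelligence $S$ and a predictor $P$, is introduced here for illustration and is not part of the original argument): if $P$ could compute the superintelligence's choice in every situation $x$, then $P$ could simply act on those predictions and so match its performance everywhere.

$$\big(\forall x:\; P(x) = \pi_S(x)\big) \;\implies\; \mathrm{Perf}(P) \ge \mathrm{Perf}(S)$$

Since we cannot match a superintelligence's performance, we cannot compute $\pi_S$ in full.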

However, this argument does not rule out all predictions. In particular, if we can predict what a superintelligence's goals will be, we can predict that it will probably achieve those goals, even if we don't know by what method. The predictions involved in Friendly AI tend to be of this nature.
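
To illustrate the distinction (again with notation introduced here rather than taken from the original), we may be unable to predict the particular plan, the sequence of actions $a_1, a_2, \dots$ the superintelligence takes, while still predicting the outcome:

$$\Pr\big(\text{the world ends up satisfying goal } G\big) \approx 1,$$

provided we know its goal $G$ and that it is capable enough to achieve $G$ by some means.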

External links

  • Knowability of FAI, by Eliezer Yudkowsky
  • The Coming Technological Singularity, by Vernor Vinge
  • Three Major Singularity Schools, by Eliezer Yudkowsky

See also

  • Technological singularity
  • Superintelligence
  • Eliezer Yudkowsky