Singularity



The Singularity or Technological Singularity is a term with a number of different meanings, ranging from a period of rapid change to the creation of greater-than-human intelligence.

See also: Intelligence explosion, Event horizon thesis, Hard takeoff, Soft takeoff

Three Singularity schools

Eliezer Yudkowsky has observed that the varying perspectives on the Singularity can be broadly split into three "major schools" - Accelerating Change (Ray Kurzweil), the Event Horizon (Vernor Vinge), and the Intelligence Explosion (I.J. Good).

The Accelerating Change School observes that, contrary to our intuitive linear expectations about the future, the rate of change of information technology grows exponentially. In the last 200 years, we have seen more technological revolutions than in the 20,000 years before that. Clear examples of this exponential growth include, but are not restricted to: Moore's law, Internet speed, gene sequencing, and the spatial resolution of brain scanning. By projecting these growth trends into the future, it becomes possible to imagine what will be possible to engineer. Ray Kurzweil specifically dates the Singularity as happening in 2045.
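As a toy illustration of the kind of exponential projection this school relies on, the sketch below extrapolates a capability that doubles at a fixed interval. The starting value and the two-year doubling time are illustrative assumptions (a Moore's-law-like rate), not Kurzweil's actual figures:

```python
# Toy exponential projection. The doubling time and starting value are
# illustrative assumptions only, not data from Kurzweil's analysis.

def project(start: float, doubling_years: float, years: float) -> float:
    """Project a capability forward assuming steady exponential growth."""
    return start * 2 ** (years / doubling_years)

# With a 2-year doubling time, 20 years of growth gives 2**10 = 1024x.
growth_20y = project(1.0, 2.0, 20.0)
print(growth_20y)  # 1024.0

# The same assumption over 40 years gives roughly a million-fold increase,
# which is why linear intuition badly underestimates exponential trends.
print(project(1.0, 2.0, 40.0))  # 1048576.0
```

The point of the sketch is only the shape of the curve: a modest, steady doubling rate produces changes over decades that dwarf anything linear extrapolation would suggest.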

The Event Horizon School asserts that for the entirety of Earth's history, all technological and social progress has been the product of the human mind. However, Vernor Vinge argues that technology will soon improve on human intelligence, whether via brain-computer interfaces, Artificial Intelligence, or both. Since a predictor must be at least as smart as the agent it predicts, once we create smarter-than-human agents, technological progress will be beyond the comprehension of anything a mere human can imagine now. Vinge called this point in time the Singularity.

The Intelligence Explosion School asserts that a positive feedback loop could be created in which an intelligence makes itself smarter, thus getting better at making itself even smarter. A strong version of this idea suggests that once the positive feedback starts to play a role, it will lead to a dramatic leap in capability very quickly. This scenario does not necessarily rely on a purely computational substrate: humans with computer-augmented brains, or genetically altered humans, could also be the means by which an Intelligence Explosion is engineered. It is this interpretation of the Singularity that Less Wrong broadly focuses on.
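A minimal sketch of the feedback loop this school describes: in each round, the system's gain in capability is proportional to its current capability, so smarter systems improve themselves faster. All the constants here are arbitrary toy assumptions, chosen only to show the qualitative shape of the curve:

```python
# Toy model of recursive self-improvement. Each round's capability gain
# is proportional to current capability; the rate and round count are
# arbitrary illustrative assumptions, not a claim about real AI systems.

def self_improve(capability: float, rate: float, rounds: int) -> list[float]:
    """Return the capability level after each round of self-improvement."""
    history = []
    for _ in range(rounds):
        capability += rate * capability  # smarter systems improve faster
        history.append(capability)
    return history

trajectory = self_improve(capability=1.0, rate=0.5, rounds=8)
# Growth compounds: each entry is 1.5x the previous one, so after
# 8 rounds capability is 1.5**8, roughly 25.6x the starting level.
print(trajectory[-1])
```

The design choice to make the gain proportional to current capability is what distinguishes this picture from ordinary technological progress: a constant improvement rate gives linear growth, while proportional improvement compounds.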

Chalmers' analysis

Philosopher David Chalmers published a significant analysis of the Singularity, focusing on intelligence explosions, in the Journal of Consciousness Studies. He performed a careful analysis of the main premises and arguments for the existence of the Singularity. According to him, the main argument is:

  • 1. There will be AI (before long, absent defeaters).
  • 2. If there is AI, there will be AI+ (soon after, absent defeaters).
  • 3. If there is AI+, there will be AI++ (soon after, absent defeaters).

—————-

  • 4. There will be AI++ (before too long, absent defeaters).
...
