Today's post, Eliezer's Meta-Level Determinism, was originally published on June 23, 2008. A summary:

Eliezer has suggested that the most dramatic changes in the history of life are the result of events that changed the optimization process at work. This is experimentally testable, and the predictions it offers seem to be false.

Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was AI Go Foom, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

1 comment

On the writing/farming debate, it seems to me that the debate rather misses the point, and for good reason, since Eliezer missed his own point a bit. One commenter identified the key issue: the ability to retain information is the important part. Writing is just a significant improvement in that capability.

Even so, would you expect it to cause an immediate kink? The more meta levels down you go, the higher the order of the derivative you're affecting, which increases the time you'd expect the effects to take to percolate up to the object level.
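
To make that intuition concrete, here is a toy model (my own sketch, not something from the original discussion): suppose a trajectory $x(t)$ receives a step change of size $\Delta$ in its $k$-th derivative at $t = 0$, with all lower derivatives initially unchanged. Integrating $k$ times, the object-level deviation from the baseline trajectory is

$$x(t) - x_{\text{baseline}}(t) = \frac{\Delta \, t^k}{k!},$$

so a fixed detection threshold $\epsilon$ is crossed only at $t \approx (k!\,\epsilon/\Delta)^{1/k}$. For small $\epsilon/\Delta$ this crossing time grows with $k$: the deeper the meta level (the higher $k$), the longer before the change shows up as a visible kink at the object level.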