Today's post, Nonsentient Optimizers, was originally published on 27 December 2008. A summary (taken from the LW wiki):

 

Discusses some of the problems of, and justification for, creating AIs that are knowably not conscious / sentient / people / citizens / subjective experiencers. We don't want the AI's models of people to be people; we don't want conscious minds trapped helplessly inside it. So we need a way to tell that something is definitely not a person, and in this case, maybe we would like the AI itself to not be a person, which would simplify a lot of ethical issues if we could pull it off. Creating a new intelligent species is not lightly to be undertaken from a purely ethical perspective; if you create a new kind of person, you have to make sure it leads a life worth living.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Nonperson Predicates, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


FWIW, the problem of simulations that constitute people got a mention in Iain M. Banks' The Hydrogen Sonata.

Also in John C. Wright's Golden Age trilogy, and the 1955 story The Tunnel Under the World by Frederik Pohl.

I don't recall the actual problem being mentioned in Tunnel Under The World. Of course, I don't have a text copy to hand ... quote?

It was a while ago that I read it, but there's this passage:

He had been to the factory once, with Barth; it had been a confusing and, in a way, a frightening experience. Barring a handful of executives and engineers, there wasn't a soul in the factory—that is, Burckhardt corrected himself, remembering what Barth had told him, not a living soul—just the machines.

According to Barth, each machine was controlled by a sort of computer which reproduced, in its electronic snarl, the actual memory and mind of a human being. It was an unpleasant thought. Barth, laughing, had assured him that there was no Frankenstein business of robbing graveyards and implanting brains in machines. It was only a matter, he said, of transferring a man's habit patterns from brain cells to vacuum-tube cells. It didn't hurt the man and it didn't make the machine into a monster.

But they made Burckhardt uncomfortable all the same.

I took the implication to be that Burckhardt buried his discomfort by dismissing the minds in the computers as nonsentient (or just refusing to think about their sentience), so that "real" humans were free to use them as they chose; thus making it karmic when it's revealed that the advertisers running the (admittedly physical) town regarded him the same way. It's not a pure example, though.

Yeah, that discomfort's a lot of the point of the story. I suppose it raises the issue in that way implicitly, though Hydrogen Sonata sets out the problem explicitly in detail.

[anonymous]:

Creating a new intelligent species is not lightly to be undertaken from a purely ethical perspective; if you create a new kind of person, you have to make sure it leads a life worth living.

The unspoken bias here is treating antinatalism, which is also an ethic, as though it were not one. Antinatalism says it is not ethical to create new life. It's fine to disagree, but false to say this ethic does not exist.

Antinatalism isn't the perspective that it's unnecessary to ask whether a new life is worth living, just the perspective that the answer is always "no." Eliezer isn't discounting this as a possible answer.