Extraterrestrial paperclip maximizers

According to The Sunday Times, a few months ago Stephen Hawking made a public pronouncement about aliens:

Hawking’s logic on aliens is, for him, unusually simple. The universe, he points out, has 100 billion galaxies, each containing hundreds of millions of stars. In such a big place, Earth is unlikely to be the only planet where life has evolved.

“To my mathematical brain, the numbers alone make thinking about aliens perfectly rational,” he said. “The real challenge is to work out what aliens might actually be like.”

He suggests that aliens might simply raid Earth for its resources and then move on: “We only have to look at ourselves to see how intelligent life might develop into something we wouldn’t want to meet. I imagine they might exist in massive ships, having used up all the resources from their home planet. Such advanced aliens would perhaps become nomads, looking to conquer and colonise whatever planets they can reach.”

He concludes that trying to make contact with alien races is “a little too risky”. He said: “If aliens ever visit us, I think the outcome would be much as when Christopher Columbus first landed in America, which didn’t turn out very well for the Native Americans.”

Though Stephen Hawking is a great scientist, it's difficult to take this particular announcement seriously. As far as I know, Hawking has not published any detailed explanation of why he believes that contacting alien races is risky. The most plausible interpretation is that the announcement was made to get attention and entertain people rather than to reduce existential risk.

I was recently complaining to a friend about Stephen Hawking's remark as an example of a popular scientist misleading the public. My friend pointed out that a sophisticated version of the concern that Hawking expressed may be justified. This is probably not what Hawking had in mind in making his announcement, but it is of independent interest.

Anthropomorphic Invaders vs. Paperclip Maximizer Invaders

From what he says, it appears that Hawking has an anthropomorphic notion of "alien" in mind. My feeling is that if human civilization advances to the point where we can explore outer space in earnest, it will be because humans have become much more cooperative and pluralistic than we are at present. I don't imagine such humans behaving toward extraterrestrials the way that the Europeans who colonized America behaved toward the Native Americans. By analogy, I don't think that anthropomorphic aliens that developed to the point of being able to travel to Earth would be interested in performing a hostile takeover of it.

And even ignoring the ethics of a hostile takeover, it seems naive to imagine that an anthropomorphic alien civilization which had advanced to the point of acquiring the (very considerable!) resources necessary to travel to Earth would have enough interest in the resources on Earth in particular to travel all the way here to colonize it and acquire them.

But as Eliezer has pointed out in Humans In Funny Suits, we should be wary of irrationally anthropomorphizing aliens. Even if there's a tendency for intelligent life on other planets to be sort of like humans, such intelligent life may (whether intentionally or inadvertently) create a really powerful optimization process. Such an optimization process could very well be a (figurative) paperclip maximizer. Such an entity would take a special interest in Earth, not because it wants Earth's resources in particular, but because Earth has intelligent lifeforms which may eventually thwart its ends. For a whimsical example, if humans built a (literal) staple maximizer, this would pose a very serious threat to a (literal) paperclip maximizer.

The sign of the expected value of Active SETI

It would be very bad if Active SETI led an extraterrestrial paperclip maximizer to travel to Earth to destroy intelligent life here. Is there enough of an upside to Active SETI to justify it anyway?

Certainly it would be great to have friendly extraterrestrials visit us and help us solve our problems. But there seems to be no reason to believe that our signals are more likely to reach friendly extraterrestrials than unfriendly ones. Moreover, there is a strong asymmetry between the positive value of contacting friendly extraterrestrials and the negative value of contacting unfriendly ones. Radio signals already take years to centuries to cross interstellar distances, and physical travel across the same distance would plausibly take orders of magnitude longer. So even if we successfully communicated with friendly extraterrestrials now, by the time they had a chance to help us we'd already be extinct or have solved our biggest problems ourselves. By contrast, communicating with unfriendly extraterrestrials is a serious existential risk regardless of how long it takes them to receive the message and react.
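To make the timing asymmetry concrete, here is a toy calculation. The distance and ship speed are invented placeholders, not estimates:

```python
# Toy timing comparison: all numbers are hypothetical placeholders, not estimates.
distance_ly = 100             # assumed distance to the nearest communicating civilization
ship_speed_fraction_c = 0.01  # assumed ship speed: 1% of light speed

signal_years = distance_ly / 1.0                  # radio signals travel at light speed
ship_years = distance_ly / ship_speed_fraction_c  # physical travel is far slower

# A friendly rescue needs our signal to arrive AND a ship to make the trip.
help_arrival_years = signal_years + ship_years

print(signal_years)        # 100.0
print(ship_years)          # 10000.0
print(help_arrival_years)  # 10100.0
```

Under these (assumed) numbers, friendly help could not arrive for roughly ten thousand years, whereas the harm from alerting an unfriendly optimizer is not diminished by the same delay.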

In light of this, I presently believe that the expected value of Active SETI is negative. So if I could push a button to stop Active SETI until further notice, I would.
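As a toy illustration of why the sign comes out negative under this asymmetry, consider the following sketch. Every number is an invented placeholder; the point is only the sign, not the magnitude:

```python
# Toy expected-value calculation: every number here is an invented placeholder,
# not an estimate -- the point is only the sign, given the asymmetry above.
p_friendly = 1e-9    # assumed chance a signal reaches friendly aliens who respond
p_unfriendly = 1e-9  # assumed equal chance it reaches unfriendly aliens
v_friendly = 0.0     # help arrives too late to matter, so the upside rounds to zero
v_unfriendly = -1.0  # loss of human civilization, normalized to -1

expected_value = p_friendly * v_friendly + p_unfriendly * v_unfriendly
print(expected_value < 0)  # True: equal odds plus asymmetric stakes give a negative EV
```

With equal contact probabilities, the sign is driven entirely by the asymmetric payoffs: any positive value for friendly contact smaller than the (normalized) loss from unfriendly contact leaves the total negative.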

The magnitude of the expected value of Active SETI and implications for action

What's the probability that continuing to send signals into space will result in the demise of human civilization at the hands of unfriendly aliens? I have no idea; my belief on this matter is subject to very volatile change. Is it worth it for me to expend time and energy analyzing this issue further and advocating against Active SETI? I'm not sure. All I can say is that I used to think that thinking and talking about aliens was not, at present, a productive use of time, and the above considerations have made me less certain of this. So I decided to write the present article.

At present I think that a probability of 10^-9 or higher would warrant some effort to spread the word, whereas if the probability is substantially lower than 10^-9 then this issue should be ignored in favor of other existential risks.

I'd welcome any well-considered feedback on this matter.

Relevance to the Fermi Paradox

The Wikipedia page on the Fermi Paradox references

the Great Silence — even if travel is hard, if life is common, why don't we detect their radio transmissions?

The possibility of extraterrestrial paperclip maximizers, together with the apparent asymmetry between the upside of contact with friendly aliens and the downside of contact with unfriendly aliens, suggests that the reason for the Great Silence may be that intelligent aliens have deemed it dangerous to communicate.
