"Your denial of the importance of objectivity amounts to announcing your intention to lie to us. No-one should believe anything you say." -- John McCarthy
Gee--I wish I had this one when I was taking that anthropology professor's class!
"If you want to do good, work on the technology, not on getting power." - John McCarthy
Except for technologies with catastrophic potential (nanotech, biotech).
Maybe people have the idea that the line moves slowly, and that they can't cut in line. Thus if the front of the line gets past them, they have to wait until the entire line is gone before exiting.
You're probably right. But I can see a small benefit: we have become wary. There's still the possibility that someone will develop an effective defense system against the bomb. On the other hand, if we had never used the bomb, it would probably be less widely known and there would be the possibility of a sucker punch from a cult of mentally disturbed physicists.
"Morality" generally refers to guidelines on one of two things:
(1) Doing good to other sentients. (2) Ensuring that the future is nice.
If you wanted to make me stop caring about (1), you could convince me that all other sentients were computer simulations, different in kind from me, and that their emotions were simulated according to sophisticated computer models. In that case, I would probably continue to treat sentients as peers, because things would be a lot more boring if I started thinking of them as mere NPCs.
If you wanted to make me stop caring about (2), you could tell me that I was living in a computer simulation that would grant my every request (similar to the plot of this novel). If that were the case, I would set up sophisticated games for myself. Just taking the path of least resistance and maximizing momentary dopamine release would get boring quickly. (There's a reason why you see more kids eating candy than adults.) I would think carefully before I even experimented with maximizing dopamine release, since it would make everything else seem petty by comparison.
Either way, you would be ruining the secret to happiness:
"The secret of happiness is to find something more important than you are and dedicate your life to it." - Dan Dennet
@poke:
I imagine Eliezer is more interested in doing what works than avoiding criticism. And the real danger associated with creating a superhuman AI is that things would spiral out of control. That danger is still present if humanity is suddenly introduced to 24th century science.
I started reading the first chapter of Structure and Interpretation of Computer Programs and was reminded of this post.
A computational process is indeed much like a sorcerer's idea of a spirit. It cannot be seen or touched. It is not composed of matter at all. However, it is very real. It can perform intellectual work. It can answer questions. It can affect the world by disbursing money at a bank or by controlling a robot arm in a factory. The programs we use to conjure processes are like a sorcerer's spells. They are carefully composed from symbolic expressions in arcane and esoteric programming languages that prescribe the tasks we want our processes to perform.
Makes you want to learn to program, doesn't it?
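To make the quote concrete, here is a minimal sketch (in Python rather than SICP's Scheme, and only an illustration, not the book's code) of the square-root-by-Newton's-method example from SICP's first chapter: a few symbolic expressions that conjure a process which does real intellectual work.

# Newton's method for square roots, in the spirit of SICP section 1.1.7:
# start with a guess and repeatedly average it with x/guess until it is
# close enough to the true square root.
def sqrt(x, tolerance=1e-10):
    guess = 1.0
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2  # improve the guess
    return guess

print(sqrt(2))  # roughly 1.41421356...

Running it, the "spell" sets a process in motion that converges on an answer you never typed in anywhere.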
So, any ideas on how to become one of that incredibly tiny number of people who desperately want to learn?
Well, I guess I'm not talking about the learning process itself so much as what keeps you going. In a traditional school environment, grades are the de facto student motivator.
My old Creative Minds professor has plenty of anti-school arguments. But when he tried attending a school without grades, he learned that it sucked: many students didn't show up for class, and of the ones that did, the only ones who participated in classroom discussions were those who had strong opinions.
So my question is: when you're learning on your own, how do you find ways to motivate yourself? As I mentioned before, curiosity can be unreliable. Another technique is to think of what you're doing as special and unique, and to say to yourself, "Hardly anyone is teaching themselves using the direct, efficient methods that I'm using. I'm operating outside the system and learning things that very few others are learning. If I finish all the exercises in this book, I will be a Level 6 Probability Master."
The upside of this is that you're motivated to learn more. The downside is that it might make you arrogant.
I disagree with this one. If you scrupulously include every disclaimer and caveat, you'll be too boring for anyone to pay attention to. It's better to be pragmatic. Moving someone to an improved, though not maximally accurate, belief is still progress.
I propose that the author of this quote is placing a moral value on people possessing maximally accurate beliefs. If so, the author's moral system is incompatible with Standard Utilitarianism, is it not?