The pain of a crisis is in having to deal with it in the first place. What you are proposing is to deal with a crisis before it occurs, in order to prepare ourselves better for it. This means incurring the pain up front even though there is a low probability that the crisis will ever occur. Now multiply this by the number of different crises - New Orleans, after all, is different from Bear Stearns - and we might spend all of our spare time preparing for low-likelihood catastrophes.

Perhaps you'll argue that not everyone needs to prepare, only a few people need to do it on everyone's behalf, but I don't think that's true. Preparedness has to be pervasive, or else we're not prepared.

With the last quotation especially, this is ceasing to be "Rationality Quotes" and is beginning to be "Idealist Quotes".

Eliezer: how does this square with Robin's recent post, What Belief Conformity?

He quoted:

"physicists and mathematicians perform best in terms of "rationality" (i.e. performance according to theory) and psychologists worst. However, since "rational" behavior is only profitable when other subjects also behave rationally ... the ranking in terms of profits is just the opposite: psychologists are best and physicists are worst."

It seems to me that the mind is just a generator of random ideas based on things experienced recently, where the ideas are checked by various layers for how much sense they make, and passed to the conscious mind only once they have cleared some filters. In essence, our thinking process combines semi-random idea generation and tweaking with validation and testing.

There seems to be no reason why the same could not be implemented in a machine. People who argue that machines cannot do random stuff have apparently never dealt with cryptography.
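As a toy sketch of that generate-and-test loop (purely illustrative; the experience pool, the filter, and the function names are all invented for the example, not a model of actual cognition):

```python
import random

def generate_candidate(recent_experience):
    """Semi-randomly recombine recently experienced ideas into a new one."""
    return " + ".join(random.sample(recent_experience, k=2))

def passes_filters(idea, filters):
    """An idea reaches the 'conscious mind' only after clearing every layer."""
    return all(check(idea) for check in filters)

def think(recent_experience, filters, attempts=100):
    """Generate-and-test: semi-random idea generation, then layered validation."""
    for _ in range(attempts):
        idea = generate_candidate(recent_experience)
        if passes_filters(idea, filters):
            yield idea

# Toy filter layer: only ideas that mention "crisis" make it through.
experiences = ["crisis", "preparedness", "power", "altruism"]
sanity_filters = [lambda idea: "crisis" in idea]
for thought in sorted(set(think(experiences, sanity_filters))):
    print(thought)
```

(On the cryptography point: Python's `secrets` module, unlike `random`, draws on the operating system's cryptographically strong entropy source, which is exactly the sense in which machines are perfectly capable of doing random stuff.)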

Eliezer: if the "ethical override" differs from culture to culture, and some people don't even have it, what's universal about it?

I'm not saying the phenomenon does not exist, but calling it an "ethical override" seems a misnomer. It might be more accurate to regard it as a form of hypnosis. If you're familiar with how hypnosis works, this looks like the environment impressing on you, as a child, that certain arbitrary things should or should not be done. Since such instructions generally relay accumulated knowledge which one cannot earn or safely test within one's own lifetime, it increases an individual's genetic fitness to heed them, i.e. to be "hypnotizable".

Eliezer: Are you really that sure that the ethical impulse you speak of is due to nature?

I am probably not alone in suggesting that it is due to nurture. It may seem to you that the ethical override is as hard-wired in you as hunger or thirst, but what is actually hard-wired may not be an ethical override at all. It may be a listen-to-your-parents override.

It is kind of peculiar, is it not, that ethical overrides such as you describe seem to be common among people who began their lives in religion, but not quite as common, and not quite as overriding, in people who did not? Contrast the principled attitudes of uptight religious people with those of people raised without stories of hell and damnation to scare them. Which type of person can be expected to avoid sex until they're married? And why? Because of a hard-wired ethical override? Or because evolution taught us that if our parents tell us not to eat certain berries, we should not, or we will die?

I don't think that the ethical override you speak of is nearly as common as you claim. You only need to venture into a suitable part of Africa, where your head will be removed for the slightest of reasons, or into communities which raise their children in ways quite dissimilar to how Catholic or Jewish children are raised.

Many of us have the ethical override because we are designed to internalize, on pain of death, the serious lessons taught by our environments. Remove the environmental lesson, and the ethical override disappears.

Eliezer: I don't get your altruism. Why not grab the crown? All things being equal, a future where you get to control things is preferable to a future where you don't, regardless of your inclinations.

Even if altruistic goals are important to you, it would seem like you'd have better chances of achieving them if you had more power.

Unless, I guess, you judge that the activities needed to keep power, and to remain alive while under increased threat, would be too much of an obstacle to your other goals.

The only valid reason I see not to grab power is a selfish one: that it would get you into a mess you don't really need or want. Which seems likely to be the case. But then this is a selfish motivation, not an altruistic one.

Vladimir Nesov: thanks for your comment. I found it insightful.
