mariz

Comments


Rationality Lessons in the Game of Go

"Overconfidence leads to bad play."

As an avid chess player, I can tell you that this is true in chess as well. People look at brilliant sacrifices by grandmasters (Bobby Fischer was famous for this) and think they are brilliant enough to pull off the same thing. In my experience, 80-90% of material sacrifices fail. When they do succeed, it is usually because the sacrificer is considerably better than her opponent. I've learned to subdue my desire for a sacrifice and a quick strike, and to work on a long-term plan instead.

Attention Lurkers: Please say hi

I'll say hi, and I'll post this link, which describes a study showing that people are more likely to believe in pseudoscience if they are told that scientists disapprove of it:

http://www.alternet.org/module/printversion/146552

They are also much more likely to believe in pseudoscience if it has popular support.

A survey of anti-cryonics writing

I posted an argument here: http://lesswrong.com/lw/1mc/normal_cryonics/1i92

I didn't see a major criticism. There were some interesting responses and questions, like what constitutes a 5% increase in quality of life (I don't know; it's a crude metric), but my point stands. You're better off spending your money on marginal increases in quality of life with high probabilities of success than on cryonics.

Normal Cryonics

Here's a simple metric to demonstrate why alternatives to cryonics could be preferred:

Suppose we calculate the overall value of living as the quantity of life multiplied by the quality of life. For lack of a better metric, we can rate our quality of life from 1 to 100. Thus one really good year (quality = 100) is equal to 100 really bad years (quality = 1). If you think quality of life is more important, you can use a finer scale, like 1 to 1000. But for our purposes, let's use a scale of 1 to 100.

Some transhumanists have calculated that your life expectancy without aging is about 1,300 years (because there's still an annual probability that you will die from an accident, homicide, etc.). Conservatively, let's assume that if cryonics and revivification are successful, you can expect to live for another 1,000 years. Also, knowing nothing else about the future, assume your quality of life will be ~50. Thus your total life-index points gained is 50 × 1,000 = 50,000. But suppose that the probability that cryonics/revivification will be successful is 1 in 10,000, or .0001. Thus the expected utility points gained is .0001 × 50,000 = 5.

It will cost you $300/year for the rest of your life to gain those expected 5 points. But suppose you could spend that $300 a year on something that is 80% likely to increase your quality of life by 5 points a year (only 5% on our scale) for the rest of your life (let's say another 50 years). There are all kinds of things that could do that: vacations, games, lovers, whatever. That's .80 × 5 × 50 = 200 expected utility points.
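The comparison above reduces to one expected-value formula: probability of success times years times quality points per year. Here is a minimal sketch of that arithmetic; every number is the comment's own assumed figure, not an established estimate.

```python
def expected_life_points(prob_success, years, quality_points_per_year):
    """Expected life-index points: P(success) * years * quality (1-100 scale)."""
    return prob_success * years * quality_points_per_year

# Cryonics: a 1-in-10,000 chance of 1,000 extra years at quality ~50.
cryonics = expected_life_points(0.0001, 1000, 50)

# Alternative spending: 80% chance of +5 quality points per year
# over an assumed 50 remaining years of life.
alternative = expected_life_points(0.80, 50, 5)

print(cryonics)     # 5.0
print(alternative)  # 200.0
```

On these assumptions, the alternative dominates by a factor of 40; the conclusion is only as good as the assumed probabilities and scales.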

You're better off spending your money on things that are highly likely to increase your quality of life here and now than on things that are highly unlikely or unknown to increase your quantity and quality of life in the future.

Normal Cryonics

What is the calculated utility of signing up for cryonics? I've never seen a figure.

Normal Cryonics

Plus, suicide allows you to make a controlled exit and a controlled delivery into the cryopreserved state. You could die in a car accident, trapped in the wreckage for hours before they extract you, while your brain degenerates. You could be shot in the head. You could develop a neural disease or a brain tumor.

You just can't take these chances. The rational solution is suicide at an early age.

Normal Cryonics

Taking the cryonics mindset to its logical conclusion, the most "rational" thing to do is commit suicide at age 30 and have yourself cryopreserved. If you wait for a natural death at a ripe old age, there may be too much neural damage to reconstitute the mind/brain. And since you're destined to die anyway, isn't the loss of 50 years of life a rational trade-off for the minuscule chance of infinite life?

NO.

The Wannabe Rational

"So, yeah. I believe in God. I figure my particular beliefs are a little irrelevant at this point."

I think the particulars of your beliefs are important, because they reveal how irrational you might be. Most people get away with belief in God because it isn't immediately contradicted by experience. If you merely believe a special force permeates the universe, that's not testable and doesn't really affect your life. However, if you believe this force is intelligent and interacts with the world (causes miracles, led the Israelites out of Egypt, etc.), these are testable and falsifiable claims (for example, the Exodus should have left evidence of a large Semitic migration through the Sinai, but none exists), and believing them in light of disconfirmatory evidence makes you more irrational.

Because of this lack of testability, it's much easier to believe in vague gods than, for example, that your next lottery ticket will be a winner.
