The safest investment is Treasury Inflation-Protected Securities (TIPS). Ordinary investors should avoid investing in derivative securities such as options. If you are rationally pessimistic, go with TIPS.

Also, you would never get the 1/100 odds because, in a sense, money is more valuable in the state in which the economy is doing poorly. Say there are two bonds, each of which in 30 years has a 99% chance of paying $0 and a 1% chance of paying $1,000. The first bond pays off in a state in which the economy has done very poorly, the second in a state in which the economy has done OK. The first bond will cost a lot more than the second.
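As a rough sketch of that pricing intuition, here is a two-state toy calculation; the probabilities come from the example above, but the marginal-utility weights are made-up numbers and discounting over the 30 years is ignored:

```python
# Toy two-state pricing sketch (illustrative numbers; 30-year discounting ignored).
# A payoff is worth more when it arrives in a state where money has high
# marginal utility, i.e. when the economy has done poorly.

payoff = 1_000        # both bonds pay $1,000 with 1% probability, else $0
prob_payout = 0.01

mu_bad_economy = 2.0  # hypothetical marginal utility of a dollar in a bad economy
mu_ok_economy = 0.8   # hypothetical marginal utility of a dollar in an OK economy

# Bond 1 pays off only in the state where the economy has done very poorly.
price_bond_1 = prob_payout * payoff * mu_bad_economy
# Bond 2 pays off only in the state where the economy has done OK.
price_bond_2 = prob_payout * payoff * mu_ok_economy

print(price_bond_1, price_bond_2)  # 20.0 vs 8.0 -- the "bad economy" bond costs more
```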

If you do want to play with derivative securities, just maintain a short position in the S&P 500. If you think the decline will be gradual rather than all at once, you could just keep buying short-term put options on the S&P 500. As the market declines you will gain wealth which you could use to increase your short position.
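To make the put-option idea concrete, here is a minimal payoff sketch; the strike, index level, and premium are hypothetical numbers chosen only for illustration:

```python
# Profit at expiry from holding one put option (hypothetical numbers).
def put_profit(strike: float, index_level: float, premium: float) -> float:
    """A put pays max(strike - index level, 0) at expiry; subtract the premium paid."""
    return max(strike - index_level, 0.0) - premium

# If the index falls from 1000 to 900, a put struck at 1000 bought for 20 nets 80.
print(put_profit(strike=1000.0, index_level=900.0, premium=20.0))   # 80.0
# If the index does not fall, the put expires worthless and you lose the premium.
print(put_profit(strike=1000.0, index_level=1050.0, premium=20.0))  # -20.0
```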

If you are really, really pessimistic spend your money stocking up on canned goods and guns.

Doug S.

I'm interested in learning more about extremely early readers. I would be grateful if you contacted me at

EconomicProf@Yahoo.com

High functioning autism might in part be caused by an "overclocking" of the brain.

My evidence:

(1) Autistic children have, on average, larger brains than neurotypical children do.
(2) High-IQ parents are more likely than average to have autistic children.
(3) An extremely disproportionate number of mathematical geniuses have been autistic.
(4) Some children learn to read before they are 2.5 years old. From what I know, all of these early readers turn out to be autistic.

Eliezer-

“What justifies the right of your past self to exert coercive control over your future self? There may be overlap of interests, which is one of the typical de facto criteria for coercive intervention; but can your past self have an epistemic vantage point over your future self?”

In general I agree. But werewolf contracts protect against temporary lapses in rationality. My level of rationality varies. Even assuming that I remain in good health for eternity there will almost certainly exist some hour in the future in which my rationality is much lower than it is today. My current self, therefore, will almost certainly have an “epistemic vantage point over [at least a small part of my] future self.” Given that I could cause great harm to myself in a very short period of time I am willing to significantly reduce my freedom in return for protecting myself against future temporary irrationality.

Having my past self exert coercive control over my future self will reduce my future information costs. For example, when you download something from the web you must often agree to a long list of conditions. Under current law, if these terms and conditions included something like "you must give Microsoft all of your wealth," the term wouldn't be enforced. If the law did enforce such terms, then you would have to spend a lot of time examining the terms of everything you agreed to. You would be much better off if your past self prevented your current self from giving away too much in the fine print of agreements.

“If you constrain the contracts that can be written, then clearly you have an idea of good or bad mindstates apart from the raw contract law, and someone is bound to ask why you don't outlaw the bad mindstates directly.”

The set of possible future mindstate/worldstate combinations is very large. It's too difficult to figure out in advance which combinations are bad. It's much more practical to sign a Werewolf contract which gives your guardian the ability to look at the mindstate/worldstate you are in and then decide if you should be forced to move to a different mindstate.

“why force Phaethon to sacrifice his pride, by putting him in that environment?”

Phaethon placed greater weight on freedom than pride and your type of paternalism would reduce his freedom.

But in general I agree that if most humans alive today were put in the Golden Age world, many would do great harm to themselves, and in such a world I would prefer that the Sophotechs exercise some paternalism. But if such paternalism didn't exist, then Werewolf contracts would greatly reduce the type of harm you refer to.

ShardPhoenix wrote "Doesn't the choice of a perfect external regulator amount to the same thing as directly imposing restrictions on yourself, thereby going back to the original problem?"

No, because if there are many possible future states of the world it wouldn't be practical for you to specify in advance what restrictions you would face in every possible future state. It's much more practical for you to appoint a guardian who will make decisions after it has observed what state of the world has come to pass. Also, you might pick a regulator who would impose different restrictions on you than you would impose on yourself if you acted without a regulator.

ShardPhoenix also wrote "Another way to do it might be to create many copies of yourself (I'm assuming this scenario takes place inside a computer) and let majority (or 2/3s majority or etc) rule when it comes to 'rescuing' copies that have made un-self-recoverable errors."

Good idea except in the Golden Age World these copies would become free individuals who could modify themselves. You would also be financially responsible for all of these copies until they became adults.

You are forgetting about "Werewolf Contracts" in the Golden Age. Under these contracts you can appoint someone who can "use force, if necessary, to keep the subscribing party away from addictions, bad nanomachines, bad dreams or other self-imposed mental alterations."

If you sign such a contract then, unlike what you wrote, it's not true that "one moment of weakness is enough to betray you."

Non-lawyers often believe that lawyers and judges believe that laws and contracts should be interpreted literally.

"Eliezer, I'd advise no sudden moves; think very carefully before doing anything."

But about 100 people die every minute!
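A back-of-the-envelope check on that figure, assuming roughly 55-60 million deaths worldwide per year (that yearly figure is an assumption, not something from the original comment):

```python
# Rough check of "about 100 people die every minute".
deaths_per_year = 57_000_000          # assumed ballpark for worldwide deaths per year
minutes_per_year = 365 * 24 * 60      # 525,600 minutes in a year
print(deaths_per_year / minutes_per_year)  # roughly 108 deaths per minute
```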

I have signed up with Alcor. When I suggest to other people that they should sign up, the common response has been that they wouldn't want to be brought back to life after they died.

I don't understand this response. I'm almost certain that if most of these people found out they had cancer and would die unless they got a treatment, and (1) with the treatment they would have only a 20% chance of survival, (2) the treatment would be very painful, (3) the treatment would be very expensive, and (4) if the treatment worked they would be unhealthy for the rest of their lives, then almost all of these cryonics rejectors would take the treatment.

One of the primary costs of cryonics is the "you seem insane" tax one has to pay if people find out you have signed up. Posts like this will hopefully reduce the cryonics insanity tax.

You and Robin seem to be focused on different time periods. Robin is claiming that after ems are created one group probably won't get a dominant position. You are saying that post-singularity (or at least post one day before the singularity) there will be either one dominant group or a high likelihood of total war. You are not in conflict if there is a large time gap between when we first have ems and when there is a singularity.

I wrote in this post that such a gap is likely: http://www.overcomingbias.com/2008/11/billion-dollar.html
