Crazy philosopher

17 years old. I'm interested in AI alignment, rationality & philosophy, economics, and politics.

Comments

The Anthropic Trilemma
Crazy philosopher · 9h

If one of your future selves will see red, and one of your future selves will see green, then (it seems) you should anticipate seeing red or green when you wake up with 50% probability.

...

Program your computational environment to, if you win, make a trillion copies of yourself, and wake them up for ten seconds, long enough to experience winning the lottery.  Then suspend the programs, merge them again, and start the result.  If you don't win the lottery, then just wake up automatically.

No. You won’t see yourself winning the lottery when you wake up.

There is the real you. You may create copies of yourself, but they are still just copies.

Let’s suppose Eliezer starts this experiment: the universe splits into 10,000,000 copies, and in one of them he wins and creates a trillion copies of himself.

So, there are 10,000,000 actual Eliezers — most of whom are in universes where he didn’t win — but there are also a huge number of copies of the winning Eliezer.

Or, if Eliezer’s consciousness continues only into one future universe and doesn’t somehow split 10,000,000 ways, then in most of the universes where his consciousness could go, he didn’t win the lottery.

Since your clones are not you, and you don’t feel what your clones feel, I don’t think the number of clones created really matters.

You might think that there could be many versions of you sharing your consciousness — that they are all “you” — but consciousness is a result of physical processes. So I don’t think it can teleport from one dimension to another. Therefore, since most of the real/original Eliezers exist in universes where he lost, he would wake up to find that he lost.
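To make the disagreement concrete, here is a rough sketch of the two anticipation rules being contrasted, using the numbers above (a 1-in-10,000,000 win and a trillion copies of the winner). The notation is mine, not Eliezer's, and this is only a back-of-the-envelope illustration:

$$P_{\text{count every instance}}(\text{wake up a winner}) \approx \frac{10^{12}}{10^{12} + (10^{7} - 1)} \approx 0.99999$$

$$P_{\text{count only originals}}(\text{wake up a winner}) = \frac{1}{10^{7}} = 0.0000001$$

The claim in this comment is that the second number is the one to anticipate, because the trillion copies do not add any more original Eliezers to the count.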

Hospitalization: A Review
Crazy philosopher · 11d

You should sign up for cryonics. We all should, actually.

Do One New Thing A Day To Solve Your Problems
Crazy philosopher · 13d

I implemented this two days ago, and I’m already seeing incredible results in some areas! Not much progress in productivity, since I had already tried most of these ideas before, but I did experiment with buying different products for my lunch, found something better, and also made progress on something that takes a lot of context to explain.

(This comment doesn’t share any new important information about the technique, but it’s still important to write comments like this to support authors. It’s hard to keep creating when all you hear is criticism.

I think of comments like this as a kind of reward for good behavior, in the behaviorist sense.)

Please don't throw your mind away
Crazy philosopher · 1mo

I'm curious about useful topics and uncurious about useless ones. I find that better than your proposition.

Felix Moses's Shortform
Crazy philosopher · 1mo

So, once a community reaches 240 members, it will have an incentive not to grow, to avoid drama and the decrease in efficiency that comes with scale? How would you prevent that?

Why did everything take so long?
Crazy philosopher · 1mo

Excellent point!

Why Every Politician Thinks They’re So Right (and Why That’s a Disaster)
Crazy philosopher · 2mo

I don’t mean that the probability is always 50/50. But it’s not 100% either.

In Europe, the smartest people believed in God for centuries, and they saw endless confirmations of that belief. And then—bam! It turned out they were simply all wrong.

Or take any case of ancient medicine. European doctors believed for centuries that bloodletting cured everything, while Chinese doctors believed that eating lead prolonged life.
There are also other examples where all the experts were wrong: geocentrism, the ether theory, the idea that mice spontaneously generate in dirty laundry, the miasma theory of disease…

In all these cases it was either about cognitive biases (God, medicine) or about lack of information or broken public discussion (geocentrism).

Today we fight biases much better than a thousand years ago, but we’re still far from perfect.

And we still sometimes operate under very limited information.

I think one should have fundamental rational habits that protect against being so sure about God or bloodletting. That’s why, from any conclusion I make, I subtract a few percentage points of confidence. The more complex the conclusion, and the more speculative my reasoning or the more vulnerable it is to biases, the more I subtract.
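As a minimal illustration of that discount, with made-up numbers of my own rather than anything from the original comment:

$$P_{\text{adjusted}} = P_{\text{naive}} - \delta, \qquad \text{e.g.}\ 0.95 - 0.02 = 0.93 \text{ for a simple conclusion, but } 0.95 - 0.10 = 0.85 \text{ for a long, bias-prone chain of reasoning.}$$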

If you claim that my way of fighting this overconfidence shouldn’t be used, I’d want you to suggest something else instead. Because you can’t just leave it as it is—otherwise one might assign 99% confidence to some nonsense.

Why Every Politician Thinks They’re So Right (and Why That’s a Disaster)
Crazy philosopher · 2mo

Interesting model. You're probably right; I hadn't considered this because my friends and I are not idiots.

Posts

Why Every Politician Thinks They’re So Right (and Why That’s a Disaster) · 2mo
Reflections on anthropic principle · 4mo
Hedonic adaptation: you should not seeks pleasure · 5mo
Your memory eventually drives confidence in each hypothesis to 1 or 0 · 1y
Joint mandatory donation as a way to increase the number of donations · 1y
Regularly meta-optimization · 1y
Why write down the basics of logic if they are so evident? · 1y