raptortech97

Nash Equilibria and Schelling Points

So if all pirates implement TDT, what happens?

Infinite Certainty

Hrmm... I'm still taking high school geometry, so "infinite set of axioms" doesn't really make sense yet. I'll try to re-read that thread once I've started college-level math.

Mysterious Answers to Mysterious Questions

Are you suggesting that we apply a punishment to any theory that sounds wise? Or that we apply a punishment only for those that also satisfy (a)?

Infinite Certainty

I meant the claim posed here about tests and priors. It is stated as

p(A|X) = [p(X|A)*p(A)]/[p(X|A)*p(A) + p(X|~A)*p(~A)]

But does it make sense for that to be wrong? It is a theorem, unlike the statement 2+2=4. Maybe there is some way to show that the axioms and definitions used to prove Bayes' Theorem are inconsistent, which would be a pretty clear kind of disproof. I'm not sure anymore that what I said has meaning. Well, thanks for the help.
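The formula above is easy to check numerically. Here's a minimal Python sketch; the prior and likelihoods are made-up illustrative numbers, not anything from the original discussion:

```python
def posterior(p_a, p_x_given_a, p_x_given_not_a):
    """Posterior p(A|X) via Bayes' Theorem:
    p(A|X) = p(X|A)p(A) / (p(X|A)p(A) + p(X|~A)p(~A))
    """
    p_not_a = 1.0 - p_a
    numerator = p_x_given_a * p_a
    denominator = numerator + p_x_given_not_a * p_not_a
    return numerator / denominator

# Illustrative numbers: a test with 99% sensitivity and a 5% false-positive
# rate, applied to a hypothesis with a 1% prior.
print(posterior(0.01, 0.99, 0.05))  # ~0.1667, i.e. a positive test still leaves A unlikely
```

The point the numbers make is the standard one: even a strong test can't overcome a small prior on its own, which is exactly what the denominator's p(X|~A)·p(~A) term encodes.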

The Crackpot Offer

I don't remember ever coming up with a false disproof in math, though I did manage to "solve" perpetual motion machines. I did successfully prove a trivial result in solving quadratic equations in modular arithmetic.

Infinite Certainty

Eliezer, what could convince you that Bayes' Theorem itself was wrong? Can you properly adjust your beliefs to account for evidence if that adjustment is systematically wrong?

Belief in Self-Deception

> I benefit from believing people are nicer than they actually are.

I empathize with her here. I believe that it is to my advantage to act towards people the way I would act if they were nicer than they actually are. I'll try to parse that out. Let's say Alice is talking to Bob. Cindy, at a different time, also talks to Bob. Bob is a jerk; we assume he is not nice.

- Alice honestly expects that Bob is nicer than he actually is, and accordingly she is nice to Bob.
- Cindy honestly expects that Bob is exactly as nice as he actually is, and accordingly she is dismissive of Bob.

I expect that Bob will be nicer towards Alice than towards Cindy. (Warning: This is starting to feel like a belief, suggesting that it is actually a belief in belief.) My theory is that I should act like Alice. Of course, there are alternatives, like simply being nice to people.

I hope this comment made sense to you. I know I'm pretty confused about it myself now.

Dark Side Epistemology

Wait a second - the scientific method? How? It may not be the most efficient way to get the truth, and it may not take into account Bayes' theorem, which could speed it up, but I don't see how the scientific method is epistemologically (is that a word?) **wrong**.

Harry Potter and the Methods of Rationality discussion thread, part 15, chapter 84

Do not have the audience be part of the group being tested. Pull in confederates off the street, and tell them about the test. Do not allow subjects to see each other's testing. Let's say now that the current subject is Alex. Alex prefers vanilla ice cream to chocolate ice cream. Now go through the anti-conformity training.

After the training, hold a break (still with just Alex and the confederates). Offer ice cream in chocolate, vanilla, and, say, mango. Have most (maybe about 80%) of the confederates go for the chocolate, 10% for the vanilla, and 10% for the mango.

The mango should help to decrease the suspicion, as should having not everybody go for the chocolate. It may help to have the confederates go through the training as well, to decrease suspicion.

The problems I see with this are:

- Cost. This one I'll ignore, because it is a matter of practicality.
- The subject group is not the group conforming. This will decrease the likelihood of conforming.

The problem with having the subject group be the confederates is that then the subject group knows how the test is being done.

Ok so there's a good chance I'm just being an idiot here, but I feel like a many-worlds kind of interpretation serves well here. If, as you say, "the coin is deterministic, [and] in the overwhelming measure of the MWI worlds it gives the same outcome," then I don't believe the coin is fair. And if the coin isn't fair, then of course I'm not giving Omega any money. If, on the other hand, the coin is fair, and so I have reason to believe that in roughly half of the worlds the coin landed on the other side and Omega posed the opposite question, then by giving Omega the $100 I'm giving the me in those other worlds $1000 and I'm perfectly happy to do that.
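The arithmetic behind that intuition can be sketched in a few lines. This assumes the standard Counterfactual Mugging payoffs from the comment ($100 paid on one branch, $1000 received on the other) and a fair coin; the variable names are my own:

```python
# Expected value, summed over MWI branches, of the policy "always pay Omega":
# in half the worlds I pay $100, in the other half the counterpart-me gets $1000.
p_heads = 0.5
ev_pay = p_heads * (-100) + (1 - p_heads) * 1000  # 450.0
ev_refuse = 0.0  # never pay, never get paid

print(ev_pay, ev_pay > ev_refuse)
```

If the coin is biased toward the losing branch (the "overwhelming measure" case), p_heads approaches 1 and the expected value goes negative, which is exactly why the comment refuses to pay in that case.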