From what I understood reading the Wikipedia article on Bayesian probability, and inferring from how he writes (correct me if I'm wrong), Eliezer is talking about your "subjective probability." You are a being, you have consciousness, and you interpret input as information. Given a lot of this information, you've formed an idea that 7 is prime. You've also formed the ideas that other people exist and that the sky is blue, which likewise have a high subjective probability in your mind, because you have a lot of direct information sustaining those beliefs.

Moreover, if you've ever been wrong before, hopefully you've noticed that you have been wrong before. That's a piece of information: "you are sometimes wrong about things you are very sure of." So, if you apply this information to your calculation of the probability of the idea that "7 is prime," you still end up with a high probability, but not 1.

Now, you might not think that "you are sometimes wrong about things you are sure of" applies to every single subject, such as primeness. But what if you had the information that other smart humans have, at some point in the past, incorrectly judged the primeness of a number (the anecdote)? You might state that "human beings are sometimes wrong about the primeness of a number" and "I am a human being." Again, if you include that information in your calculation of the probability that "7 is prime" is true, you end up with a high probability, but not 1.
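The update described above can be sketched with made-up numbers. All of the figures below (the prior, and how often I feel certain when right or wrong) are purely illustrative assumptions, not anything from Eliezer:

```python
# Toy Bayesian update: why "I am sometimes wrong" keeps the posterior below 1.
# Every number here is an illustrative assumption.

prior = 0.5                 # prior that an arbitrary primeness claim is true
p_sure_given_true = 0.999   # how often I feel certain when the claim IS true
p_sure_given_false = 0.001  # how often I feel certain when it's actually false

# Bayes' rule: P(claim true | I feel certain)
posterior = (p_sure_given_true * prior) / (
    p_sure_given_true * prior + p_sure_given_false * (1 - prior)
)

print(posterior)  # 0.999 -- high, but not 1
```

As long as p_sure_given_false is nonzero (that is, you have ever once been confidently wrong), the posterior can never reach 1.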

(Oh, but what if you didn't make the statement "human beings are sometimes wrong about the primeness of a number," but instead, "this idiot is sometimes wrong about the primeness of a number, but I never am"? Well, you can. That's one big problem with Bayesian subjective probabilities: how do we generalize? How can we formalize it so that two people with the same information deterministically arrive at the same probability? Logical (or objective epistemic) probability attempts to answer these questions.)

So, you're right that it is just "a single person" getting it wrong, that his certainty was incorrect. But that's Eliezer's point. We are not supreme beings lording over all of reality; we are humans who have memorized some information from the past and made some generalizations, including the generalization that sometimes our generalizations are wrong.


I'm sorry. Eliezer, can you please explain what you mean when you say how certain you are (as a probability) that something is true? I've studied a lot of statistics, but I really have no idea what you mean.

If I say that this fair coin in my hand has a 50% chance of coming up heads, that means that if I flip it many times, it'll come up heads 50% of the time. I can do that with a lot of real, measurable things.
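That frequentist reading can be checked directly by simulation (a toy sketch; the seed and flip count are arbitrary choices of mine):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

n_flips = 100_000
# Simulate a fair coin: each flip is heads with probability 0.5.
heads = sum(random.random() < 0.5 for _ in range(n_flips))
freq = heads / n_flips

print(freq)  # close to 0.5
```

The long-run frequency settles near 0.5; the question above is what number plays that role for a one-off belief like "7 is prime."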

So, what do you mean by, you are 99% certain of something?


"""But if you think you would totally wear that clown suit, then don't be too proud of that either! It just means that you need to make an effort in the opposite direction to avoid dissenting too easily. That's what I have to do, to correct for my own nature."""

I know exactly what you mean. I often find myself dissenting from the majority. Unfortunately, it is difficult to tell whether I do so because I am Right, or because I want to be Different.

Sure, I can use logic. But how do I know I am being a Rationalist, rather than just Rationalizing? It's easy to make up arguments (even coherently logical ones) to support incorrect conclusions. Look at economists.