A distinction I don't see made often enough is between what I call randomness and ignorance. Roughly, every expression of uncertainty is either about "where in the universe am I?" or "what is the universe like?" (or both). The former is the domain of randomness, the latter of ignorance.

Suppose you roll a die. You know that you're in a situation where you've just rolled a die, and that, in roughly 1/6th of the situations where one has just rolled a die, the die will come up a three. Thus, your uncertainty about the die roll is random.

Suppose you're wondering whether or not an omnipotent and immortal being exists. Whatever the answer is, it is the same every time someone asks this question. Thus there is no randomness involved, but you are ignorant of what the answer is (though you might have a hunch).

Often, your uncertainty will have components of both. Suppose someone hands you a die, you roll it five times, and every time you roll a three. You can probably guess that the die is biased. But why? One way to answer this question is that, in most of the situations where one is handed a die, rolls it five times, and gets the same number all five times, the die is biased. You are ignorant of exactly how often this is the case, though. It could be the case in 9 out of 10 such situations, or perhaps in only 8 out of 10. Now suppose you knew that it was 9 out of 10. Then it would still be random whether you are in one of those 9 cases, or in the tenth.
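To make this concrete, here is a minimal sketch of the underlying update (my own illustration, not from the post), assuming a toy two-hypothesis model: a "biased" die shows a three half the time, a fair die one sixth of the time, and one die in a hundred handed to you is biased. All three numbers are made up.

```python
# Toy two-hypothesis Bayesian update: how likely is the die to be biased
# after rolling num_threes consecutive threes?
# Assumed (illustrative) numbers: a biased die shows a three with probability
# 1/2, a fair die with probability 1/6, and the prior probability that a die
# handed to you is biased is 1%.

def posterior_biased(num_threes, prior=0.01, p_three_biased=0.5, p_three_fair=1/6):
    """P(die is biased | num_threes consecutive threes were rolled)."""
    likelihood_biased = p_three_biased ** num_threes
    likelihood_fair = p_three_fair ** num_threes
    evidence = prior * likelihood_biased + (1 - prior) * likelihood_fair
    return prior * likelihood_biased / evidence

print(posterior_biased(5))  # roughly 0.71 with these made-up numbers
```

With different assumed numbers the posterior could land near the 9-out-of-10 or 8-out-of-10 figures above; the point is only that the "how often is such a die biased" number is something you are ignorant of, while which of those cases you are in remains random.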

Next up: Reference Classes for Randomness

5 comments

It could be argued that it's all ignorance. The die will roll the way that physics demands, based on the velocity, roll, pitch, yaw of the die, and the surface properties of the felt. There's only one possible outcome, you just don't know it yet. If you roll a die in an opaque cup, the uncertainty does not change in kind from the time you start shaking it to the time you slam it down - it's all the same ignorance until you actually look.

You can, if you like, believe that there is unknowability at the quantum level, but even that doesn't imply true randomness, just ignorance of which branch you'll find your perceptive trail following.

Luckily (heh), Bayes' Theorem doesn't care. It works for updating predictions on evidence, regardless of where uncertainty comes from.

I'm not going to argue for unknowability at the quantum level, but I will argue (in the next post) that you are not sufficiently smart to differentiate precisely enough between the different possible situations; that's why you have to group a bunch of different situations together, and that's how you get what I call randomness. I'm not arguing for or against any kind of "true" randomness. I agree that you can argue it's all ignorance, but (I claim) not doing so will solve a lot of problems.

TAG:

It could be argued that it’s all ignorance. The die will roll the way that physics demands, based on the velocity, roll, pitch, yaw of the die, and the surface properties of the felt.

Assuming physics is deterministic, which is not known to be the case.

You can, if you like, believe that there is unknowability at the quantum level, but even that doesn’t imply true randomness, just ignorance of which branch you’ll find your perceptive trail following

Assuming MWI is the correct interpretation of QM, which is also not known to be the case.

This is aleatory (inherent randomness) vs. epistemic (limited knowledge) uncertainty. You can parse it as uncertainty inherent in the data-generating process itself vs. uncertainty in your estimates of its parameters, or in the parameterization of the model.
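As a concrete illustration of that split (my own sketch, not part of the comment), consider a Beta-Binomial model of a possibly biased coin: the spread of the posterior over the bias parameter is the epistemic part and shrinks as data comes in, while the flip-to-flip variability given the bias is the aleatory part and does not.

```python
# Epistemic vs. aleatory uncertainty in a Beta-Binomial coin model (sketch).
# Epistemic: the posterior standard deviation of the bias p, which shrinks
# with more observed flips. Aleatory: the per-flip variance p*(1-p), which
# remains even if p were known exactly (evaluated here at the posterior mean).
import math

def posterior_stats(heads, tails, alpha=1.0, beta=1.0):
    """Stats of the Beta(alpha+heads, beta+tails) posterior over the bias p."""
    a, b = alpha + heads, beta + tails
    mean = a / (a + b)
    epistemic_var = a * b / ((a + b) ** 2 * (a + b + 1))
    aleatory_var = mean * (1 - mean)
    return mean, math.sqrt(epistemic_var), aleatory_var

for n in (10, 100, 10_000):
    _, epistemic_sd, aleatory_var = posterior_stats(heads=n // 2, tails=n // 2)
    print(n, round(epistemic_sd, 4), round(aleatory_var, 4))
# The epistemic term shrinks toward zero; the aleatory term stays near 0.25.
```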

This is a very important distinction that has received treatment in the prediction literature but, indeed, is not applied enough when laypeople interpret others' predictions.

I've seen some academic discussion of this. Adam Bjorndahl at CMU has written papers in which he reframes situations that normally involve randomness as being about the ignorance of an agent. Note that his papers are very technical, and I don't know what general insights, if any, there are to glean from them.