Say Tim states, “There is a 20% probability that X will occur”. It’s not obvious to me what that means for Bayesians.
It could mean:
I’ve heard some more formalized proposals, like, “I estimate that if I and several other well-respected people thought about this for 100 years, we would wind up estimating that there was a 20% chance”, but even this assumes that listeners would converge on this same belief. This seems like a potentially significant assumption! It's quite similar to Coherent Extrapolated Volition, and similarly questionable.
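To make the convergence worry concrete, here's a minimal sketch (my own illustration, not part of any of these proposals): two Bayesians update on exactly the same evidence, but because they start from different priors, their resulting estimates of "the" probability don't land on the same number.

```python
# Minimal sketch: identical evidence, different priors, different answers.
# The priors and data below are made up purely for illustration.

from fractions import Fraction

def beta_posterior_mean(prior_a, prior_b, successes, failures):
    """Posterior mean of a Beta(prior_a, prior_b) prior after observing
    `successes` and `failures` Bernoulli outcomes."""
    return Fraction(prior_a + successes, prior_a + prior_b + successes + failures)

# Shared evidence: the X-like event occurred 2 times in 10 relevant trials.
successes, failures = 2, 8

# Tim starts from a roughly uniform prior; a listener starts more skeptical.
tim      = beta_posterior_mean(1, 1, successes, failures)   # 3/12  = 0.25
listener = beta_posterior_mean(1, 9, successes, failures)   # 3/20  = 0.15

print(f"Tim's estimate:      {float(tim):.2f}")
print(f"Listener's estimate: {float(listener):.2f}")
```

Nothing here is surprising mathematically; the point is just that "we'd all converge after enough thought" is doing real work in the proposal, and it isn't guaranteed by Bayesian updating alone.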
I think my current best answer to this is something like:
When humans say X, they don't mean the literal meaning of X, but rather are pointing to X', a specific symbol that other humans generally understand. For instance, "How are you" is a greeting, not typically a literal question. [How Are You] can be thought of as a symbol that's very different from the sum of its parts.
That said, I find it quite interesting that the basics of human language use seem to be relatively poorly understood, in the sense that I'd expect many people to di