Realistic epistemic expectations

by JonahSinick · 31st May 2015 · 11 comments



When I state a position and offer evidence for it, people sometimes complain that the evidence that I've given doesn't suffice to establish my position. The situation is usually that I'm not trying to give a rigorous argument for my position, and I don't intend to claim that the evidence that I provide suffices to establish my position.

My goal in these cases is to offer a high-level summary of my thinking, and to provide enough evidence that readers have reason to update in the direction of my view and to find it sufficiently intriguing to investigate further.

In general, when a position is non-obvious, a single conversation is nowhere near enough time to convince a rational person that it's very likely to be true. As Burgundy recently wrote:

When you ask Carl Shulman a question on AI, and he starts giving you facts instead of a straight answer, he is revealing part of his book. The thing you are hearing from Carl Shulman is really only the tip of the iceberg because he cannot talk fast enough. His real answer to your question involves the totality of his knowledge of AI, or perhaps the totality of the contents of his brain.

If I were to restrict myself to making claims that I could substantiate in a mere ~2 hours, that would preclude my sharing the vast majority of what I know.

In math, one can give rigorous proofs starting from very simple axioms, as Gauss described:

I mean the word proof not in the sense of lawyers, who set two half proofs equal to a whole one, but in the sense of mathematicians, where 1/2 proof = 0, and it is demanded for proof that every doubt becomes impossible.

Even within math, as a practical matter, proofs that appear to be correct are sometimes undercut by subtle errors. Outside of math, the only reliable tool at one's disposal is Bayesian inference. In 2009, the charity evaluator GiveWell made very strong efforts to apply careful reasoning to identify its top-rated charity, and gave a "conservative" cost-effectiveness estimate of $545/life saved, which turned out to have been wildly optimistic. Argumentation that looks solid on the surface often breaks down on close scrutiny. This is closely related to why GiveWell emphasizes the need to look at giving opportunities from many angles, and gives more weight to robustness of evidence than to careful chains of argumentation.
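To make the point about tempering surface estimates concrete, here is a minimal sketch of a conjugate normal-normal Bayesian adjustment. All of the numbers below (the prior, the uncertainties) are made up for illustration; this is not GiveWell's actual model, only a toy version of the general idea that a noisy headline estimate should be regressed toward a skeptical prior.

```python
import math

def posterior_mean_variance(prior_mean, prior_var, estimate, estimate_var):
    """Conjugate normal-normal update: combine a prior with a noisy
    estimate, weighting each by its precision (inverse variance)."""
    prior_prec = 1.0 / prior_var
    est_prec = 1.0 / estimate_var
    post_var = 1.0 / (prior_prec + est_prec)
    post_mean = post_var * (prior_prec * prior_mean + est_prec * estimate)
    return post_mean, post_var

# Work in log(dollars per life saved), since such estimates span
# orders of magnitude. All figures here are hypothetical.
prior_mean = math.log(5000)   # prior: strong charities save a life for ~$5,000
prior_var = 1.0 ** 2          # prior uncertainty: roughly a factor of e either way
estimate = math.log(545)      # the optimistic headline estimate: $545/life
estimate_var = 1.5 ** 2       # the estimate itself is quite noisy

post_mean, post_var = posterior_mean_variance(
    prior_mean, prior_var, estimate, estimate_var
)
print(f"Headline estimate:  ${math.exp(estimate):,.0f}/life saved")
print(f"Posterior estimate: ${math.exp(post_mean):,.0f}/life saved")
```

Because the estimate is noisier than the prior, the posterior lands much closer to $5,000 than to $545: the more uncertain a surprising estimate is, the less it should move you.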

Eliezer named this website Less Wrong for a reason: one can never be certain of anything, and all rational beliefs reflect degrees of confidence. I believe that discussion advances rationality the most when it involves sharing perspectives and evidence, rather than argumentation.
