I recently noticed similarities between how I decide what stock market evidence to look at, and how the legal system decides what lawyers are allowed to tell juries.

This post will elaborate on Eliezer's Scientific Evidence, Legal Evidence, Rational Evidence. In particular, I'll try to generalize about why there's a large class of information that I actively avoid treating as Bayesian evidence.

Here's an example of a briefly popular meme that helped teach me to ignore certain types of information:

Companies need to be forgetting organizations. Enron Corp., which has repeatedly been tagged as the nation's most innovative corporation, is exhibit A as a world-class forgetting organization. It's not wedded to what it did yesterday. Enron chiefs Kenneth Lay and Jeff Skilling have figured out how to operate like a band of pirates. Got an idea? Don't dally. Go for it while it's an original!

Everything here seems at least technically true (The Media Very Rarely Lies). I suspect it was mostly intended to be honest and helpful. But as a reader, I found that it discouraged me from focusing on some key questions: Who was tagging Enron as innovative? What were the goals of those taggers? What was their track record at helping readers identify good investments? How much of a connection is there between innovation and good investments? What are some examples of Enron's innovation, and how do they resemble the kinds of innovation that have caused other companies to succeed?

Enron seems to have done a few things that qualify as fraud. But most of the Enron stock price bubble was generated by misdirecting investors' attention (and maybe the attention of Lay and Skilling) away from the risks that Enron was taking.

I was in fact tempted to invest in Enron when I listened to that kind of meme. Whenever I felt that temptation, I looked at its financial statements and charts of its stock price, and decided I wouldn't miss much if I waited for better evidence.

The standard Bayesian worldview asks me to update on all relevant information. Why do I, as an investor, and the legal system both exclude large classes of information from that rule?

Both contexts involve some zero-sum or negative-sum conflicts. That makes it likely that someone is packaging information in order to mislead.

If I, or a jury, notice a datum only because someone expected it to mislead us, odds are that updating on that information would reduce the accuracy of my (or the jury's) beliefs. We reject that information so that its expected effect on our beliefs is zero, because the alternative tends to be that its expected effect is negative.
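To make that concrete, here is a minimal simulation, not from the post and with made-up numbers, of an investor who naively updates on outcomes that a promoter has cherry-picked from a larger set. Treating the filtered report as if it were a random sample leaves the investor further from the truth, on average, than ignoring the report entirely:

```python
import random

# A toy model (hypothetical numbers, not from the post) of updating on
# cherry-picked evidence. A company's true quality is the probability that
# any one of its projects succeeds. A promoter runs 10 projects but reports
# only the 3 best outcomes; a naive investor updates as if those 3 were a
# random sample.

random.seed(0)

QUALITIES = [0.3, 0.5, 0.7]          # uniform prior over possible qualities
PRIOR_MEAN = sum(QUALITIES) / len(QUALITIES)

def naive_posterior_mean(successes, total):
    """Bayesian update that wrongly treats the report as a random sample."""
    likelihoods = [q ** successes * (1 - q) ** (total - successes) for q in QUALITIES]
    z = sum(likelihoods)
    return sum(q * l / z for q, l in zip(QUALITIES, likelihoods))

trials = 20_000
naive_sq_err = ignore_sq_err = 0.0
for _ in range(trials):
    true_q = random.choice(QUALITIES)
    outcomes = sorted((random.random() < true_q for _ in range(10)), reverse=True)
    shown = outcomes[:3]             # promoter shows only the best 3 outcomes
    estimate = naive_posterior_mean(sum(shown), len(shown))
    naive_sq_err += (estimate - true_q) ** 2
    ignore_sq_err += (PRIOR_MEAN - true_q) ** 2

print(f"mean squared error, naive update on cherry-picked report: {naive_sq_err / trials:.4f}")
print(f"mean squared error, ignoring the report:                  {ignore_sq_err / trials:.4f}")
```

The problem is purely the selection filter: the same three outcomes, had they been drawn honestly, would have been worth updating on.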

In both contexts, rules have been developed to admit evidence only if it's hard to manipulate or if it comes from a source that's unlikely to manipulate it.

Accounting rules strictly limit the ways in which key information can be presented. They also ensure that someone can in principle be punished if the information misleads.

Similar rules ensure that company press releases are fairly safe for investors to read. The company will often be sued if a press release contains questionable claims that subsequently become linked to investor losses. That deters a wide variety of harmful propaganda (and presumably suppresses a fair amount of helpful evidence). It still leaves some gray areas for vaguely misleading memes, such as talk of Enron's innovation. Companies have mostly shifted to getting other respected sources to promote those ideas for them, rather than putting them in their own press releases.

Still, I consider it an important skill to read those press releases in somewhat the same way that courts admit evidence: carefully distinguishing between "facts" and everything else.

Science has similar problems with adversarial sources of evidence.

A century or two ago, the communities of scientists and science journals may have been small enough, and the reward structure good enough, that everyone involved could trust each other to be pursuing the common good.

Today, wide areas of science have grown big enough that it's hard to have a high-trust relationship with everyone who's publishing in the field.

I expect that trust could still have been maintained if there had been sufficiently good incentives. But few scientists want to think about incentives. Economists are the obvious exception; did they do better? The evidence seems unclear.

Increased science funding over the past century has contributed to a large increase in competition for tenured positions. Publications have become an overly important part of how people get tenure. That combination means increased incentives for science communications to be adversarial rather than truth-oriented.

I expect drug-related science to have the biggest problem with adversarial evidence, since billions of dollars can depend on whether the science convinces the FDA that a drug falls on the right side of a threshold.

This model suggests that medical science likely has the most constraining rules of evidence. I'd guess that the least constraining rules would be found in physics, computer science, chemistry, and geology. Those seem to be the fields least likely to rely on mechanical cutoffs, such as rejecting any result with p > 0.05, to decide what counts as evidence.

Scientists, like lawyers and experienced investors, focus on communications that rigorously minimize ambiguity and subtext, in spite of the cost in readability and entertainment value.

Science, law, and investing all use different heuristics in deciding what qualifies as evidence, adapted to particular problems they've faced and to what types of information they need to allow. The adversarial perspective illustrates a unifying theme behind those heuristics.

1 comment

More than an upvote, I want to note that I find your posts unusually helpful quite often.

A gloss on this one:

Selection filters are

  • ubiquitous
  • strong
  • difficult to back out of data, and relatedly often difficult to even notice (noticing is a substantial fraction of the work needed to back it out)
  • non-random, since the above properties make them attractive as weapons