Mar 5, 2009
by Steve Rayhawk and Anna Salamon. (Joint authorship; there's currently no way to notate that in the Reddit code base.)
Related to: Use the Native Architecture
When cholera moves through countries with poor drinking water sanitation, it apparently becomes more virulent. When it moves through countries that have clean drinking water (more exactly, countries that reliably keep fecal matter out of the drinking water), it becomes less virulent. The theory is that cholera faces a tradeoff between rapidly copying within its human host (so that it has more copies to spread) and keeping its host well enough to wander around infecting others. If person-to-person transmission is cholera’s only means of spreading, it will evolve to keep its host well enough to spread it. If it can instead spread through the drinking water (and thus spread even from hosts who are too ill to go out), it will evolve toward increased lethality. (Critics here.)
I’m stealing this line of thinking from my friend Jennifer Rodriguez-Mueller, but: I’m curious whether anyone’s gotten analogous results for the progress and mutation of ideas, among communities with different communication media and/or different habits for deciding which ideas to adopt and pass on. Are there differences between religions that are passed down vertically (parent to child) vs. horizontally (peer to peer), since the former do better when their bearers raise more children? Do mass media such as radio, TV, newspapers, or printing presses decrease the functionality of the average person’s ideas, by allowing ideas to spread in a manner that is less dependent on their average host’s prestige and influence? (The intuition here is that prestige and influence might be positively correlated with the functionality of the host’s ideas, at least in some domains, while the contingencies determining whether an idea spreads through mass media instruments might have less to do with functionality.)
Extending this analogy -- most of us were taught as children to wash our hands. We were given the rationale, not only of keeping ourselves from getting sick, but also of making sure we don’t infect others. There’s an ethic of sanitariness that draws from the ethic of being good community members.
Suppose we likewise imagine that each of us contains a variety of beliefs, some well-founded and some not. Can we make an ethic of “epistemic hygiene” to describe practices that will selectively cause our more accurate beliefs to spread, and cause our less accurate beliefs to stay contained, even in cases where the individuals spreading those beliefs don’t know which is which? That is: (1) is there a set of simple, accessible practices (analogous to hand-washing) that will help good ideas spread and bad ideas stay contained; and (2) is there a nice set of metaphors and moral intuitions that can keep the practices alive in a community? Do we have such an ethic already, on OB or in intellectual circles more generally? (Also, (3) we would like some other term besides “epistemic hygiene” that would be less Orwellian and/or harder to abuse -- any suggestions? Another wording we’ve heard is “good cognitive citizenship”, which sounds relatively less prone to abuse.)
Honesty is an obvious candidate practice, and honesty has much support from human moral intuitions. But “honesty” is too vague to pinpoint the part that’s actually useful. Being honest about one’s evidence and about the actual causes of one’s beliefs is valuable for distinguishing accurate from mistaken beliefs. However, a habit of focusing attention on evidence and on the actual causes of one’s own as well as one’s interlocutor’s beliefs would be just as valuable, and such a practice is not part of the traditional requirements of “honesty”. Meanwhile, I see little reason to expect a socially endorsed practice of “honesty” about one’s “sincere” but carelessly assembled opinions (about politics, religion, the neighbors’ character, or anything else) to selectively promote accurate ideas.
Another candidate practice is the practice of only passing on ideas one has oneself verified from empirical evidence (as in the ethic of traditional rationality, where arguments from authority are banned, and one attains virtue by checking everything for oneself). This practice sounds plausibly useful against group failure modes where bad ideas are kept in play, and passed on, in large part because so many others believe the idea (e.g. religious beliefs, or the persistence of Aristotelian physics in medieval scholasticism; this is the motivation for the scholarly norm of citing primary literature such as historical documents or original published experiments). But limiting individuals’ sharing to the (tiny) set of beliefs they can themselves check sounds extremely costly. Rolf Nelson’s suggestion that we find words to explicitly separate “individual impressions” (impressions based only on evidence we’ve ourselves verified) from “beliefs” (which include evidence from others’ impressions) sounds promising as a means of avoiding circular evidence while also benefiting from others’ evidence. I’m curious how many here are habitually distinguishing impressions from beliefs. (I am. I find it useful.)
Are there other natural ideas? Perhaps social norms that accord status for reasoned opinion-change in the face of new good evidence, rather than norms that dock status from the “losers” of debates? Or social norms that take care to leave one’s interlocutor a line of retreat in all directions -- to take care to avoid setting up consistency and commitment pressures that might wedge them toward either your ideas or their own? (I’ve never seen this strategy implemented as a community norm. Some people conscientiously avoid “rhetorical tricks” or “sales techniques” for getting their interlocutor to adopt their ideas; but I’ve never seen a social norm of carefully preventing one’s interlocutor from facing status or consistency pressures toward entrenchedly keeping their own pre-existing ideas.) These norms strike me as plausibly helpful, if we could manage to implement them. However, they appear difficult to integrate with human instincts and moral intuitions around purity and hand-washing, whereas honesty and empiricism fit comparatively well into human purity intuitions. Perhaps this is why these social norms are much less practiced.
In any case:
(1) Are ethics of “epistemic hygiene”, and of the community impact of one’s speech practices, worth pursuing? Are they already in place? Are there alternative moral frames that one might pursue instead? Are human instincts around purity too dangerously powerful and inflexible for sustainable use in community epistemic practice?
(2) What community practices do you actually find useful, for creating community structures where accurate ideas are selectively promoted?