Epistemic status: I’m moderately confident in the positions I endorse here, and this series is the product of several months’ research, but it only skims the surface of the literature on these questions.
Bookkeeping: This post is part of a short series reviewing and commenting on papers in epistemology and philosophy of science concerning research norms. You can find the other posts in this series here and here. The sources used for these posts were suggested to me by my professor as a somewhat representative sample of the work on these subjects. The summaries and views expressed are my own unless otherwise stated. I have read the papers in the bibliography; I have not read the papers in the “See Also” section, but they are relevant to the discussion, and I encourage anyone interested to give them a shot. Many of the papers mentioned in this series are publicly available on philpapers.org.
Linus Pauling, the brilliant chemist and energetic proponent of peace, won two Nobel Prizes—one for his work in chemistry, and another for his activism against atomic weapons. Later, Pauling asserted that mega-doses of vitamin C could effectively treat diseases such as cancer and cure ailments like the common cold. Pauling was roundly dismissed as a crackpot by the medical establishment after researchers ran studies and concluded that high-dose vitamin C therapies did not have the touted health effects. Pauling accused the establishment of fraud and careless science. This trespasser did not want to be moved aside by the real experts. (Ballantyne, p. 367).
Experts drift over a highly-visible boundary line and into a domain where they lack either the relevant evidence or the skills to interpret the evidence well. But they keep talking nonetheless. Experts on a public stage are cast in the role of the ‘public intellectual’ or ‘celebrity academic’. … So what do you have to say about philosophy, Neil deGrasse Tyson? And what about arguments for the existence of God, Professor Dawkins? (Ballantyne, p. 369).
The cases above illustrate what Nathan Ballantyne means by “epistemic trespassing”: Linus Pauling didn’t understand medicine, and public scientists like Neil deGrasse Tyson and Richard Dawkins tend to misunderstand what philosophy is and/or how to use it (according to philosophers). But epistemic trespassing isn’t a practice limited to public figures.
Ballantyne’s theses are that “trespassing is a widespread problem that crops up especially in the practice of interdisciplinary research,” and that “reflecting on trespassing should lead us to have greater intellectual modesty, in the sense that we will have good reason to be far less confident we have the right answers to many important questions.”
How to spot epistemic trespassing
Let’s look at some useful terms for this discussion. Interdisciplinary research in this context is research across multiple fields, and in Ballantyne’s sense a field encompasses “an extremely narrow set of questions” and nothing more. “Expertise is a status of thinkers and it is relative to a field at a particular time,” so right now Richard Dawkins is an expert on biology, but that doesn’t mean he was an expert when he was a baby (and his expertise may waver if he stops keeping up with the field), and it doesn’t mean he’s an expert in general. What it does mean is that he has both “enough relevant evidence to answer reliably or responsibly [his] field’s questions,” and “enough relevant skills to evaluate or interpret the field’s evidence well.” (p. 371). Finally, epistemic trespassing is when someone fails to answer questions reliably or responsibly due to a lack of relevant expertise.
Thus, when Richard Dawkins “fails to engage with the genuine issues and sets up strawmen as his dialectical opponents” in religious arguments, he trespasses due to a lack of philosophical (and presumably theological) argumentative skills. When Neil deGrasse Tyson says that philosophy is “useless,” he apparently goes on to demonstrate a limited or inaccurate understanding of what philosophy even is, so he trespasses due to a lack of evidence about the field, let alone the questions that make it up. More subtly, I might be considered an expert on data science, but I definitely should not be considered an expert on operating systems, and an interdisciplinary question that touched on both “fields” could provoke me to overstep the bounds of my expertise.
Ballantyne sees three situations in which a question would be interdisciplinary (“hybridized”):
- the evidence required to answer a question reliably or responsibly comes from two or more fields;
- the skills required to evaluate the evidence well come from two or more fields;
- both the relevant evidence and the relevant skills required to answer a question reliably or responsibly come from two or more fields. (p. 372)
So, for example, the question of how best to transfer America’s electricity grids to renewables is hybridized, because answering it properly requires skills and evidence from infrastructure engineering, climatology, and other fields. Someone with expertise in just one of these fields would probably feel well equipped to answer this question one way or another, and they would certainly be more qualified to do so (in an epistemic sense) than a total layperson, but without expertise in all of the relevant fields, the expert would still be trespassing. This may seem to imply that only polymaths can answer such questions properly, but, as Ballantyne indicates, a hybridized question could be answered “reliably and responsibly [using] cross-field resources” in general such as via collaboration between experts from different fields (Ballantyne, p. 372).
When epistemic trespassing is okay
There are some cases in which it might be okay for you to trespass with your views on some proposition p, assuming you’re already an expert in one of the relevant fields:
(D1) I am trespassing on another field, but that field does not feature any relevant evidence or skills that bear on my view about p;
(D2) I am trespassing on another field, but my own field’s evidence conclusively establishes that p is true;
(D3) I am trespassing on another field, but my own field’s skills successfully ‘transfer’ to the other field. (p. 379)
(D1) covers cases in which evidence from the other fields could only strengthen your view or leave it unchanged. For example, a currency expert might think Bitcoin is unlikely to see widespread use because of its slow transaction speeds. They wouldn’t need to consult experts on blockchain energy consumption before settling that view: Bitcoin certainly uses more energy than established fiat currencies, and that fact could only hurt Bitcoin’s adoption (or, at best, have a neutral effect).
(D1) also covers the realm of pseudoscience: “I believe the substantive claims of astrologers are false … But I’ll admit that astrologers have evidence and skills that I lack. My considered view, however, is that astrologers’ evidence and skills do not constitute a reliable method for establishing their claims, and so I am justified in dismissing their claims.” (Ballantyne, p. 380). We don’t have to wait to hear what pseudoscience has to say about a topic, because pseudoscience wouldn’t tell us anything useful anyway.
Of course, we’d be reckless to apply (D1) indiscriminately; we should have good reason to believe that we’ve accurately summarized the field(s) upon which we trespass. Ballantyne points out that it’s easy for researchers to be “unduly dismissive about research programs they do not contribute to” (p. 380), or to otherwise misunderstand them, so it seems like “reasonably accepting (D1) will typically require considerable effort” and consultation with experts from the appropriate fields (Ballantyne, p. 381).
(D2) seems like it would require less work on the part of the trespasser, but the situations it applies to are few and far between. If your view can be established as the uncontroversial truth without any input from the other relevant fields, then someone should have figured that out and settled the matter already. “To accept (D2) reasonably, you need an account for why the discussion grinds on—as it shouldn’t, on the assumption that (D2) is reasonable for the relevant disputants.” (Ballantyne, p. 381). In a way, Ballantyne’s dismissal of (D2) is a kind of anthropic principle for arguments: if the argument were so trivial that we could dismiss it this way, we would have dismissed it as trivial already (barring some special reason we wouldn’t have), and so we wouldn’t still be having it.
Justifying trespassing using (D3) requires little in the way of extra work or exceptional circumstances, so it’s probably the most likely excuse to come up. As Ballantyne points out, Richard Dawkins “suggests that he does not see what expertise philosophers of religion could possibly have that scientists like him would lack; in his own eyes, his scientific competence apparently transfers to a new context where he can appropriately answer questions about arguments for and against God’s existence" (p. 381). If it’s true that his scientific skills transfer to philosophy as well as he thinks, then it seems as if Dawkins has a serviceable defense of his trespassing.
However, Ballantyne presents two weaknesses of (D3). The first is that even if your skills transfer to the new field, you still won’t have any of the field-specific evidence. Case in point: even if Richard Dawkins’s skills transfer well enough to the philosophy of religion that he’s on equal footing skill-wise with other philosophers, those philosophers will also have an expert-level familiarity with philosophical texts and arguments related to religion, whereas Dawkins may have only an amateur’s level of evidence in that area. Thus, (D3) is useless to a trespasser unless it’s not just reasonable to accept, but also joined by justification for how the trespasser is on equal evidential footing with the field’s experts. (Ballantyne, pp. 381–2).
The second weakness of (D3), according to Ballantyne, is that it’s hard for the trespasser to acquire a reason to accept it. His justification for this point is that our best chance at determining when it’s okay to accept (D3) is by consulting empirical research on skill transference and metacognition. The evidence from the former is intended to show that skill transfer is difficult and that we often overestimate the success of our skill transference due to a lack of applicable track records in the territory where we trespass. The evidence from the latter is meant to show that while experts can employ (and would hope to transfer) “metacognitive heuristics such as ‘consider both sides of an issue’ or ‘generate alternative explanations for the evidence’,” these strategies are useless without enough relevant evidence to build accurate pictures in the first place (p. 386).
Thus, if Ballantyne is right, it’s difficult to justify accepting (D3). Even if we can accept it, we need further justification to ensure that we also have enough evidence to make proper use of our transferred skills. Alongside the issues with applying (D1) and (D2), it seems like there are few situations in which epistemic trespassing is okay. To explain why people would choose to trespass with confidence anyway, Ballantyne cites the Dunning-Kruger effect, which (for anyone who hasn’t heard of it) says that competent people tend to underestimate their own ability, while incompetent people tend to possess inflated confidence.
In light of the difficulty of avoiding trespassing, and the separate challenge of trespassing safely, Ballantyne thinks our best bet for solving the big, multidisciplinary problems in the world is to collaborate in multidisciplinary environments.
I’ll come back to the psychology research in the next section, but the gist is that skill transference is hard.
I think the second weakness of (D3) is the least justified part of Ballantyne’s argument. The empirical evidence cited consists mainly of decades-old studies from psychology, a field known for its replication crisis. I probably haven’t studied psychology as much as Ballantyne has, but I don’t think these papers help his argument.
Also, Ballantyne follows up his research summary to say that “[t]ransfer failures are unsurprising in view of disheartening findings from contemporary educational research. The development of critical thinking skills is a central goal of modern education, but researchers say critical thinking does not easily generalize across domains.” He then presents “Linus Pauling and company” as “poster children for the perils of trespassing,” which seems true, but concludes that “they are cautionary tales for how exemplary critical thinking in one field does not generalize to others.” (p. 385).
While it’s true that a failure to transfer critical thinking skills from their original fields to those that they trespassed in would be sufficient to explain their perils, I don’t think it’s necessary. As Ballantyne said in the same section, they could have transferred their skills perfectly and fallen short due to a lack of evidence. I do think it’s more likely that Dawkins and Tyson failed to transfer their skills in the first place, given their apparent misunderstandings about the fields in which they trespass, but Ballantyne’s paper fails to support this point.
To justify the importance of a track record, he gives the example of a classically trained pianist who claims to be able to play bebop jazz piano despite having no background in jazz piano. He claims that “we should think the pianist’s claim needs to be backed up by a satisfactory jazz performance” (p. 386). While it’s true that the pianist’s claim would need more evidence to be believed responsibly, the category of potentially worthy track records seems broader to me than Ballantyne indicates. For example, perhaps the pianist has an exceptional track record of picking up new styles by ear. In that case, the trespasser’s versatility is an adequate substitute for field-specific experience, and the pianist may defend themselves with (D3).
Similarly, a strong understanding of statistics and experimental design coupled with a bit of background knowledge seems like an adequate level of expertise to correctly interpret some areas of medicine and most social sciences. Assuming that background knowledge consists of things like desirable effect sizes, which wouldn't be too hard or time-consuming to look up in the course of trespassing anyway, it seems that both requirements for transferability are easier to meet than Ballantyne said they were, and (D3) is a realistic/common scenario after all. Maybe in Linus Pauling's day research was less accessible or something, because nowadays it seems like the skills I mentioned above and some metacognitive ones for detecting bias and conflicts of interest are all you'd need to avoid making the same mistake.
That said, I am not a psychologist, statistician, doctor, or other social scientist. As Ballantyne notes, “researchers can be unduly dismissive about research programs they do not contribute to” (p. 380), and I could be overestimating the ability of my skills to transfer to these areas for the very reasons I’m dismissing. This may be cause to take my rebuttal with a grain of salt.
In any case, epistemic trespassing seems to me like a trap that's worth avoiding, such as by qualifying statements like "if [uncertain empirical claim] obtains, then [philosophical argument] follows," rather than attempting to verify the claims on one's own (as mentioned by Ballantyne in a footnote (p. 375)). In particular, this seems important when it comes to trespassing between fields that draw on quite different skill sets; if a historian, a moral philosopher, and a nuclear physicist walk into a bar, they probably wouldn't be able to leave with the information and skills necessary to form responsible beliefs about questions in each other's fields. However, I think this paper's account of transference is dated, and thus trespassing may be reasonable more often than Ballantyne claims.
Ballantyne, Nathan. “Epistemic Trespassing.” Mind, vol. 128, no. 510, Apr. 2019, pp. 367–95. DOI.org (Crossref), doi:10.1093/mind/fzx042. Available at https://www.academia.edu/34743123/Epistemic_Trespassing
Levy, Neil. “Radically Socialized Knowledge and Conspiracy Theories.” Episteme, vol. 4, no. 2, June 2007, pp. 181–92. DOI.org (Crossref), doi:10.3366/epi.2007.4.2.181.
Anderson, Elizabeth. “Democracy, Public Policy, and Lay Assessments of Scientific Testimony.” Episteme, vol. 8, no. 2, June 2011, pp. 144–64. DOI.org (Crossref), doi:10.3366/epi.2011.0013.
Hazlett, Allan. “The Social Value of Non-Deferential Belief.” Australasian Journal of Philosophy, vol. 94, no. 1, Jan. 2016, pp. 131–51. DOI.org (Crossref), doi:10.1080/00048402.2015.1049625.