tl;dr: Opinion: rigorous, reliable scientific progress depends heavily on epistemic virtues that are largely private to the mind of the scientist. These virtues are neither quantifiable nor fully observable. This may be uncomfortable to those who wish scientific rigor could be checked by some objective method or rubric. Nevertheless, if I'm right, identifying such epistemic virtues would be a constructive step toward improving science.
Scientific Rigor
I’m interested in the conditions required to support rigorous and robust scientific progress. Statistical methods are good for what they do, but I think they unduly dominate discourse about scientific rigor. There are a number of other positive research methods and practices that I think are equally or more important, and I have written about some of these elsewhere. But the more time I spend thinking about this question and reading the history of science, the more I have come to think that the most important factor underlying research rigor is epistemic virtue. It's pretty hard to substantiate or prove any of these claims, but for what it's worth, here are some opinions.
Most natural scientists I know seem to have in common, and generally take as a given, that the facts of nature are whatever they are, independent of what we say or think about them. The most rigorous scientists seem to be distinguished mainly by how deeply and fully this metaphysical stance pervades their private thoughts and emotions. Contrary to the cultural image of "cold logic", such commitment to truth can be hotly passionate.
The chief epistemic virtue this seems to engender is "skepticism": trying hard to think of experiments or tests or arguments that could weigh against any conclusion one is otherwise inclined to reach; checking for the consequences that should follow if the conclusion is true, but also taking special interest in apparent exceptions and contradictions. Another epistemic virtue is taking care to articulate conclusions precisely, such as avoiding over-generalization and differentiating interpretations from observations. Another is to cognitively bundle with every provisional conclusion not only an overall confidence rating, but also rich information about the type, amount, and quality of the evidence supporting it.
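To make the "bundling" idea concrete: if a provisional conclusion were a data structure rather than a private mental state, it might look something like the sketch below. This is purely illustrative; all the names, fields, and numbers are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum

class EvidenceType(Enum):
    DIRECT_OBSERVATION = "direct observation"
    INFERENCE = "inference"
    SECONDHAND_REPORT = "secondhand report"

@dataclass
class Evidence:
    description: str    # what was observed or inferred, and how
    kind: EvidenceType  # differentiating observation from interpretation
    quality: float      # subjective 0-1 rating of trustworthiness

@dataclass
class ProvisionalConclusion:
    claim: str         # stated with precisely delimited scope
    confidence: float  # overall credence, 0-1, held provisionally
    support: list[Evidence] = field(default_factory=list)  # links back to the evidence

# A conclusion carried together with its evidential pedigree:
belief = ProvisionalConclusion(
    claim="Compound A inhibits enzyme B in vitro at micromolar concentrations",
    confidence=0.7,
    support=[
        Evidence("dose-response assay, three replicates", EvidenceType.DIRECT_OBSERVATION, 0.8),
        Evidence("analogy to the structurally similar compound C", EvidenceType.INFERENCE, 0.4),
    ],
)
```

The point of the analogy is that the confidence number alone is a lossy summary; the rigorous thinker also retains the provenance.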
If a scientist discovers something surprising and important, they rise in the esteem of other scientists, and perhaps even in the broader culture. We are social creatures, so the respect of peers and prestige in the community feels great. Approval can also catapult one’s career forward, leading to publications, jobs, research funding, tenure, public fame, and so on. To the non-vigilant, these social rewards can easily be mistaken for the end goal. This can lead scientists to be motivated more by the desire for their claims to be accepted than for their claims to be true. I think this is sadly prevalent. Red flags include: feeling annoyed, resentful, or threatened by evidence that weighs against the truth of one’s beliefs; feeling motivated to “win” arguments when one’s beliefs are disputed; or staking one’s identity or reputation on being credited for some particular claim or theory. Rigorous scientists are not above social motivations; rather, they stake their identity and reputation on being careful, so that when their beliefs are challenged, the desire for approval and social status is aligned with ruthless truth-seeking.
Epistemic Hygiene
I think a lot of being a rigorous scientist comes down to vigilance about everyday habits in how one thinks, including how one uses language to express one's thoughts, even in casual conversations and internal monologue. I have always thought of this as "epistemic hygiene" (a term I was astonished to see others use here on LW!). Here's an attempt to list some epistemic habits I associate with great scientists I have known. It's a bit repetitive, as many of these habits overlap.
Habits to Cultivate
Precisely delimiting the scope of statements/conclusions
Differentiating inferences or interpretations from direct observations
Differentiating between evidence for X vs. evidence against some alternative
Differentiating absence of evidence vs. evidence of absence
Stating conclusions in provisional language until exceedingly well tested
Describing tests even in one's own mind as “to find out if X” (not to prove/disprove)
Cultivating indifference to outcome/not preferring either result in an experiment[1]
Staking one's self-esteem on being careful and honest, not being first or right
Staking social identity, status, reputation on being known as rigorous, careful, trustworthy
Refraining from basking too much in approval/acceptance, even if it is deserved
Refraining from chafing too much at being rejected or ignored, even if undeserved
Focusing on how it would harm one to believe X if not true, or disbelieve it if true
Focusing on how it would help one to believe X if true, or disbelieve it if not true
Being glad to have learned about evidence or arguments that cause one to abandon a wrong conclusion for a corrected or better-supported one
If the outcome of a debate is to change one's mind, regarding this as a “win”
Retaining links back to the evidence and inferential steps that led to a conclusion
Trying to think of as many alternative explanations as possible of the known facts
Being one's own devil's advocate, or cultivating friends who serve in that role
Seeking out observations or tests that could disprove a favored conclusion
Checking many diverse predictions that should all hold if a conclusion is true
Looking for the boundary cases until one finds exactly where a rule breaks down
Actively seeking and investigating exceptions or apparent contradictions
Habits to Shun
Overgeneralizing language
Overconfident language
Using persuasive rhetorical devices (which are subtle logical fallacies)
Stating inferences or interpretations as if they were direct observations
Staking one's identity, status or reputation on a particular discovery or theory
Using language like “we did this to rule out X” or “to prove Y” or “we hope to show”
Other Factors
Epistemic hygiene is not all there is to doing great science, of course. There are other perhaps more important factors, such as training (having deep domain-specific knowledge, extensive hands-on experience, advanced technical skills); having good taste in scientific questions; having good hunches (priors); creativity in coming up with theories/hypotheses that could explain the existing knowns; cleverness in designing experiments or studies that answer the crux unknowns; fastidiousness in execution of such experiments/studies; astuteness of observation, including noticing things one didn't plan to observe; ability to synthesize disparate lines of evidence; ability to extrapolate potential applications; and more. I just think the role of epistemic virtue is underrated.
[1] If your doctor performs a cancer test, of course you hope for a negative result. But by this you mean: you hope you don't, in fact, have cancer. If you do have cancer, you wouldn't want the test to give a false negative result. Likewise, you might be doing an experiment to find out if your theory is true. You can hope you have indeed figured out the correct theory. But focusing on that hope often leaks into unconscious bias in interpreting results. Therefore it is better to focus on hoping the experiment provides a clear, true answer, either way. Or at least, counterbalance the hope of success by also fervently hoping you don't get a false lead that would waste your time or keep you from arriving at the correct theory.
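As a numerical aside on why accuracy matters more than the reassuring answer: Bayes' rule quantifies how much relief a negative result actually licenses, and that depends on the test's error rates. The rates below are invented purely for illustration.

```python
# Hypothetical test characteristics, invented for illustration only.
sensitivity = 0.90   # P(positive result | disease): true positive rate
specificity = 0.95   # P(negative result | healthy): true negative rate
prevalence  = 0.01   # P(disease) in the tested population

# Total probability of a negative result, over both possibilities:
p_negative = (1 - sensitivity) * prevalence + specificity * (1 - prevalence)

# Bayes' rule: probability of having the disease despite a negative result.
p_disease_given_negative = (1 - sensitivity) * prevalence / p_negative

print(f"P(disease | negative test) = {p_disease_given_negative:.4f}")  # ~0.0011
```

A negative result is only as comforting as the false-negative rate is low; what you really want is a test that tracks the truth, whichever way it falls.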