I've often wondered why scientific thinking seems to be so rare. What I mean by this is dividing problems into theory and empiricism: specifying your theory exactly and then looking for evidence to confirm or refute it, or gathering evidence first and then forming an exact theory.
This is a bit narrower than the broader scope of rational thinking; a lot of rationality isn't scientific. Scientific methods don't just let you reach a solution, they let you understand that solution.
For instance, many early Renaissance tradesmen were rational, but not scientific. They knew that a certain sequence of steps produced iron, but the average blacksmith couldn't tell you anything about the underlying chemistry. They simply followed the steps and got a result.
Similarly, a lot of modern medicine is rational, but not especially scientific. A doctor sees a patient whose symptoms resemble a common ailment they've encountered many times before, so they assume that's what it is, perhaps running a test to verify the guess. The job generally demands a gigantic memory of different diseases, but not much knowledge of scientific investigation.
What's most damning is that the science curriculum in schools doesn't teach much scientific thinking.
What we get instead is mostly disconnected facts. We learn what a cell membrane is, or how to balance a chemical equation. Learning about, say, the difference between independent and dependent variables is often left to circumstance. You learn about type I and type II errors only if you happen upon a teacher who decides to include them, or you learn them on your own. Some curricula include a required research methods course, but its availability and quality vary greatly between both disciplines and colleges. Why there isn't a single standardized way of teaching this material is beyond me. Even math curricula are structured around calculus instead of the far more useful statistics and data science, placing ridiculous hurdles in front of the typical non-major, hurdles most won't surmount.
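To make one of those neglected concepts concrete: a type I error is a false alarm (rejecting a true hypothesis), a type II error is a miss (failing to reject a false one). Here's a toy simulation of both, using a deliberately crude "test" for a biased coin. The 0.7 threshold, flip count, and bias values are arbitrary choices of mine for illustration, not any standard procedure.

```python
import random

random.seed(0)

def reject_fairness(flips, threshold=0.7):
    """Crude test: declare the coin biased if the heads rate exceeds the threshold."""
    return sum(flips) / len(flips) > threshold

def rejection_rate(p_heads, n_flips=20, trials=10_000):
    """Fraction of experiments in which the test declares the coin biased."""
    rejections = 0
    for _ in range(trials):
        flips = [random.random() < p_heads for _ in range(n_flips)]
        rejections += reject_fairness(flips)
    return rejections / trials

# Fair coin (p = 0.5): every rejection is a false alarm -> type I error rate.
type_i = rejection_rate(0.5)
# Biased coin (p = 0.8): every failure to reject is a miss -> type II error rate.
type_ii = 1 - rejection_rate(0.8)

print(f"Type I rate:  {type_i:.3f}")
print(f"Type II rate: {type_ii:.3f}")
```

Running this shows the trade-off directly: raising the threshold lowers the false-alarm rate but raises the miss rate, which is exactly the kind of intuition a fact-based curriculum never builds.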
It should not be surprising, then, that so many people fail at even basic analysis. I have seen many people make elementary errors they are more than capable of understanding but simply were never taught to avoid. They aren't precise with their definitions. They don't identify their relevant variables. They construct far too complex theoretical models without data. They draw conclusions from small sample sizes. They overweight personal experiences, even second-hand ones, and underweight statistical data. They focus too much on outliers and not enough on averages. Even professors who otherwise do excellent research often stop thinking analytically the moment they step outside their domain of expertise. And some professors never learn the proper method at all.
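The small-sample error in particular is easy to demonstrate. The sketch below repeatedly draws samples from the same made-up population (mean 100, standard deviation 15, both invented for illustration) and compares how wildly the sample averages swing at n = 5 versus n = 500:

```python
import random

random.seed(1)

TRUE_MEAN = 100   # hypothetical population mean
TRUE_SD = 15      # hypothetical population spread

def sample_mean(n):
    """Average of n random draws from the population."""
    return sum(random.gauss(TRUE_MEAN, TRUE_SD) for _ in range(n)) / n

# Collect 1,000 sample means at each sample size.
small_samples = [sample_mean(5) for _ in range(1000)]
large_samples = [sample_mean(500) for _ in range(1000)]

def spread(means):
    """Range of the observed sample means."""
    return max(means) - min(means)

print(f"Spread of means at n=5:   {spread(small_samples):.1f}")
print(f"Spread of means at n=500: {spread(large_samples):.1f}")
```

The tiny samples routinely land ten or more points away from the true mean, while the large ones cluster tightly around it. Someone generalizing from five acquaintances is, in effect, reading off one of those wildly scattered n = 5 points.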
Much of this site focuses on logical consistency and eliminating biases. It often takes this to an extreme, what Yvain refers to as X-Rationality. But eliminating biases barely scratches the surface of what is usually necessary to truly understand a problem. This may be why it is said that learning about rationality often reduces rationality: an incomplete, slightly improved, but still quite terrible solution can generate a false sense of certainty. Unbiased analysis won't fix a lousy dataset. And it seems rather backwards to focus on what not to do (biases) rather than what to do (analytic techniques).
True understanding is often extremely hard. Good scientific analysis is hard. It's disappointing that most people don't seem to understand even the basics of science.