TLDR: First you go with your gut, then you get a logical model, then you improve that model. Trusting your logical model over your gut before it gets good enough is a very common way to believe wrong things.
[epistemic status: probably approximately true, with possible pathological cases around the edges]
The process of getting better at describing and predicting things seems to usually go something like this:
First, you start out with an intuitive model, which nature gives you automatically without any effort on your part. This model speaks the language of System 1, and is a black box whose contents are unknown to you.
Then, you develop a weak analytical model in the language...