I usually don't use paper or a spreadsheet for Fermi estimates; that would make them too expensive. Also, my Fermi estimates tend to overlap heavily with big-O estimates.
When programming, I tend to keep a big-O/Fermi estimate for the runtime and memory usage in the back of my head. The big-O part of it is usually just "linear-ish" (for most standard data structure operations and loops over nested data structures), "quadratic" (for looping over pairs), "cubic-ish" (matrix operations), or "exponential" (in which case I usually won't bother doing it at all). The Fermi part of it is then, roughly: how big a data structure can I run this on while still getting a reasonable runtime? Assume ~1B ops per second, so for linear-ish I can use a data structure with ~1B entries, for quadratic ~30k entries, for cubic-ish ~1k entries, for exponential ~30 entries.
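The rule of thumb above can be sketched as a few lines of code. The ~1e9 ops/second constant is the illustrative assumption from the text, not a measured figure:

```python
# Fermi-level runtime estimate from a big-O class and an input size.
# Assumes ~1e9 simple operations per second (illustrative constant).
OPS_PER_SECOND = 1e9

def estimated_seconds(n, growth):
    """Rough seconds of runtime for input size n under a growth class."""
    ops = {
        "linear": n,
        "quadratic": n ** 2,
        "cubic": n ** 3,
        "exponential": 2 ** n,
    }[growth]
    return ops / OPS_PER_SECOND

# Largest n that stays around ~1 second for each class:
# linear ~1e9, quadratic ~3e4, cubic ~1e3, exponential ~30.
print(estimated_seconds(1_000_000_000, "linear"))  # ~1.0
print(estimated_seconds(1_000, "cubic"))           # ~1.0
print(estimated_seconds(30, "exponential"))        # ~1.07
```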
This obviously steers algorithm/design choice, but more importantly it steers debugging. If I'm doing a loop which should be linear-ish over a data structure with ~1M elements, and it's taking more than a second, then something is wrong. Examples where this comes up:
I also do a lot of Fermi estimates when researching a topic or making a model. Often these estimates calculate what a physicist would call "dimensionless quantities" - we take some number and express it in terms of some related number with the same units. For instance:
In general, the trigger for these is something like "see a quantity for which you have no intuition/poor intuition", and the action is "express it relative to some characteristic parameter of the system".
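The move is simple enough to show in two lines. A toy illustration, where every figure is a round hypothetical placeholder rather than a claim about any real system:

```python
# Turn a number you have no intuition for into something graspable
# by dividing it by a characteristic parameter of the system.
# Both figures below are made-up round numbers for illustration.

annual_budget = 6e12   # dollars/year (hypothetical)
population = 3e8       # people (hypothetical)

per_person = annual_budget / population
print(per_person)  # 20000.0 -> dollars per person per year, a number
                   # you can actually reason about
```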
I notice that with regard to many things, I always think of at least one of the following aspects:
As each of those is quantifiable, it prompts me to actually put some numbers on the given problem.
I last made a spreadsheet because I received a medical bill and wanted to calculate the correct amount and estimate what the insurance company should pay.
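The arithmetic such a spreadsheet does can be sketched in a few lines. All plan parameters here (deductible, coinsurance rate, out-of-pocket maximum) are hypothetical, and real plans have more wrinkles:

```python
# Sketch of typical US-style insurance math: the patient pays down the
# remaining deductible, then a coinsurance share of the rest, capped by
# the remaining out-of-pocket maximum; the insurer pays the remainder.
# All numbers are hypothetical illustrations.

def split_bill(allowed_amount, deductible_left, coinsurance, oop_max_left):
    patient = min(allowed_amount, deductible_left)
    remaining = allowed_amount - patient
    patient += remaining * coinsurance
    patient = min(patient, oop_max_left)
    insurer = allowed_amount - patient
    return patient, insurer

patient, insurer = split_bill(
    allowed_amount=1200.0, deductible_left=500.0,
    coinsurance=0.20, oop_max_left=3000.0,
)
print(patient, insurer)  # 640.0 560.0
```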
I probably do basic sanity checks moderately often, just to see if something makes sense in context. But that's almost at the level of intuition already.
The last time I actually pulled up Excel was when Taleb was arguing against IQ, saying its only use is to measure low IQ. I wanted to see if this could explain the (very) large country differences. So I made a trivial model where parts of the population are affected by various health issues, each of which can drop IQ by 10 points. And the answer was yes: if you actually have multiple causes and they stack up, you can end up with the incredibly low averages we see (in the 60s for some areas).
It's an interesting example because on one hand it sounds trivial: you have shitty living conditions, you end up with shitty results. But on the other hand, my mind didn't want to accept the end result of an under-80 average until I had the numbers in front of me.
I do it pretty rarely, so maybe I'm not the best person to answer. But I often do it when I want to compare long-term plans and one of them has a clear price while the other only maybe does. Trying to estimate the prices I'd put on things is one of several decision-making tools I use.
Recent spreadsheet situations:
The rule I try to apply for myself is: open a spreadsheet and/or calculator app whenever it is at all possible. On the rare occasion it's not possible (or would be impolite), the extra experience and intuition will be valuable. There's much more risk that I will underuse it than overuse it.