I've always held that, since "fewer" refers to discrete quantities and "less" to continuous quantities, and the integers are a subset of the reals, using "less" where the grammar police would call for "fewer" is less precise, but not wrong. Going the other way doesn't work, though.
Getting to Mars is a big jump compared with going to the moon, since you're traveling 500-1000x farther on any efficient route, and have to do much more to keep the astronauts from getting irradiated or hopelessly weakened by the lack of gravity. Pluto is roughly 100x farther still, so you would need to do some special things to, e.g., not have your astronauts die of old age on the trip.
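Rough ballpark numbers behind those multipliers (approximate figures; an actual transfer trajectory is longer than the straight-line distance):

$$d_{\text{Moon}} \approx 3.8\times10^{5}\ \text{km}, \qquad d_{\text{Mars}} \approx 0.55\text{--}4.0\times10^{8}\ \text{km}, \qquad d_{\text{Pluto}} \approx 5.9\times10^{9}\ \text{km},$$

so Mars is on the order of 150-1000x the lunar distance (more along the actual flight path), and Pluto is roughly another factor of 100 beyond Mars at its closest approach.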
This is all true, but these jumps in difficulty are nothing compared with the jump from getting between any two points on Earth to getting to the moon (or even LEO).
This is an interesting, thought-provoking piece, but the core premise is unconvincing. As you've presented things, maybe I have to accept that there is some super-powerful being that will do really bad things to me if I don't kowtow, or do really good stuff for me if I do, but that's not the same as truly accepting that this being is the fundamental reference for right and wrong, and that aligning with what this being says is ultimately good, and not aligning with it is evil. There's a fundamental difference between believing "According to God, [religiously proscribed thing] is wrong" and believing "[religiously proscribed thing] is wrong". In your scenario, maybe I become convinced of the former, and either rebel, or avoid [religiously proscribed thing], or try to find some way to appease God, while in the latter case, I'll either be genuinely trying to change, or at least feeling really guilty about being such a bad person.
There's a big difference between being an Atheist in a quandary and being truly converted.
The scenario you've presented starts with the discovery that the universe is different from what was previously believed, but it glosses over the steps for me to become convinced "with p(99%), that Christianity is and always has been true". Maybe for someone at 4 or 5 on the Dawkins scale, just seeing some strong evidence supporting the existence of God nudges them to 1 or 2. But for me (a 6), Atheism is not just a matter of lack of evidence, but the cumulative effect of experience and logic. There are versions of God that I could accept, but an arbitrary, anthropomorphized God is a much bigger leap. To get from A to B would involve such a fundamental rewiring of my thought processes that I would no longer really be the same person I am now. If I got there somehow, I would not be rebelling.
Most of your examples seem more like "prerequisites" or basic skills that you build on. But scaffolding is a thing you build up to get something else done, then get rid of afterwards. So, a scaffolding skill would be a skill that enables you to learn how to do something you actually want to learn, but once you have learned how to do that thing, you no longer need the scaffolding skill.
Algebraic notation can still be useful to a chess player. Knowing basics like how to properly cut things is integral to cooking. Debugging is an essential skill for programming. Etc.
A couple better examples of scaffolding skills:
There are conventions pertaining to the style a letter is written in. You might have a as an arbitrary element in set A, while a<sub>1</sub>, a<sub>2</sub>, etc., are a sequence of elements in A, and fancy-boldface A or fancy-script A (I don't know how to render those here) could represent the class of A-like sets. Also, a′ would be another case of a, or maybe the derivative of a, while a″ would be a third case, or maybe the second derivative. Sometimes superscripts, or scripts placed before the letter, are used when subscripts are not enough. Sometimes the Greek equivalent of a Latin letter denotes some relationship between them, e.g. α is some special version of a.
If the math goes way off into the weeds, the author might even whip out a Hebrew letter or two.
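For what it's worth, in LaTeX the "fancy" letters are typically \mathbb{A} (blackboard bold) and \mathcal{A} (calligraphic script); here is a quick sketch of the usual convention (common usage, not a hard rule):

$$a \in A, \qquad a_1, a_2, \ldots \in A, \qquad A \in \mathcal{A}, \qquad a',\ a''\ (\text{other cases, or derivatives}), \qquad \alpha\ (\text{some special version of } a).$$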
There is a related problem where many browser-based productivity tools follow design principles from websites that are trying to get clicks. For example, I commonly run into DB interfaces at work that will return, say, 10 (or 25, or some other small number) results per page. Now, it's good design to not let a query that returns, say, a million results crash the browser. But if a few hundred, or even a few thousand, results will display within milliseconds, why make me page dozens (or hundreds) of times? (I'm looking at you, GitHub commit history!)
Another example would be Microsoft Office's long history of making design choices that optimize for increasing "engagement with the tool" rather than making the tool unobtrusively facilitate the task. They got rid of Clippy, but now they're into obnoxious pop-ups designed to call attention to some new gadget or feature.
The "click" economy has had pervasive effects on the software industry.
It seems to me this is an example of you and Kaj talking past each other. To you, B's perspective is "eminently reasonable" and needs no further explanation. To Kaj, B's perspective was a bit unusual, and to fully inhabit that perspective, Kaj wanted a bit more context to understand why B was holding that principle higher than other things (enjoying the social collaboration, the satisfaction of optimally solving a problem, etc.).
Except there's more at play than just winning the election. If you're a voter in a swing state, the candidates are paying more attention to you, and making more promises catering to you. The parties are picking candidates they think will appeal to you. Even if the odds of your vote swinging the outcome stay the same, the prize for winning gets bigger.
It was exciting a few elections ago, when Colorado was in play for both parties. We even got to host the Democratic convention in Denver. Now they just ignore us.
One thing you touched on, but didn't delve into, is that the various "pay" components will have varying marginal utility at different levels.
For example, if you're literally starving, "coolness" won't matter much; you need enough money to buy food! But once you have enough money, you start caring about other things.
Perhaps having some social interaction is important, and you would sacrifice other things to have at least some of that in your job. But beyond a certain point the value diminishes, and would likely go negative, as the constant socializing gets tiring and distracts from work you actually would like to do.
I think a good manager would be good at optimizing against those utility curves. They would pay people enough that they're not upset about low pay, but not much more. They would recognize that one team party per quarter might be valuable to the team, but parties every week would not be appreciated. They would give people opportunities to socialize, but also ways to avoid getting dragged into socializing when they would really rather be focused on the job. And so on.
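To make those curves concrete (a toy sketch with made-up functional forms, not anything from the post): write total job utility as a sum of per-component utilities, with the money term concave and the social term eventually turning negative:

$$U = u_{\text{money}}(m) + u_{\text{social}}(s) + \cdots, \qquad u_{\text{money}}(m) = \log(1+m), \qquad u_{\text{social}}(s) = a s - b s^{2}, \quad a, b > 0.$$

The marginal utility of money, $1/(1+m)$, is enormous when you're broke and tiny once you're comfortable, while the marginal utility of socializing, $a - 2bs$, flips negative once $s > a/(2b)$ (the weekly-party regime). Optimizing against the curves basically means keeping each component near the point where its marginal value per dollar or per hour is about equal across components.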
One reason missing from your "couldn't get it on their own" scenario: they couldn't justify spending the extra money on the "fancy" version of something, or never would have thought to. If that something is relatively cheap, you can get them the deluxe version without breaking the bank.
I had a crummy little basic stapler that would jam. Someone got me a fancy Swingline. They spent, like, $12.99, and years later, every time I staple something (maybe only a few times a year, but still...) I feel gratitude. I could easily have bought a better stapler myself, but never thought to.
The nice pen. The fancy bit of super-tasty cheese. The actual Otterbox phone case. One decent kitchen knife. Go for quality on something small, rather than look for a bargain on the big stuff.