I see your distinction, but how much slack you think you have is necessarily a judgement based on how demanding your environment is. People mis-estimate that regularly, and it changes over time anyway. If it feels wrong to have available resources that you're not using, it may be that you just need to lighten up. It also may be that you're correctly, if not necessarily consciously, perceiving that your environment is competitive and you actually do need to buckle down (or go somewhere else). These are different problems with different solutions but similar symptoms.
Thanks to Scott Alexander, people on this blog typically use the term Moloch for the antithesis of slack. Moloch is a dynamic where intense competition forces everyone to spend all available resources to have a chance (not a guarantee) of success.
If one person is talking analytically and the other is talking about meaning-making, then you're each trying to have different conversations. One of you is talking about how to do something and the other is talking about how to motivate people to do something. If at all possible you should let the first person lead; if they're diligently working on the problem then they're motivated enough.
To consider your support team example: they seem to be assuming that if their product works well, customers will be satisfied. That's not a terrible strategy, and it puts the focus on something they can control (the product). If you could point to something else about the customer experience that's causing dissatisfaction, they would probably understand the problem and deal with it. But if there's nothing specific that needs addressing otherwise, it's probably best just to let them focus on getting the instrument to work as well as possible.
And of course, maximizing customer satisfaction is itself a strategy toward achieving your real goal, which is profit. Companies don't give their flagship products away for free*, no matter how much it would please the customers.
*With the exception of some loss leaders that are carefully calculated to grow revenues over the long term.
I think part of it is that contracts are mostly interpreted by trained humans. A computer works through each line of code before continuing to the next line. A human can look at a paragraph of standard legal language, understand that it does the standard thing, and move on in a second or so; reading a paragraph of non-standard language makes the human stop and think, which is much slower and often causes anxiety.
Even better, there are usually many court cases establishing exactly how the standard language should be interpreted in a wide variety of circumstances, which makes the standard language much more predictable and reliable. In software terms, it has already been debugged.
I'm not a gambler by temperament, so I'm just not very interested in betting.
In each of those cases, what worked was a fundamentally new approach. We didn't breed leeches to the point where they could cure smallpox. Photovoltaics have been around since the 50s; if they were going to work at scale they'd have worked by now.
I think we've uncovered the basic disagreement and further discussion seems pointless.
We've been trying to make solar work for a very long time. I can remember when there were solar panels on the White House roof (Reagan had them removed). Things that have underperformed for decades almost never take off.
Since my side of the bet implies that the internet is not likely to exist by 2040 and I'd never find you if I won, this bet is not appealing. It is not possible to take a short financial position on civilization. However, if settlement could be arranged and the stakes weren't chump change, in principle I'd take the bet.
Everything you're saying fits the common narrative; I just think there's a roughly 80 percent chance that it's wrong.
I invite you to look at the Sankey diagrams for the US last year (2019). Despite decades of heavy subsidies, solar power generated only 1.04 percent of the energy we used. Spain scaled up solar as much as it could, and despite significant advantages (sunny climate, lack of hurricanes) it only managed an EROEI (energy returned on energy invested) value of 2.45 (for comparison, some estimates put the minimum EROEI for civilization as we know it at about 8-10, although optimists go as low as 3). Solar power has been ten years away for at least fifty years now, and it's starting to look like it always will be.
Nuclear power is more realistic, as you noted - it generated 8.46 percent of our energy last year. Still, the ability to scale that up to 100% is questionable. Fission power requires rare earths, and they're called rare for a reason. Fusion is great at generating neutrons* and high-level radioactive waste (when the neutrons activate the surrounding structure), but I've never heard of it coming anywhere near breakeven (EROEI=1) in energy terms (unless you count solar).
*There are aneutronic reactor proposals, but they're pretty unrealistic even by fusion energy standards.
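To make the EROEI numbers above concrete, here's a minimal sketch of what they imply. By definition, EROEI is energy returned divided by energy invested, so the fraction of gross output left over for the rest of society is 1 - 1/EROEI; that interpretation is standard, but the framing of the comparison below is mine, not from the original.

```python
def net_energy_fraction(eroei: float) -> float:
    """Fraction of gross energy output not consumed by the energy sector itself."""
    return 1.0 - 1.0 / eroei

# Figures quoted above: Spanish solar at 2.45 vs. the claimed
# civilizational minimum of roughly 8.
for label, eroei in [("Spanish solar", 2.45), ("claimed minimum", 8.0)]:
    print(f"{label}: EROEI {eroei} -> {net_energy_fraction(eroei):.1%} net energy")
```

At EROEI 2.45, roughly 40 percent of everything the energy sector produces goes right back into producing more energy, which is the sense in which a low EROEI starves the rest of the economy.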
It seems like what you're calling "progress studies" is what was called "modern history" until about 1960 and is derisively termed "Whig history" in the field these days. The basic premise is that material wealth went exponential in Europe starting around the 17th century, that this process (called "progress") gave Europe the means to travel to and dominate the rest of the world, and that the central questions of modern history are what happened to initiate this "progress", how it works, whether it will continue, and what forms it will take. Despite the change in academic fashions, these questions remain crucially important.
I tend to agree with what you call the "materialist" position. A barrel of oil has more energy than a decade of manual labor; without fossil fuels it is expensive to smelt metals and all but impossible to make useful semiconductors. Progress as we know it today is entirely dependent on metal (e.g. wires) and semiconductor-based computers. In principle nuclear power may be sufficient, but that's an open question at this point.
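A back-of-envelope check of the barrel-vs-labor comparison, using commonly cited round numbers (the specific figures here are my assumptions, not from the comment): a barrel of oil holds roughly 1,700 kWh of thermal energy, and a human laborer can sustain perhaps 75 W of useful output over an 8-hour workday.

```python
BARREL_KWH = 1700        # ~5.8 million BTU per barrel, converted to kWh (assumed)
LABOR_KW = 0.075         # sustained useful human power output, kW (assumed)
HOURS_PER_DAY = 8
WORKDAYS_PER_YEAR = 250

labor_kwh_per_year = LABOR_KW * HOURS_PER_DAY * WORKDAYS_PER_YEAR  # 150 kWh/year
years_per_barrel = BARREL_KWH / labor_kwh_per_year
print(f"One barrel of oil ~= {years_per_barrel:.0f} years of manual labor")
```

On these rough assumptions a barrel works out to a bit over a decade of manual labor, consistent with the claim above; the exact figure moves around with the assumed human power output, but the order of magnitude doesn't.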