This is a story about an odd fact I noticed about capital project decision-making in engineering, and how it might be related to cognitive biases.
Chemical engineers often have to make decisions about which capital improvement projects the firm will undertake, so they must answer questions such as 'install cheap pumps that wear out quickly or expensive ones that don't?', 'which ethanol-producing bacterium is the most efficient?' and 'is it worth it to install a heat exchanger to recover the waste heat from this process or not?'. The standard technical way of judging the profitability of an option or project is to calculate the Net Present Value (NPV) of the expected cash flows to and from the firm for each option (installing pump type A or B, using bacteria A, B or C, installing or not installing a heat exchanger). The option with the highest NPV is the most profitable. Calculating the NPV discounts expected future cash flows to account for the fact that they occur in the future, when you could be doing other productive things with the money, such as earning interest on it.
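The pump comparison above can be sketched in a few lines. All the numbers here are hypothetical; the point is just the mechanics of discounting and comparing options:

```python
def npv(rate, cash_flows):
    """Net Present Value: discount each year's cash flow back to today.
    cash_flows[0] is the upfront (year-0) flow, typically negative."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical pump comparison over a six-year horizon:
cheap   = [-1000] + [-600] * 6   # low purchase price, high yearly upkeep
durable = [-2500] + [-50] * 6    # high purchase price, low yearly upkeep

# At a 10% MARR the durable pump has the higher (less negative) NPV;
# at a 50% MARR the cheap pump wins, since heavy discounting shrinks
# the value of avoiding far-future maintenance costs.
for rate in (0.10, 0.50):
    print(rate, round(npv(rate, cheap), 1), round(npv(rate, durable), 1))
```

Note how the choice of discount rate alone can flip which option looks best, which is exactly why the rate firms pick matters so much below.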
Oddly high discount rates
When I was in school, I noticed an odd thing: the interest rates that people used to evaluate projects on this basis, called the Minimum Acceptable Rate of Return (MARR), were often rather high, 15-50%/year. I saw this in textbook discussions and had it confirmed by several working engineers and engineering managers. My engineering economics teacher mentioned that firms often require two-year "payback periods" for projects; that annualizes to a 50% interest rate! I was very confused about this because a bank will lend to a small business at ~8% interest (source) and mortgage rates are around 5% (source). This implied that many industrial projects would be profitable if only outsiders could fund them, because investors should jump at the chance to get 15% returns. I know I would! The profit opportunity seemed so great that I started to work out business models around alternative sources of investment for industrial projects.
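To see why a two-year payback requirement annualizes to roughly a 50% rate: a $100 project that pays back in two years throws off $50/year, and if it keeps doing so over a long equipment life, the discount rate at which it just breaks even (its internal rate of return) approaches 50/100 = 50%. A quick numerical check, using bisection to find the break-even rate:

```python
def irr(cash_flows, lo=1e-6, hi=10.0, tol=1e-9):
    """Find the discount rate where NPV = 0, by bisection.
    Assumes the usual pattern: outflow first, inflows after."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# $100 project that pays back in two years ($50/yr) and keeps paying
# for a long (here 30-year) equipment life:
flows = [-100] + [50] * 30
print(round(irr(flows), 3))  # ≈ 0.5, i.e. a ~50%/yr internal rate of return
```

So requiring a two-year payback is, in effect, demanding projects that clear a ~50%/year hurdle.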
Overestimating benefits, underestimating costs
To understand why MARRs are so high in chemical engineering, I tried to find research on the question and talked to experienced engineers and managers. Unfortunately, I never did find research on this topic (let me know if you know of relevant work). The most common answer I got from the managers and engineers was that investors are short-sighted and primarily interested in short-run profits. I didn't find this answer very plausible. Later, I met an engineer in charge of reviewing project evaluations made by other engineers in order to decide which projects would be approved. His explanation was that engineers usually overestimate the benefits of a project under consideration and underestimate its costs, and that reviewers give engineers high MARRs in order to counterbalance this. I asked him why they didn't just apply a scaling factor to the costs and benefits, and he explained that they did this a little bit, but engineers respond by inflating benefits and deflating costs even more! I later met another engineer who talked about doing exactly that: adjusting estimated costs down and estimated benefits up because the process for evaluating projects did the reverse.
One thing to note is that if engineers overestimate benefits and underestimate costs uniformly over time, then a high MARR will make projects which pay off in the short term artificially attractive (which is why I asked about using a scaling factor instead of a large interest rate). On the other hand, if engineers' predictions about costs and benefits become more biased the further out they are in time (for example, if they tend to overestimate the productive life of equipment), then a high MARR is a more appropriate remedy.
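The distinction is easy to see numerically. Compare two hypothetical projects with identical undiscounted totals, one paying off early and one late: a uniform haircut on benefits would penalize both equally, but a high MARR hits the distant cash flow much harder.

```python
def npv(rate, flows):
    """Discount each year's cash flow back to the present."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

# Same undiscounted total benefit ($150), different timing:
fast = [-100, 150, 0, 0, 0]   # pays off in year 1
slow = [-100, 0, 0, 0, 150]   # pays off in year 4

# At a 5% rate both are profitable; at a 30% MARR the slow project
# flips to unprofitable while the fast one survives.
for rate in (0.05, 0.30):
    print(rate, round(npv(rate, fast), 1), round(npv(rate, slow), 1))
```

So a high MARR is a reasonable corrective only if the estimation bias itself grows with the time horizon.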
There are several reasons why engineers might tend to overestimate the benefits and underestimate the costs of projects; any number of them may contribute. I suspect cognitive bias is a significant contributor.
- Confirmation bias suggests engineers will tend to overestimate the benefits and underestimate the costs of projects they initially think are good ideas. The head project engineer I spoke with described a common mindset thus:
'And this is why we tend to focus on the goodness and diminish the badness of projects. We know they are good so all we need to do is prove it to get the approvers bought in. Then the project approvers over time notice that these projects returns are lower than expected so they say, “Let’s raise the bar.” But guess what? The bar never rises. Why? Because we still "know" what the good projects are and all we need to do is prove they are good.'
- The planning fallacy suggests engineers will underestimate completion times and costs.
- Overconfidence suggests engineers will underestimate costs even when explicitly accounting for uncertainty.
- Bad incentives: engineers may be rewarded for spearheading projects but not punished commensurately when a project turns out not to be beneficial, so they can expect to be rewarded for spearheading a project even if they don't expect it to succeed.
Anyway, I started asking our cost coordinator about predicted schedules, and she is far more accurate than the engineers at predicting how long a project will take. That has led me to think that an independent review would be a good step toward improving project returns. Unfortunately, I have not noticed her to be any better than the engineers at predicting project performance.
On adjusting predictions based on a track record
Predicting, based on experience, that a project will take longer than estimated does not help, because managers (usually engineers) want to know "why" so they can "fix it."
This calls to mind Archibald Putt's "Fifth Law of Decision Making":
In other words, the engineer has an obvious incentive to get his project approved regardless of whether or not it is good: if it gets approved, he gets more money.
And indeed, you specifically mention this:
Everyone on LW really should read Putt's Law and the Successful Technocrat; it's the definitive guide to the social dynamics of large corporations.
See Project Management Haiku.
That'd be an ok list if it didn't pretend to be poetry.
Could taxes be a factor? If $100 of investment returns $50 benefits in a year, part of those benefits may be taxed before they could be used to pay back a loan (I don't know the details of how that would work out if the $50 benefits would be money saved, i.e. material that doesn't need replacing etc.).
Risk would also be a factor: even if you expect $50 of benefits from a $100 investment in a year, an external investor would have to factor the chance that the company will go bankrupt into his expected-benefit calculation, whereas that doesn't feature in your calculation of the expected returns on the $100. And even assuming an honest assessment by the engineer, an external investor has to trust a longer chain of people than internal management does (and in addition, the external investor knows the people involved less well, so has more uncertainty about how much they are exaggerating).
So even if engineers always made perfectly accurate estimates, I would still expect taxes and risk to bring the "internal interest" to a lower value when it's considered by an external investor.
(Sorry, I'm not well versed in the vocabulary of economics and finance, I'm sure there's a better way of phrasing all this)
Corporations are taxed on their profits (instead of their income) specifically to avoid discouraging investment in a perverse way.
Bankruptcy risk is certainly relevant, but it's even more relevant for small businesses, which still face reasonable interest rates. If bankruptcy risk is what drives high MARRs, that implies a company using a 30% MARR has a ~20%/year default rate (30% minus a ~10% baseline), which is implausibly high.
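A back-of-envelope version of that arithmetic, under the crude assumption that a default means a total loss:

```python
# What per-year default probability p makes a promised 30% return
# worth only a required 10% expected return, assuming total loss on default?
#   (1 + 0.30) * (1 - p) = 1 + 0.10
promised, required = 0.30, 0.10
p = 1 - (1 + required) / (1 + promised)
print(round(p, 3))  # ≈ 0.154, i.e. ~15%/yr, in the ballpark of the ~20% figure
```

Either way, a double-digit annual default rate is far above what typical industrial firms actually experience.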
If this were true then we would expect MARR to vary with the things that affect the magnitude of overconfidence bias. If MARR were set mainly by external factors they should be unrelated.
Would it be possible to do a study? Has a study already been done?
I have not been able to find any, but it may be that my searching skills are not up to snuff.
They might also have lots of projects but not enough resources for them all, so they have to select for high rates of return.
In this case, they could simply borrow more money.
There are sharper limits than you'd expect on how fast companies can grow. (Relevant: Ben & Jerry's vs Amazon: fast land grabs kill corporate culture, among other things; also that Robin Hanson(?) post I can't find about companies wanting to be bigger than they should.) You can borrow money, hiring is a bit harder, and assimilating new hires into your company just has to be slow, as well as restructuring management when everything gets bigger and you do more in parallel.
This was my first thought as well, which is why I started trying to think up creative ways firms could get more capital, including employee investment in particular projects. However, unless outsiders are extremely wary of industrial firms, these constraints should only apply when a firm is expanding its investment very quickly, so high MARRs would only be a transient feature, not the norm.
From the perspective of a software engineer, this sounds very familiar. Though in software it's generally estimating time-spent on producing features.
I'd also add: 5) engineers often totally forget about the black swans (or unknown unknowns) - especially obscure edge-cases that require special handling.
...though there's also one that I suspect doesn't happen often in chemical engineering: 6) scope-creep.. ie the client gets so enthusiastic about the idea that they keep adding to it until the final requirements are much bigger than the original
As for processes that seem to help keep the overestimation down: I find that velocity tracking actually works, if you can keep it up, i.e., measuring how much work actually got done and using it as a scaling factor on future estimates.
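A minimal sketch of the idea, with made-up numbers: track the ratio of actual to estimated effort over past work, then multiply raw estimates by that ratio.

```python
# Hypothetical track record (days): what the team estimated vs. what happened.
past_estimates = [10, 8, 15, 12]
past_actuals   = [14, 13, 21, 18]

# "Velocity" here is the historical actual/estimate ratio;
# a value > 1 means chronic underestimation.
velocity = sum(past_actuals) / sum(past_estimates)

def calibrated(raw_estimate):
    """Scale a fresh engineer estimate by the team's measured ratio."""
    return raw_estimate * velocity

print(round(calibrated(20), 1))  # a raw 20-day estimate becomes ~29.3 days
```

The appeal is that it corrects the bias with outside data rather than asking engineers to feel differently about their own projects.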
I'm not sure that's what is meant by a black swan, but in any case, this would be a factor only if managers more accurately predict the impact of these events. (If it's a case of known-but-unusual things happening in part of the problemspace, that's an issue of defining the problem appropriately for the engineers, and can be corrected before the solution is implemented without a higher time-discount or margin of safety.)
Yes, perhaps that should instead read:
5) engineers often totally forget about the unknown unknowns or even black swans.
I agree that 'black swan' isn't really appropriate here, but can you clarify what you mean by 'that's an issue of defining the problem appropriately for the engineers'?
I just mean that if you don't tell the engineers that special constraints apply in part of the designspace that render any design in that part horrible, it's not their fault if, not knowing this extra constraint, they offer that as their solution because it is otherwise optimal.
Yes, that's true, and we should make sure that a client thinks deeply about all the cases that would reasonably come up.
...but I often find that there are situations that come up that really could not have been predicted in advance.
A small example for an e-commerce site might be the effect that adding multiple currencies has on "$X off" promotions. If all you do is scale from a single currency, you end up with "$5 off" or "£3.24 off"... which is ugly... and unfortunately required us to rework promotions to allow "currency value overrides" for this kind of promotion.
It was unexpected; we didn't know about it in advance (neither us nor our clients), but it was obvious in hindsight that it would be unacceptable to the client/customers.
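For illustration, here is roughly what that looks like, with a made-up exchange rate and override value (the original system's actual names and numbers aren't given):

```python
# Naive approach: scale the USD promotion amount by the exchange rate.
fx_rate = 0.648  # hypothetical USD -> GBP rate
print(round(5.00 * fx_rate, 2))  # 3.24 -> a "£3.24 off" promotion: ugly

# The "currency value override" fix: a hand-chosen per-currency amount
# wins when one is configured, falling back to scaling otherwise.
overrides = {"GBP": 3.00}

def discount(amount_usd, currency, fx_rate):
    return overrides.get(currency, amount_usd * fx_rate)

print(discount(5.00, "GBP", fx_rate))  # 3.0 -> a clean "£3 off"
```

The override table is exactly the kind of feature nobody specs up front because it only becomes obvious once real promotions hit real currencies.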
I agree with you in that respect. If neither managers nor engineers are aware of special conditions that attach in part of the designspace until it catastrophically happens, then that is a black swan, and a reason to temper judgment with a higher discount rate.
I was only adding that if engineers come back with a solution that is optimal given what they were told, when the managers knew all along about special conditions that make the proposed solution unfeasible, it's not a black swan, nor really fair to say, "silly engineers -- not accounting for the real world!" (At that point you should re-iterate, adding that constraint to the spec, and you certainly shouldn't go forward with the flawed design.) And if this is the problem, it's not the kind of thing that requires a higher discount rate or margin of safety, just better problem specification.
Definitely agree with you on that one. I'd guess it's more a problem of scope creep. You can also solve that by discounting, but it's much better to solve it by enforcing separate accounting for change requests. I've found that if you keep time spent on scope creep as a separate bottom line, it draws more attention to the problem (which makes it more likely to get solved).
It's almost certainly a reference to this theory, which essentially describes extremely difficult to predict events that have a disproportionate impact. Unfortunately the very nature of these events makes them difficult to control for.
I'm familiar with the term, and it's not the same thing as obscure edge cases that require special handling; rather, it's unforeseeable events that have disproportionately large impact, nothing to do with being an edge case or requiring special handling per se. An unknown unknown that never has an impact is not a black swan. And something that requires special handling isn't a black swan if that "special handling" is known in advance (even if the fact that it will happen is not).
I think an unforeseeable edge case or bug that requires deep refactoring and severely cuts into allotted development time fits the bill for a black swan dead on.
Understood. For some reason I misread your original post as saying that you weren't sure what a black swan was. I agree with your analysis here.