If technological progress has slowed down, what is causing it? Here is a hypothesis.

Broadly speaking, there are three domains of activity important to technological progress: science, invention, and business. Science discovers new knowledge; invention creates useful machines, chemicals, processes, or other products; and business produces and distributes these products in a scalable, self-sustaining way. (Occasionally inventions are distributed by government: water sanitation is an example. But this oversimplified model will serve for our purposes.)

These domains do not form a simple linear pipeline, but they are distinct areas that attract different types of people, pose different challenges, and are judged by different standards. As such they create distinct communities and subcultures.

My hypothesis is that while science and business have functioning career paths, invention today does not.

Consider science. Suppose a high school or university student has a glimmer of desire to become a scientist. They will find that their road has already been paved. “Scientist” is a career. There’s an established path into the career: get a BS and then a PhD in a scientific field. There are research labs that hire scientists, organize them into teams, and give them space and equipment. There is funding for all of this, from government and philanthropy. There is an established deliverable: talks and papers, presented at conferences and published in journals. There are awards and honors that confer prestige within the discipline; some of these, such as the Nobel, are even well-known and respected among the general public.

All of this combines to create a career path for the scientist: anyone with even a modest level of commitment and effort can start down the path, and those who are exceptionally talented and ambitious can reach for inspiring goals. Importantly, there is a feedback loop in which progress down the career path opens opportunities. The more the scientist produces legible accomplishments, the more they are able to get grants, secure coveted positions, and attract talent to work with them. Money, prestige, and the opportunity to do meaningful work all (roughly) go together.

Entrepreneurship has different structures, but the career path is there nonetheless. “Startup founder” is not a job you get hired for; it is a job the founder must create for themselves. They must raise their own funding, create their own organization, and hire their own team. In this sense, the founder is much less well-supported than the scientist. But there are established sources of funding for startups, in venture capital. There is a known job title, CEO, that you can give to yourself and that is understood by others in the industry and in society. There is an objective way to measure success: company profits and market valuation.

The founder career path is to create a successful company. Once again, progress on this path opens up opportunities. The most successful founders have the resources and reputation to launch even more varied and ambitious projects (think Jeff Bezos or Elon Musk). However, a startup failure does not end a career. In Silicon Valley at least, failure is not a black mark, and a failed founder can do another startup, or get a job in engineering, design, sales, or management.

We can think of a career path as a social support structure around a value. In science, the value is new knowledge. In entrepreneurship, the value is profitable business. Having a support structure around a value means that if someone is motivated to pursue that value, they can be paid to do so; and if they succeed, they can expect both prestige and expanded career opportunities.

Now, what is the career path for an inventor?

“Inventor” is not a role one can be hired for. The aspiring inventor finds themselves straddling science and business. They could join a research lab, or become an engineer at a technology-based company. In either case, they will be misaligned with their environment. In research, what is valued is new knowledge. An invention that achieves a practical goal is not valued if it demonstrates no new scientific principle. In the corporate environment, what is valued is what drives the business. The engineer may find themselves optimizing and refining existing products, without any mandate to create fundamentally new ones. Neither environment values simply making fundamentally new technologies work. Alternatively, an inventor could become an entrepreneur, starting a company to commercialize the invention. But this requires the inventor to have the wherewithal of a startup founder: raising money, hiring a team, and so on. We ask this of founders because it’s in the nature of the job: someone who can’t do these things probably wouldn’t succeed at the rest of the founder’s task. But we don’t expect every scientist to found their own research lab, and we shouldn’t expect every inventor to be a founder either.

In the early 20th century there were options for inventors. Some joined the great corporate research labs of the day: General Electric, Westinghouse, Kodak, Dow, DuPont, and of course Bell Labs. Others stayed independent, patented their inventions, and sold or licensed the patents to businesses. This let them make a living by inventing, without being personally responsible for commercializing, scaling, and distributing their inventions (although it required seed funding: many inventors had second jobs, or got angel investment through personal connections).

For reasons I still don’t fully understand, both options have withered. Corporate research is largely not as ambitious and long-term as it used to be. The lone inventor, too, seems to be a thing of the past.

The bottom line is that if a young person wants to focus their career on invention—as distinct from scientific research, corporate engineering, or entrepreneurship—the support structure doesn’t exist. There isn’t a straightforward way to get started, there isn’t an institution of any kind that will hire you into this role, and there isn’t a community that values what you are focused on and will reward you with prestige and further opportunities based on your success. In short, there is no career path.

Note that funding alone does not create a career path. You could start an “invention lab” and hire people to make inventions. You could even pay, reward and promote them based on their success at this task. But it would be difficult to hire any ambitious academic, or anyone who wanted to climb the corporate ladder, because this role wouldn’t be advancing either career path. That isn’t to say that it would be impossible to hire great talent, but you would be facing certain headwinds.

I think this is why the NIH receives relatively conventional grant proposals even for their “transformative research awards”, and why Donald Braben says that he had to build a high degree of trust with researchers before they would even tell him their ambitious research goals (see Scientific Freedom, p. 135). The community that forms around a career path has its own culture, and that includes an oral tradition of career advice, passed down from senior to junior members of the tribe. What kinds of goals to pursue, what kinds of jobs to take and when, how to choose among competing opportunities—there is folklore to provide guidance on all these questions. A single grant program or call for proposals cannot counter the weight of a culture that communicates: “the reliable way to build a scientific career is by proposing reasonable, incremental research goals that are well within the consensus of the field.”

In part, I see this as both the challenge and the opportunity of efforts like PARPA or FROs. It’s a challenge because a career path must ultimately be supported by a whole community. But it’s an opportunity because efforts like this could be how we bootstrap one. Funding alone doesn’t create a career path, but it can attract a few talented and ambitious mavericks who value independence and scoff at prestige. Success could bring more funding, and inspire imitators. Enough imitators would create an ecosystem. Enough success would bring prestige to the field.

It won’t be easy, but I am excited by efforts like these. We need a career path for invention.


Thanks to Ben Reinhardt, Matt Leggett, and Phil Mohun for reading a draft of this.

18 comments

I definitely agree that the idea of unconstrained "invention" is not well supported in society, but the hypothesis makes me go "huh?"

Science discovers new knowledge; invention creates useful machines, chemicals, processes, or other products; and business produces and distributes these products in a scalable, self-sustaining way.

Isn't what you're calling "invention" just engineering? For most types of engineering, undergrad students are taught science for two years (it is new knowledge to them), then how to usefully apply that knowledge for a year and a half, and then they have a final semester or two explaining how that knowledge can be used to achieve business goals.

In other words, "career that applies scientific knowledge to make up stuff" seems to be engineers. 

When I ctrl+f-replace "inventors" with "engineers" in my head, I personally see your career path theory making more sense, given that engineers do have a career path, which is mostly to become well-degreed technicians, financiers, or tenure-track warriors. They ought to become inventors, but the existing paths divert them.

Corporate research is largely not as ambitious and long-term as it used to be.

A large part of this may be that there is increasing pressure on CEOs to focus on short-term earnings at the expense of long-term earnings.

All (most?) invention is engineering, but a lot of engineering is not invention.

Boeing employs many airplane engineers, but they don't really invent new planes. Facebook employs many software engineers but isn't inventing much in software. Both are doing product development engineering—which is fine and something the world certainly needs a lot of, but it's not the same thing.

I think anyone who wanted to be an inventor would train as an engineer. So the education/training part of the inventor career path is there. But it falls apart after university.

Facebook employs many software engineers but isn't inventing much in software

React, Jest, and GraphQL (among many other projects that I'm less familiar with) were created by Facebook, and many of them heavily altered the way people do programming in the relevant domains. Without knowing what exactly you mean by "inventing" in software, I think you'd have a hard time arguing that they've done none.

Many of the large tech companies have similar contributions, depending on what sort of work they're doing. It's not even just large tech companies - a number of much smaller companies I've worked for have some level of open source contributions, which often are representative of "invention", and arguably a lot of the actual products that companies create could be described as "invention" as well.

Without a further explanation what you'd consider an "invention" vs not, it's hard to say whether or not there's any "there" there with your original point.

Fair enough, I might consider React/GraphQL inventions. (Jest doesn't seem that fundamentally new?)

But how much of Facebook's engineering effort went to inventing React/GraphQL? 1%? Surely less than 10%.

You might like Scientific Freedom by Donald Braben. It's a whole book about the problem of developing incentives for basic research.

Yup, I've read it, thanks!

Thank you, great essay.  Reminds me of parts of "The Age of Unreason" by Charles Handy, but that was from the 1990s guessing at what most working people would experience.  Curious if anyone thinks Handy's "portfolio professional" aligns with career paths for those hoping to invent.  This definitely hits on many experiences I have had in my career as I have moved to different organizations.  

I would say our engineering workshop is staffed by inventors. We need people to invent solutions to problems, and they do it, usually in collaboration with the scientist who has the problem. I think this is a pretty common setup, although not the model of the lone inventor producing a killer product that is patented and sold. I rather liked their solution for a remote camera-lens cleaner: a drone with a Super Soaker. (No public video, sorry.)

Alternative theory: most of the low-hanging fruit has been picked. On the information-technology side of things we see an enormous amount of innovation, often because it is possible to create new products with a garage-band skeleton crew. In the physical realm, things are simply much harder, much more specialized.

Also, there is a negative feedback loop: since the world is much more specialized, there are fewer people working in technical fields, and hence fewer people with the base of technical know-how that might lead to chance inventions [which I learned from you was a typical story during the early Industrial Revolution in Great Britain].

How would you respond to this take?

Jason does have a post where he briefly tackles the low-hanging fruit hypothesis [here]. It isn't 100% compelling, but the idea is that there are "multiple orchards" and we go through one after another. The conceit doesn't include the possibility of "barren earth orchards" though. 

Low-hanging fruit alone doesn't explain stagnation, because our ability to pick the fruit has also been improving. To explain stagnation, you have to explain why the former is happening faster than the latter, and why this only started happening in the last ~50 years.

See also my post here and this interview.

Let me outline a very simple model of technological progress. 

Innovations get exponentially harder. As the lower-hanging fruits get picked, one needs to combine and master more and more previously understood scientific knowledge to get higher. Moreover, fruits higher up in the knowledge tree may be intrinsically harder to pick.

As an example, see the ever-increasing material complexity and size of particle colliders: compare the LHC to the earlier backyard garage colliders.

Our ability to pick fruit does still increase, but it may not increase comparably fast.
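
To make this concrete, here is one minimal formalization (just a sketch of my own, with made-up functional forms): let $N(t)$ be cumulative innovations and $B(t)$ the effective brainpower applied, and suppose each marginal innovation is exponentially harder:

$$\frac{dN}{dt} = B(t)\, e^{-\alpha N(t)}, \qquad \alpha > 0.$$

With constant $B$ this integrates to $N(t) \approx \tfrac{1}{\alpha}\ln(\alpha B t)$, so progress slows to a crawl. Even with exponentially growing brainpower $B(t) = B_0 e^{\beta t}$, the solution tends to $N(t) \approx \tfrac{\beta}{\alpha} t$: merely linear progress. Exponential difficulty can swallow exponential inputs, which connects to the brainpower estimates below.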

Between ~1850 and 1950 the effective amount of brainpower increased substantially. More important than just population growth is probably increased literacy, urbanization, formal education, improved nutrition, improved communication methods to facilitate knowledge sharing, etc. It might have increased the total effective amount of brainpower applied by ~two orders of magnitude.

By comparison, in the period ~1950–2020 the total amount of brainpower may have increased only ~a couple of times.

One has to be cognizant of the fact that innovations are made by a tiny percentage of highly eccentric and talented individuals. This may not always line up with the mean of the population.

We see substantially more collaboration in science, a much larger number of scientists, and a greatly increased amount of specialization. The sum of human scientific expertise does not fit comfortably in the skull of an unedited Homo sapiens, and this capacity difference is increasing over time.

You don't need to be in a technical field to make inventions. A chef isn't traditionally seen as a technical person, but they invent new dishes. An HR person can invent new policies or new ways to distinguish expertise when hiring.

The reason it's more expensive to build a bridge today than 50 years ago is not just a lack of technical innovations but also a lack of inventions in how to effectively coordinate humans.

Any ideas on what may have caused companies to trend towards short-termism?

CEO income is largely linked to the short-term stock price, so investments whose returns are not legible on short timescales to outsiders of the company are not in the interest of the CEO.

This is coupled with investors holding stocks for shorter timeframes and thus caring more about the short-term stock price than the longer-term stock price.

Rant on the way, not an exhaustive list by any means! I’ve extrapolated ‘short-termism’ to mean a lack of investment, and more so a lack of results, in technological progress, since I see the critique of ‘short-termism’ really hitting hard in the idea that companies aren’t investing when they should to make technological progress.

Perhaps it’s the market working as intended: value outperforms growth in the long term, and growth necessitates higher investment to expand. Doubtful, as no one outside of hardcore finance guys knows about this anyway.

Perhaps it’s companies becoming more beholden to financial interests on Wall Street, i.e. increasing financialization. One of the simpler explanations, but I’d argue that not much has formally changed on this front, and convention is moving in the opposite direction. Additionally, if we’re talking about long-term technological progress: government, private, and academic progress hasn’t been obviously faster than that of ‘short-term’ public companies. In my opinion at least; I could easily be corrected here.

Paradoxically, as markets get more competitive, economic profits decrease and investment capital dries up. This would necessitate a shorter-term outlook simply to survive or profit at all. I don’t favor this one either; if anything, natural monopolies are more common when complexity and scale really start ramping up (see Apple or Google).

When interest rates go up, future earnings are more heavily discounted (aka: money now>>>money later, as opposed to money now>money later). Probably the easiest one to strike off, unless you’re an inflation truther, lol.
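
To spell out the standard discounting arithmetic behind that:

$$PV = \frac{CF_t}{(1+r)^t}$$

A cash flow $CF_t$ received $t$ years out shrinks geometrically as the discount rate $r$ rises, so long-horizon R&D payoffs get hit hardest.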

My pet theory is as follows: we have orders of magnitude more scientists and researchers, and rGDP is much higher than in the past as well. Yet, outside of IT, earth-shattering innovation that reaches the market has been lacking in recent decades (debatable, surely; again, my theory here). This leads me to believe that we are simply running out of “low-hanging fruit,” or we are looking in the wrong places. Or it’s so complex that scientists are forced to specialize to such a degree as to render their findings realistically irrelevant. I think this is among the simplest explanations for the apparent stagnation in technological progress.

Allow me to pick a bone here though, as I (largely) take issue with the premise to begin with. A company has a fiduciary duty to its shareholders. As the other commenter touches on, there are pressures on CEOs and corporations to show results. None of this has changed significantly in recent years; in fact I’d argue pushes towards ESG and ‘stakeholder capitalism’ are now working in the opposite direction of pure shareholder value.

I don’t put the blame entirely on companies, because if they legitimately saw value in long term high R&D spending, their shareholders should be convinced to see that, and failing that, they could carry out said duty and stick to their guns. Not an easy thing to do, however. Tesla’s shareholders seem to see it; likewise, investors saw the value in the tech bubble, EVs, green energy, crypto, biotech, etc. (ARKK, anyone?). Capital is pouring into smaller, nimbler, forward-thinking (perhaps to a fault?) companies, elevating multiples to bubble territory in some cases.

The companies are there, the (investment) money is there; where I see the problem is largely either the execution or the feasibility of many of these ideas. Or maybe somewhere else; I won’t pretend to be more than an amateur in anything but Economics. That’s not to say that every company is doing it perfectly, but in the long run those that have a vision and an execution will win out.

Disclaimer: I’m a finance guy, not an inventor or scientist, and probably quite obviously not a competent rationalist either.

I don’t put the blame entirely on companies, because if they legitimately saw value in long term high R&D spending, their shareholders should be convinced to see that, and failing that, they could carry out said duty and stick to their guns. 

If a company has a secret project which they believe will revolutionize everything in a few years, then it's worthwhile for them not to be fully open about that fact. Under current regulations they are not allowed to secretly tell their biggest shareholders either, because that would unfairly give them benefits that smaller investors don't get.

An investor has no way to evaluate whether the money that goes into secret R&D is well spent or not, while the CEO has a much better idea of the merits of the secret projects inside the company.

I think the most dire need in this area is something I call "open engineering".

If you create a physical invention, then there is a difficult but well-known career path: become CEO, raise money, do marketing, build the invention, sell it to customers.

But what about those who want to make an intellectual invention? Yes, yes, copyright exists, so you can sell copies of (e.g.) a piece of software, but when was the last time you saw someone successfully selling a new general-purpose programming language? That almost never happens, and there's a very good reason why: learning a new programming language requires time and effort, which for most people is not worthwhile unless there is an employer out there who wants to hire you to use that language. A language costs a lot of time even if it's free! Good luck adding a monetary cost on top of that: you're saying that after you've spent all this time learning the language, you can't necessarily use it until you convince your employer to pay for it? Uh-huh...

But normally, employers don't want to use an obscure language, even a free one, and indeed, I've heard various examples of crappy proprietary languages built in-house at companies that the employees of those companies like to groan and gripe about. There's no theoretical reason that a proprietary language couldn't be good, but in practice they're not, because there is virtually no market for them, so they tend to be underfunded even in cases where they exist at all. People who learn brand new languages, at first, tend to be people who do it for fun, not for profit. Therefore if the inventor charges money for it, these early adopters probably won't be interested.

By producing an open-source language, the inventor forfeits any practical path to profitability, but it's much easier to get early adopters. More than that, it provides a path to creating a crucial network effect: employers, very slowly at first, become interested in using the language if it has major benefits, which slowly encourages developers to learn the language, and as the number of developers who know the language increases, so too does the number of employers willing to use it, in a virtuous cycle.

As evidence of this, I cite its opposite: Mathematica, a very cool, capable, and versatile programming platform, which however is proprietary, and I think this is why it has never taken off among software developers. You buy it if you really need what it offers; otherwise you ignore it and use JavaScript (all the while griping about its stupid design decisions). Mathematica has something no other language has: a huge, unified library of mathematical functionality. It's also fundamentally graphical in nature—another rare feature. Clearly this wasn't cheap to produce, so they recoup their costs by charging money, which, however, may have reduced Mathematica's popularity by orders of magnitude.

Not only that, but by going open-source the inventor gains something else that any EA like myself salivates over: the ability to improve the world in a real and significant way. If my language is free, anyone can use it, and that includes schoolchildren and developing-world residents who normally can't afford to buy software. That's a real win on the "warm fuzzies" side of things. People like me care more about improving the world than making money, so producing expensive proprietary software has no appeal except for the purpose of making money. I do need money, sure, absolutely, very much so. But I know that making a proprietary programming language or library is a slow path to riches, so instead I am building software for the oil and gas industry, which, if not for the money, I would prefer not to support! (I'm in Calgary, Canada's oil & gas hub, where that's a very common kind of job, so I took it and spend some of my paycheck on clean energy advocacy.)

Another key foundational component of software—arguably more important than programming languages—is libraries: units of code, such as algorithms, data structures, and protocols, designed to be re-used by an unlimited number of people and companies. Here too, the proprietary route is possible, but usually you have to charge very high prices to make it worthwhile, since most developers prefer to use an inferior open-source product if one is available, and you'll probably have to spend a lot on marketing for people to even discover your product.

Case in point: my company was looking for a Simplex algorithm for C# — an algorithm taught in many university courses, but not an easy one to implement correctly and robustly. There were expensive commercial products available that supported it, but we're a tiny company with no customers, so we decided to go the open-source route. Guess what, though? There are no good Simplex implementations for C#! To get Simplex in C#, I extracted the Simplex algorithm from the massive Apache Math library for Java. This was not easy, because that algorithm is embedded in a complex object-oriented framework, and the transitive closure of its dependencies was perhaps dozens of times larger than the core algorithm.
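
For a sense of what that framework style looks like, here's a minimal sketch of solving a toy linear program with Apache Commons Math 3's SimplexSolver (the problem is an invented textbook example, not our actual code):

```java
import java.util.Arrays;
import org.apache.commons.math3.optim.MaxIter;
import org.apache.commons.math3.optim.PointValuePair;
import org.apache.commons.math3.optim.linear.*;
import org.apache.commons.math3.optim.nonlinear.scalar.GoalType;

public class SimplexDemo {
    public static void main(String[] args) {
        // Maximize 3x + 5y subject to x <= 4, 2y <= 12, 3x + 2y <= 18, x, y >= 0.
        LinearObjectiveFunction objective =
            new LinearObjectiveFunction(new double[] {3, 5}, 0);

        LinearConstraintSet constraints = new LinearConstraintSet(Arrays.asList(
            new LinearConstraint(new double[] {1, 0}, Relationship.LEQ, 4),
            new LinearConstraint(new double[] {0, 2}, Relationship.LEQ, 12),
            new LinearConstraint(new double[] {3, 2}, Relationship.LEQ, 18)));

        // Every argument is its own OptimizationData object; the solver logic
        // is spread across this web of classes, which is what makes extracting
        // just the core algorithm so painful.
        PointValuePair solution = new SimplexSolver().optimize(
            new MaxIter(100), objective, constraints,
            GoalType.MAXIMIZE, new NonNegativeConstraint(true));

        System.out.println(Arrays.toString(solution.getPoint())  // [2.0, 6.0]
                + " -> " + solution.getValue());                 // 36.0
    }
}
```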

I got the job done, so now we have a "free" Simplex algorithm in C# that just required a few days of engineering work. But have we released it as open source? No, we have not. Why? Because although Simplex is useful in a wide variety of industries and could significantly benefit the world, it could potentially also benefit a competitor. I think this phenomenon happens very widely in the software industry and is certainly an impediment to progress. Now, my company has good people who support open source, so they are willing to release the code after we have a few customers. But we're not planning to support or improve the software; it'll just be a code drop. Look at the wasteful duplication here: Simplex for C# already exists, but we didn't want to pay for it, so we built our own. This is a waste of human labor that we accept because we have no better system. (Imagine if engineers spent that time creating new things instead.)

Another example from this year is a library called SyncLib that I built for a company before being laid off due to the Covid downturn. SyncLib is a unique serialization library that I have demonstrated can easily cut your serialization code size in half while making it less error-prone, supporting multiple data formats, and maybe even improving performance. It's useful in any industry and great for hobbyists too.
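
I can't show SyncLib's actual API (see below for why), but the general "synchronizer" pattern that such libraries are built on looks roughly like this; all names here are invented for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// One sync() method per field type replaces separate serialize/deserialize
// code paths; this is the rough mechanism behind the "half the code" claim.
interface Sync {
    int sync(String field, int value);
    String sync(String field, String value);
}

// Writer mode: records each field. A reader would implement the same
// interface but return values parsed from the input instead.
class MapWriter implements Sync {
    final Map<String, Object> out = new LinkedHashMap<>();
    public int sync(String field, int value) { out.put(field, value); return value; }
    public String sync(String field, String value) { out.put(field, value); return value; }
}

class Person {
    String name = "Ada";
    int age = 36;

    // The same method both reads and writes: the assignments are no-ops
    // when writing and do the real work when reading.
    void sync(Sync s) {
        name = s.sync("name", name);
        age = s.sync("age", age);
    }
}

class Demo {
    public static void main(String[] args) {
        MapWriter w = new MapWriter();
        new Person().sync(w);
        System.out.println(w.out);  // {name=Ada, age=36}
    }
}
```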

They refused to give me any of the source code of SyncLib, including even an interface file that would remind me how it worked. Why? Was it because competitors might benefit? No. It was because (under their own open-source policy) they would have someone review, publish and maintain the code, and they didn't want to spend money on that just for some forgotten ex-employee.

So in order to build an open-source version I had to do it all from scratch, again. The new version is much more ambitious, and of course, free. But how can I afford to build an open-source product? Simple: I have a deal with my employer in which they pay me 20% less money in exchange for 20% open-source time (because I get burned out if I have to do it all on weekends).

This, Jason, is what I'd like you to advocate: there should be funding for open engineering. I don't care if the pay is lousy, either. Low pay — and I mean low, as in minimum wage or thereabouts — has a huge advantage for anyone who wants to improve the world. The advantage is that whoever supplies the funds can be less picky about which applications for funding they accept. Paying $12/hour, they can fund three times as many projects as they could at $36/hr. Hopefully that means I don't have to prove to some conventionally minded grant reviewer that my radical idea is a good one before I can get any funding for it.

Support Open Engineering! And by the way, it's not just for software. No doubt there are people out there who would like to write free textbooks, help design open RISC-V microprocessors, or draw up blueprints for gadgets you can mostly print at home on your 3D printer, etc.