Crossposted from the AI Alignment Forum. May contain more technical jargon than usual.

The Semiconductor Supply Chain: Assessing National Competitiveness (Saif M. Khan, Alexander Mann, Dahlia Peterson)


Summary

This report analyzes the current supply chain for semiconductors. It focuses in particular on which portions of the supply chain are controlled by the US and its allies and which by China. Some key insights:

  • The US semiconductor industry is estimated to contribute 39 percent of the total value of the global semiconductor supply chain.
  • The semiconductor supply chain is incredibly complicated. The production of a single chip requires more than 1,000 steps and passes through borders more than 70 times throughout production.
  • AMD is currently the only company with expertise in designing both high-end GPUs and high-end CPUs.
  • TSMC controls 54% of the logic foundry market, with a larger share for leading edge production, e.g., state-of-the-art 5 nm node chips.
  • Revenue per wafer for TSMC is rapidly increasing, while other foundries are seeing declines.
  • The Netherlands has a monopoly on extreme ultraviolet (EUV) scanners, equipment needed to make the most advanced chips.
  • The Netherlands and Japan have a monopoly on argon fluoride (ArF) immersion scanners, needed to make the second most advanced chips.
  • The US has a monopoly on full-spectrum electronic design automation (EDA) software needed to design semiconductors.
  • Japan, Taiwan, Germany and South Korea manufacture the state-of-the-art 300 mm wafers used for 99.7 percent of the world's chip manufacturing. This manufacturing process requires large amounts of tacit know-how.
  • China controls the largest share of manufacturing for most natural materials. The US and its allies have a sizable share in all materials except for low-grade gallium, tungsten and magnesium.
  • China controls ~2/3rds of the world's silicon production, but the US and allies have reserves.

The report also analyzes US competitiveness at very detailed levels of the supply chain, which I didn't read that carefully. Tables:

[three tables from the report]

Opinion

One perspective on the economy is that it's running a vast, distributed computation to allocate supply to demand in a relatively efficient manner. Examining the details of one of the supply chains underpinning a half-trillion-dollar industry is relatively awe-inspiring. The only thing I'm currently aware of that is as complicated as computer hardware is computer software, which is sort of cheating. As AI becomes more advanced, control of semiconductor production becomes a strategic resource. However, there are monopolies or near-monopolies at every point in the chain, and each of them has a relatively large amount of bargaining power under many reasonable models. This situation puts the world in an interesting place. One concrete thing that I didn't consider before reading this report is the relevance of design software to semiconductor manufacturing. In retrospect, it seems pretty clear that the design of complicated things, e.g., video games, buildings, semiconductors, and animations, requires complicated software, with companies dedicated to building it. Lacking this software could constitute a meaningful bottleneck to producing complicated artifacts. The asymmetry between manufacturing software and hardware is that software is easier to acquire through illegal means, whereas an EUV scanner has “100,000 parts, 3,000 cables, 40,000 bolts and 2 kilometers of hosing,” making it prohibitively difficult to steal.

Intelligence Explosion Microeconomics (Eliezer Yudkowsky)


Summary

Takeaways I found interesting:

  • Evolutionary history strongly suggests that there are linear or superlinear returns to increased cognitive investment. Various lines of evidence, e.g. the brain size of humans relative to chimps, look roughly linear with respect to time. The last common ancestor of humans and chimps lived only about 10 million years ago, and 400,000 generations isn't enough time for mutations with small fitness increases to reach fixation, so the timeline is inconsistent with rapidly diminishing cognitive returns.
  • If intelligence is a product of software capabilities and hardware capabilities, a series of recursive software improvements (in terms of percentage gains) can converge to a finite amount or go to infinity depending on the amount of hardware present (a toy numeric sketch follows this list). In economics, the Jevons paradox is the observation that increasing the efficiency of coal use can increase the demand for coal. Similarly, algorithmic improvements in AI might increase how profitable AI is, resulting in increased spending on computers.
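A toy illustration of the convergence-versus-divergence point (my own construction, not a model from the paper): treat each round of self-improvement as yielding some fraction of the previous round's gain, with that fraction scaled by how much hardware is available.

```python
# Toy model: each round of self-improvement yields a fraction f of the
# previous round's gain, where f scales with available hardware. If f < 1
# the cumulative gains converge geometrically; if f >= 1 they grow without
# bound. "base_return" and the hardware multipliers are made-up numbers.

def total_gain(hardware_multiplier, base_return=0.6, rounds=1000):
    f = base_return * hardware_multiplier  # return on each reinvested gain
    gain, total = 1.0, 0.0
    for _ in range(rounds):
        total += gain
        gain *= f
        if gain > 1e12:  # treat runaway growth as divergence
            return float("inf")
    return total

for h in [0.5, 1.0, 1.5, 2.0]:
    print(h, total_gain(h))
# f < 1 gives a finite total (~1/(1 - f)); f >= 1 never converges, which is
# the "more hardware tips recursive improvement into a FOOM" regime.
```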

Opinion

I knew most of the stuff in this paper already, but the presentation was clean and the arguments were well laid-out. I wish I had read it instead of the AI-FOOM debate, although the latter was likely more entertaining.

Artificial Intelligence and Economic Growth (Philippe Aghion, Benjamin F. Jones & Charles I. Jones)


Summary

The authors speculate on how economic growth might be affected by artificial intelligence by considering two main themes. The first theme is modeling artificial intelligence as a continuation of the historical process that has automated various parts of the economy for the last 200 years; this allows historical data to inform future speculation. The second theme is to constrain growth by considering Baumol's "cost disease": the observation that sectors that experience productivity growth see their share of GDP decline, while sectors with slow growth see their share of GDP increase. To quote the authors, "as a consequence, economic growth may be constrained not by what we do well but rather by what is essential and yet hard to improve."

The authors model an economy with two inputs, labor and capital, in which work is divided into various tasks. A task can be automated, in which case it takes a unit of capital to complete, or unautomated, in which case it takes a unit of labor. In this model, as the fraction of tasks that are automated increases, the capital share, i.e. the portion of GDP going to people who invest money rather than time, goes up. However, as the capital share rises, capital accumulates, which produces a surplus of automated goods (since they can be produced with capital alone). This surplus causes the price of automated goods to fall relative to nonautomated goods, producing "cost disease" type effects: as more sectors are automated, the capital share rises, but because those sectors grow faster, their prices decline. Implicit in this model is a low elasticity of substitution between automated and nonautomated goods, i.e. as more goods become automated, it takes ever more automated goods to substitute for the remaining nonautomated ones, so much so that even with infinitely many automated goods, some demand for nonautomated goods remains. This process produces overall balanced growth, because the nonautomated goods constrain the overall speed of the economy. Other models are then presented and analyzed, with the general trend being that the elasticity of substitution is an important parameter.
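To make the mechanism concrete, here is a minimal numeric sketch of a Zeira-style task model with an elasticity of substitution below one. The functional form is the standard CES-over-tasks setup, but the parameter values are my own toy choices, not the paper's calibration.

```python
# Output is a CES aggregate over a continuum of tasks. A fraction beta of
# tasks is automated (produced with capital K, spread evenly); the rest use
# labor L. rho < 0 means elasticity of substitution sigma = 1/(1-rho) < 1.
rho = -1.0   # sigma = 0.5 (toy value)
L = 1.0      # labor supply

def economy(beta, K):
    """Return (output, capital share) for automation fraction beta and capital K."""
    # Y = (beta^(1-rho) K^rho + (1-beta)^(1-rho) L^rho)^(1/rho)
    auto = beta ** (1 - rho) * K ** rho
    manual = (1 - beta) ** (1 - rho) * L ** rho
    Y = (auto + manual) ** (1 / rho)
    capital_share = auto / (auto + manual)
    return Y, capital_share

for beta in [0.2, 0.5, 0.8, 0.95]:
    for K in [1.0, 10.0, 100.0]:
        Y, share = economy(beta, K)
        print(f"beta={beta:.2f}  K={K:6.1f}  Y={Y:7.2f}  capital share={share:.2f}")
```

With sigma < 1, the sketch reproduces both effects described above: for a given capital stock, automating more tasks raises the capital share, but for a given automation fraction, accumulating capital runs into an output ceiling set by the non-automated tasks, and the capital share falls again as automated goods become abundant and cheap.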

Opinion

I'm not really sure what to think of predicting AI growth with economic tools. I think they're useful because they allow you to answer questions like, "What happens if we model AI as a continuation of historical automation?" However, I am unsure if this is the correct question to be answering. I also enjoyed the discussion of how cost disease type effects can prevent extremely explosive growth even if one good cannot be automated. I'm pretty skeptical that there will exist such a good. I don't have data to back this up, but I have a vague sense that historically, when people have claimed that X cannot be automated away for fundamental reasons, they've been mostly wrong. The most plausible story to me is that regulation won't be fast enough to keep up with the pace of automation, and instead of getting, e.g., automated doctors, we get AI systems that do all the work for the doctor but the doctor has to be the one that gives the treatment. In the short term, these regulatory hurdles might be enough to produce balanced growth for a period before growth becomes explosive. As a counterpoint, historical trends suggest rapid adoption. Brookings:

[Brookings figure on technology adoption]

Are Ideas Getting Harder to Find? (Nicholas Bloom, Charles I. Jones, John Van Reenen, and Michael Webb)


Summary

The authors model technological progress as a product of the number of researchers and productivity per researcher. They typically define technological progress such that constant progress leads to exponential growth in the economy, with the logic being that ideas can spread throughout the entire economy without cost. For example, an idea that causes a percentage decrease in the size of a transistor will result in a percentage increase in the number of transistors per some unit area. In areas such as medicine, the authors measure technological progress as a linear change in life expectancy.

The authors measure the number of researchers by taking aggregate R&D spending and deflating it by the average researcher wage. In addition to controlling for the effect of rising researcher wages, dividing by the average wage allows relatively fair comparisons between different types of research spending, under the rough assumption that labs optimally allocate money between high- and low-skilled researchers and lab equipment. They also consider other measures of research effort, such as the number of clinical trials or the number of published papers in an area. In general, research effort is increasing exponentially. In semiconductors, research effort has risen by a factor of 18 since 1971. Between 1969 and 2009, research effort on corn and soybean seed efficiency each rose by more than a factor of 23.
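A minimal sketch of what this deflation does, with made-up spending and wage figures chosen only so that the ratio roughly matches the ~18x increase quoted above:

```python
# Deflate nominal R&D spending by the nominal wage of high-skilled workers,
# so that spending growth that merely tracks wage growth does not count as
# more research input. All numbers below are hypothetical.

rd_spending = {1971: 1.0e9, 2014: 100.0e9}        # nominal $ per year
researcher_wage = {1971: 20_000, 2014: 110_000}   # nominal $ per researcher-year

effective_researchers = {
    year: rd_spending[year] / researcher_wage[year] for year in rd_spending
}
ratio = effective_researchers[2014] / effective_researchers[1971]
print(effective_researchers)
print(f"research effort up ~{ratio:.0f}x")   # ~18x with these toy numbers
```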

The authors measure technological progress across several domains by looking at transistor density, crop yield per unit area, and life expectancy. The first two exhibit steady exponential growth (constant percentage increases), while life expectancy historically exhibits linear growth. Estimating the rate of technological progress and the effective number of researchers lets the authors calculate research productivity. The results are quite clear: it's declining. For a representative example, the number of transistors per unit area has been increasing at a roughly constant 35% per year for the past 50 years, but research effort has risen by a factor of 18, which means it's currently 18x harder to maintain the same rate of growth as it was in 1971. This pattern appears to varying degrees in other fields. Table 7 summarizes:

[Table 7: research productivity declines across fields]

The authors also estimate a parameter β that measures how much harder each successive doubling of the technology is: each doubling requires β doublings of the research input. For the aggregate economy they estimate β ≈ 3, so each successive doubling of the economy requires roughly three further doublings of aggregate research input (in this case R&D spending). Semiconductors, by contrast, have β ≈ 0.2, so each successive doubling of transistor density requires only 0.2 of a doubling of inputs. The reason semiconductor research has nonetheless seen such a large productivity decrease is that transistor density has undergone so many doublings.
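A rough back-of-the-envelope check of the semiconductor number, using only the figures quoted earlier in this summary (35% yearly growth for ~50 years and an ~18x rise in research effort):

```python
import math

# beta = doublings of research input needed per doubling of the technology.
growth_rate = 0.35     # transistor density grows ~35% per year
years = 50
input_increase = 18    # research effort rose ~18x since 1971

doublings_of_density = years * math.log(1 + growth_rate) / math.log(2)  # ~21.6
doublings_of_input = math.log(input_increase) / math.log(2)             # ~4.2

beta = doublings_of_input / doublings_of_density
print(f"doublings of transistor density: {doublings_of_density:.1f}")
print(f"doublings of research input:     {doublings_of_input:.1f}")
print(f"beta ~ {beta:.2f}")   # ~0.2, matching the figure quoted above
```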

Opinion

Yep, it sure seems like research productivity is declining. Why? I recently read Intelligence Explosion Microeconomics, so I'm tempted to say something like "Research is a product of serial causal depth of thinking, you find the easiest innovations first, and the later ones are going to be exponentially harder." I don't quite think this is true and I'm not sure what it would imply if it were, but I suspect that low-hanging fruit being picked is probably the main reason. I've heard "duplicate work" mentioned as another possible reason, but I don't find it as compelling, because good ideas getting duplicated doesn't seem to happen often enough to explain such large decreases in productivity. I enjoyed reading about how the authors thought about measuring the various inputs and outputs of their equations, e.g. how they measure "total research" and "technological progress." I wouldn't be surprised if they got something very wrong, but their results have been applied to many areas, so I would be surprised if the qualitative trends were wrong. I'm sort of surprised that growth theory is said to assume constant research productivity. The basic observation that R&D spending has increased while economic growth has remained roughly the same seems to imply the obvious conclusion that productivity is declining. The counterargument surveyed by the paper, that productivity within individual domains remains constant while only aggregate productivity declines, also seems clearly false if you think about almost any industry. My suspicion is that the falseness of this assumption has been obvious for a while, but people thought it might be a reasonable approximation, or no one had bothered to collect the data to falsify it until these authors did. There's some argument that since economics research (and other theoretical work) has no clear way to make money, it won't be as efficient as semiconductor research, so its fruit hangs lower, this paper being one such example. I suspect this observation is relatively sophomoric compared to analyses of "research economies" that have definitely been done by some people.

Statistical Basis for Predicting Technological Progress (Béla Nagy, J. Doyne Farmer, Quan M. Bui, Jessika E. Trancik)


Summary

How will technological development happen in the future? The authors compile a dataset of production versus unit cost for 62 technologies and hindcast a variety of prediction rules to determine which functional form best relates production and unit cost. They find that Wright's law, which predicts that unit cost falls as a power of cumulative production (a constant percentage decline for each doubling of cumulative production), is most predictive, with Moore's law slightly behind. Another way to interpret Wright's law is as the prediction that unit cost will decrease exponentially and cumulative production will increase exponentially, with the ratio between the two exponents being a constant. Moore's law, on the other hand, only predicts that unit cost will decrease exponentially, without positing a relation between production and cost. Wright's law is often interpreted as implying "learning by doing," the thought being that most technological advances (decreases in unit cost) come from attempting to scale up manufacturing. The authors conclude by commenting that their dataset doesn't contain enough technologies that don't grow exponentially to strongly distinguish between Moore's law and Wright's law.
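In symbols (my notation, not the paper's), the relationship between the two laws comes from composing Wright's law with exponential growth in cumulative production:

```latex
% Wright's law composed with exponential growth in cumulative production x(t):
c(x) = c_0 \, x^{-w}, \qquad x(t) = x_0 e^{g t}
\;\;\Longrightarrow\;\;
c(t) = c_0 \, x_0^{-w} \, e^{-w g t}
```

which is exactly Moore's law (exponential cost decline) with rate m = w·g. This is also why the two laws are hard to tell apart on technologies whose production grows exponentially, as the authors note.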

Opinion

From the perspective of thinking about broad-strokes technological progress, Wright's law seems like a strict upgrade to Moore's law. However, Wright's law still leaves unanswered the question "Why is cumulative production increasing exponentially?" Also, given that production is increasing extremely rapidly for most technologies, total cumulative production amounts to only a few years or decades of production at current levels. For example, if production grows by 50% a year and we're at 150 units per year right now, the past few years will have been 100 units, 67 units, 45 units, etc. Together with this year's 150 units, this is a convergent geometric series summing to 450 units, i.e. 3 years of current production (note that 3 = 1/(1 - 1/1.5)). Thus, even if current production stagnates, we should expect cumulative production to keep growing quickly relative to its current total. If Wright's law holds, this will correspond to a large decrease in unit cost.
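A quick sketch of the arithmetic in the paragraph above, plus what it implies under Wright's law with an illustrative (made-up) exponent:

```python
# Production grows g = 50%/yr and current output is 150 units/yr, so
# cumulative production to date is a geometric series:
#   150 + 150/1.5 + 150/1.5^2 + ... = 150 * (1 + g) / g
g = 0.5
current = 150.0
cumulative = current * (1 + g) / g
print(cumulative, cumulative / current)   # 450.0 units, 3.0 years of current output

# Even if annual production froze at 150/yr, cumulative production would
# still triple (450 -> 1350) in 2 * 450 / 150 = 6 years. Under Wright's law,
# cost ~ (cumulative production)^(-alpha), so unit cost keeps falling:
alpha = 0.5   # illustrative exponent, not estimated from the paper's data
print(3.0 ** (-alpha))   # cost falls to ~58% of today's after cumulative triples
```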

Experience curves, large populations, and the long run (Carl Shulman)


Summary

An experience curve predicts that unit cost will fall a constant percentage every time cumulative production doubles. Experience curves have been empirically good at predicting technological progress across a variety of domains. For information technologies, experience curves predict an approximate halving of unit cost for each doubling of cumulative production. There are enough resources on Earth to sustain many doublings of cumulative production. For example, solar panel production could be scaled up 1000x, which is about 10 doublings; according to Nagy et al., this would result in roughly a 10x decrease in unit costs. For information technologies, whose production has been rising exponentially, a thousand years at current production levels would result in about a 100x increase in cumulative production, which implies about a 100x decrease in unit costs. Furthermore, the Earth intercepts only about a billionth of the Sun's output. [Fact check: the Earth is about 150 billion meters from the Sun and has a radius of about 6.4 million meters. A sphere of radius 150 billion m has an area of roughly 3 × 10^23 m², while the Earth's cross-section is roughly 1.3 × 10^14 m², a ratio of about 2 × 10^9, so "about a billionth" is the right order of magnitude.] This theoretically gives another 9 orders of magnitude (OOM) of growth in cumulative production across many technologies. Space also exists and has many resources, giving many more OOMs. All this suggests that if humanity does eventually use these resources, Wright's law implies we will run into the physical limits of technological progress.
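A quick check of the doubling arithmetic above, treating the learning rates as the rough round numbers quoted in this summary:

```python
import math

# Experience curve: unit cost falls by a fixed fraction ("learning rate")
# with each doubling of cumulative production.
def cost_after(doublings, learning_rate):
    return (1 - learning_rate) ** doublings

# Solar: ~1000x scale-up = log2(1000) ~ 10 doublings at roughly 20% per doubling
print(1 / cost_after(math.log2(1000), 0.20))   # ~9-10x cheaper

# Information technology: ~halving per doubling; 100x more cumulative
# production = log2(100) ~ 6.6 doublings
print(1 / cost_after(math.log2(100), 0.50))    # ~100x cheaper
```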

Opinion

It's often useful to answer questions like "if we just take the existing trends and extrapolate them, what does that predict?" For example, Roodman uses historical GWP data to predict that humanity reaches a singularity in 2047. The point of the exercise is not to find out when humanity will reach a singularity, but rather to determine how strong one's arguments have to be in either direction. For example, people who claim that AGI will cause a singularity around 2050 are making a prediction in line with historical super-exponential economic growth, while those who claim this won't happen are thus claiming that Roodman's extrapolation is overoptimistic. Similarly, using simple models relating technological progress to cumulative production predicts that we'll hit technological limits before we exhaust the easily available resources. This is interesting because people who claim we won't then have to dispute either the continued increase in cumulative production or the relation between cumulative production and unit cost. I'm tempted to say something like "burden of proof", but I don't think that's really how arguments should work. The point is that doing such analyses lets discussion drop more easily into object-level disputes about whether particular models are accurate or what the values of particular parameters are, which I think are often more productive.

Comments

Curated! I think these intermittent distillations are an excellent format. This single post (and others in the series) let me sample a much wider variety of ideas and sources than I could read on my own. It's further valuable that you provided your own opinion on each topic.

I also enjoyed the discussion of how cost disease type effects can prevent extremely explosive growth even if one good cannot be automated. I'm pretty skeptical that there will exist such a good. I don't have data to back this up, but I have a vague sense that historically when people have claimed that X cannot be automated away for fundamental reasons, they've been mostly wrong.

(That should be "if even one good", right?)

What do you think of the idea that "consumption by a human" could be considered a task? People may value products / services being consumed by humans because it confers status (e.g. having your artwork be well-received), or because they want humans to have good experiences (aka altruism), or for other reasons.

As long as anyone has a reason to value human consumption of goods / services, it seems like that could play the role of task-that-can't-be-automated-away.

Yeah that seems like a reasonable example of a good that can't be automated.

I think I'm mostly interested in whether these sorts of goods that seem difficult to automate will be a pragmatic constraint on economic growth. It seems clear that they'll eventually be ultimate binding constraints as long as we don't get massive population growth, but it's a separate question about whether or not they'll start being constraints early enough to prevent rapid AI-driven economic growth.

From the perspective of thinking about broad strokes technological progress, Wright's law seems like a strict upgrade to Moore's law. However, Wright's law still leaves unanswered the question "Why is cumulative production increasing exponentially?"

Jevons paradox isn't quite the answer. (It only says that 'this thing happens sometimes'.)

1. Technology is a broad category. We could also use it to characterize tech which is used 'everywhere'. Like electricity, plumbing, or air conditioning. If there's demand for something everywhere, that partially explains why production would keep growing. Another part would be that we (keep) build(ing) on older (universal) tech - from phones to the internet, from electric lightbulbs to electric everything, from just air conditioning so we can stand the heat to cooling computers and food as well.

2. Population growth? When people have more money to spend, they spend money on...more (types of) things?

Keep doing these, I find them quite valuable!

habryka:

I also found these very valuable! I wonder whether a better title might help more people see how great these are, but not sure.

Mark Xu:

Thanks! I will try, although they will likely stay very intermittent.

The basic observation that R&D spending has increased while economic growth has remained roughly the same seems to imply the obvious conclusion that productivity is declining.

That does seem like the most likely conclusion, but another obvious interpretation of "X doesn't change when we increase Y" would be "maybe Y doesn't actually affect X in the first place".  For instance, maybe funding above some threshold is all eaten by parasites, or maybe the bottleneck on growth speed is something other than formal R&D (e.g. good ideas might strike randomly regardless of whether you're a researcher or not, and the official "researchers" just "harvest" the insights that are already "in the air" thanks to the general population).

(Most of my probability mass is still on "productivity is declining.")

I think there are two areas - qualitative improvements, and regulatory hurdles - that are interesting to think about in the context of this (excellent!) summary.  These are some fairly loose thoughts without any real conclusions.

First are qualitative improvements.  These may be difficult to define - is the smartphone a truly qualitative improvement?  Some cyborg theory says yes (provides audiovisual transfer, eases access to vast repositories of information, etc), but smartphones are certainly far less useful than the internet.

In the near future, I'm very optimistic about recent protein folding simulation advances (Google's DeepMind) significantly speeding up drug research.  A good historical case might be the development of well defined physical measurements (metric system, but better imperial definitions too) eventually allowing higher precision machining, which allowed better measuring tools, and then a virtuous cycle of better measurement and higher precision tools until we reached full interchangeable parts (essentially required for a modern supply chain and mass-production).

This is indicative of a general trend.  A long period of ever-more difficult quantitative improvements eventually allowing a serious qualitative leap.  

This is particularly applicable to research - some sort of proof of concept, followed by further development into something useful, qualification of that useful application, actual use of the technology, and then further development of the technology over time.  As other competing technologies are more capable, it takes longer for a technology to become competitive.  But the point where a company can make money on a technology only really happens in those last two stages.  So the more developed and refined competitor technologies are, the longer a technology takes to go out of the academic lab and into the mainstream.

Second, the effects of restrictions, especially in physical sciences/engineering.  These essentially serve as algae on the bottom of the ship of progress, slowing things down.  There are two major places for this - government regulations, and industry standards.  The latter doesn't get enough attention.  

NACE MR0175, for example (a standard dealing with materials for use in oil environments where corrosive H2S is present), ended up having a serious chilling effect on new alloy research in the US from 1975 until it was finally significantly revised.  A poorly thought-out qualification rule meant that new materials could be forced to undergo a much longer, more expensive process essentially at the will of any committee member.  Committee members included representatives from major materials manufacturers, none of whom were excited at the idea of their competitors having an easy time.  As a result, academic research into new alloy systems for this application decreased significantly.

I remember a few years ago reading an article about a bridge near Harvard taking an outrageous amount of time to renovate.  The article authors pointed out that the Romans managed to build longer bridges faster.  The problems weren't technological, but administrative/political.

There's currently a serious issue with standards proliferation.  Not only are standards getting a lot longer, there are a lot more of them.  And they're poorly unified - even if you meet US, European, Chinese, etc standards, simply producing the paperwork and statements can be a serious administrative burden (not to mention the possible extra costs getting similar tests done by different companies due to different accreditation requirements).  For poorly funded individuals, the thousands of dollars simply to acquire the necessary standards may be a serious burden.  Further, standards often have the force of law.  And some of those laws are poorly written, calling out outdated standards (or conversely not taking into account the possible legacy issues when a new standard is issued).

There are also complex interactions between the necessary aspects of standards and the possible advantages they give.  If a company is sending senior engineers to participate in standards activities, there is an incentive to create standards which are advantageous for that company (or at least more difficult for other companies to adhere to).  This is quite apparent in some regional/international standards differences that provide local companies competitive advantages.

But ultimately, these impediments seriously increase the cost and decrease the speed of new developments.  And these impediments are growing.  I found a post from 1999 talking about how bad they were, and that's a tiny fraction of how they are now.  So I wonder how much slow-down comes from this sort of thing?