Goodhart’s Law states that "any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes." However, this is not a single phenomenon. I propose that there are (at least) four different mechanisms through which proxy measures break when you optimize for them.

The four types are Regressional, Causal, Extremal, and Adversarial. In this post, I will go into detail about these four different Goodhart effects using mathematical abstractions as well as examples involving humans and/or AI. I will also talk about how you can mitigate each effect.

Throughout the post, I will use $V$ to refer to the true goal and use $U$ to refer to a proxy for that goal which was observed to correlate with $V$ and which is being optimized in some way.


Quick Reference

  • Regressional Goodhart - When selecting for a proxy measure, you select not only for the true goal, but also for the difference between the proxy and the goal.
    • Model: When $U$ is equal to $V + X$, where $X$ is some noise, a point with a large $U$ value will likely have a large $V$ value, but also a large $X$ value. Thus, when $U$ is large, you can expect $V$ to be predictably smaller than $U$.
    • Example: height is correlated with basketball ability, and does actually directly help, but the best player is only 6'3", and a random 7' person in their 20s would probably not be as good.
  • Causal Goodhart - When there is a non-causal correlation between the proxy and the goal, intervening on the proxy may fail to intervene on the goal.
    • Model: If $V$ causes $U$ (or if $V$ and $U$ are both caused by some third thing), then a correlation between $U$ and $V$ may be observed. However, when you intervene to increase $U$ through some mechanism that does not involve $V$, you will fail to also increase $V$. (See the first sketch after this list.)
    • Example: someone who wishes to be taller might observe that height is correlated with basketball skill and decide to start practicing basketball.
  • Extremal Goodhart - Worlds in which the proxy takes an extreme value may be very different from the ordinary worlds in which the correlation between the proxy and the goal was observed.
    • Model: Patterns tend to break at simple joints. One simple subset of worlds is those worlds in which $U$ is very large. Thus, a strong correlation between $U$ and $V$ observed for naturally occurring $U$ values may not transfer to worlds in which $U$ is very large. Further, since there may be relatively few naturally occurring worlds in which $U$ is very large, extremely large $U$ may coincide with small $V$ values without breaking the statistical correlation. (See the second sketch after this list.)
    • Example: the tallest person on record, Robert Wadlow, was 8'11" (2.72m). He grew to that height because of a pituitary disorder, and he would have struggled to play basketball because he "required leg braces to walk and had little feeling in his legs and feet."
  • Adversarial Goodhart - When you optimize for a proxy, you provide an incentive for adversaries to correlate their goal with your proxy, thus destroying the correlation with your goal.
    • Model: Consider an agent $A$ with some different goal $W$. Since they depend on common resources, $W$ and $V$ are naturally opposed. If you optimize $U$ as a proxy for $V$, and $A$ knows this, $A$ is incentivized to make large $U$ values coincide with large $W$ values, thus stopping them from coinciding with large $V$ values. (See the third sketch after this list.)
    • Example: aspiring NBA players might just lie about their height.
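
To make the causal model concrete, here is a minimal numpy sketch. The common cause $Z$, the noise scales, and the cutoffs are assumptions of the sketch, not part of the model itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A common cause Z drives both the goal V and the proxy U,
# producing a correlation without U having any effect on V.
Z = rng.normal(size=n)
V = Z + 0.5 * rng.normal(size=n)
U = Z + 0.5 * rng.normal(size=n)

print(f"observed corr(U, V): {np.corrcoef(U, V)[0, 1]:.2f}")

# Observationally, worlds where U happens to be large also have large V ...
print(f"mean V where U > 2:  {V[U > 2].mean():.2f}")

# ... but intervening on U through a mechanism that bypasses Z
# (do(U := U + 3)) leaves V exactly where it was.
U_intervened = U + 3.0
print(f"mean U after intervention: {U_intervened.mean():.2f}")
print(f"mean V after intervention: {V.mean():.2f}")
```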
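
A similar sketch for the extremal model. The cubic term that makes the relationship turn over is an assumed toy functional form, chosen only so that extreme-$U$ worlds are both rare and bad for $V$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

U = rng.normal(size=n)
# Toy relationship: V tracks U in the ordinary range, but the
# (assumed) -0.1 * U**3 term breaks the pattern for extreme U.
V = U - 0.1 * U**3 + 0.5 * rng.normal(size=n)

# Extreme worlds are rare, so the overall correlation still looks strong.
print(f"overall corr(U, V):     {np.corrcoef(U, V)[0, 1]:.2f}")

# In ordinary worlds, large-ish U still goes with large-ish V ...
print(f"mean V where 1 < U < 2: {V[(U > 1) & (U < 2)].mean():.2f}")

# ... but optimizing U into the extreme regime makes V negative.
print(f"mean V where U > 4:     {V[U > 4].mean():.2f}")
```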
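
Finally, a sketch of the adversarial model, using the lying-about-height example. The specific reporting strategy (every applicant reports the same maximum plausible value) is an assumption of the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

V = rng.normal(size=n)                    # true skill (your goal V)
honest_U = V + 0.5 * rng.normal(size=n)   # honest measurement

# With honest reports, selecting the top 100 on U selects for V.
top = np.argsort(honest_U)[-100:]
print(f"mean V of top 100, honest reports: {V[top].mean():.2f}")

# Adversarial applicants want to be selected (their goal W), know
# that U is the filter, and so all report the maximum plausible
# value. The report now carries no information about V, and
# "selecting on U" degenerates into a random draw (ties broken
# by tiny noise).
gamed_U = np.full(n, 7.0) + 1e-9 * rng.normal(size=n)
top = np.argsort(gamed_U)[-100:]
print(f"mean V of top 100, gamed reports:  {V[top].mean():.2f}")
```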

Regressional Goodhart

When selecting for a proxy measure, you select not only for the true goal, but also for the difference between the proxy and the goal.

Abstract Model

When $U$ is equal to $V + X$, where $X$ is some noise, a point with a large $U$ value will likely have a large $V$ value, but also a large $X$ value. Thus, when $U$ is large, you can expect $V$ to be predictably smaller than $U$.
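
As a sanity check, this can be simulated directly. A minimal numpy sketch, assuming $V$ and $X$ are independent standard normals (the sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

V = rng.normal(size=n)   # true goal
X = rng.normal(size=n)   # independent noise
U = V + X                # proxy

# Select the 1,000 points with the largest proxy values.
top = np.argsort(U)[-1000:]

print(f"mean U of selected points: {U[top].mean():.2f}")
print(f"mean V of selected points: {V[top].mean():.2f}")
# With V and X independent standard normals, E[V | U] = U / 2,
# so V among the selected points is predictably about half of U.
```

The selected points owe roughly half of their proxy score to noise, which is exactly the predictable shortfall the model describes.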

The model above treats $U$ as an estimate of $V$. A similar effect can be seen when $U$ is only meant to be correlated with $V$, by looking at percentiles. When a sample is chosen which is a typical member of the top $p$ percent of all $U$ values, it will have a lower expected $V$ value than a typical member of the top $p$ percent of all $V$ values.
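
The percentile version admits the same kind of sketch; again the standard normal distributions and the choice of $p$ are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

V = rng.normal(size=n)
U = V + rng.normal(size=n)          # proxy, correlated with V

p = 1.0                             # top percent to select
k = int(n * p / 100)

V_of_top_U = V[np.argsort(U)[-k:]]  # top p% as ranked by the proxy
V_of_top_V = np.sort(V)[-k:]        # top p% as ranked by the goal

# Compare the typical (median) member of each selection, in V terms.
print(f"typical V, top {p}% by U: {np.median(V_of_top_U):.2f}")
print(f"typical V, top {p}% by V: {np.median(V_of_top_V):.2f}")
```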