Goodhart's Law
The principle that when a measure becomes a target, it ceases to be a good measure; in AI it is most visible in benchmark gaming.
Goodhart's Law, named after British economist Charles Goodhart, states that once a metric becomes an optimization target, people (or systems) will game it in ways that undermine its original purpose. In AI, this manifests as companies optimizing for benchmark scores rather than real-world performance: testing dozens of model variants and publishing only the best results, designing benchmarks that favor their own products, and training specifically on benchmark-like data. When industry controls 96% of benchmark design, the measures reflect what companies want to show rather than what users need to know.
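The "test many variants, publish the best" tactic inflates scores through selection bias alone, even when no variant is actually better. The sketch below is a hypothetical simulation (the skill and noise values are illustrative assumptions, not data from any real benchmark): every variant has the same true accuracy, yet reporting the maximum of many noisy benchmark runs yields a published score well above that true accuracy.

```python
import random

def benchmark_score(true_skill, noise, rng):
    """One noisy benchmark run: true skill plus random measurement error."""
    return rng.gauss(true_skill, noise)

def published_score(true_skill, noise, n_variants, rng):
    """A lab tests n_variants equally skilled variants and reports only the best."""
    return max(benchmark_score(true_skill, noise, rng) for _ in range(n_variants))

if __name__ == "__main__":
    rng = random.Random(0)
    TRUE_SKILL = 70.0   # hypothetical real-world accuracy (%) of every variant
    NOISE = 3.0         # std dev of benchmark measurement error (%)

    honest = benchmark_score(TRUE_SKILL, NOISE, rng)
    gamed = published_score(TRUE_SKILL, NOISE, 30, rng)
    print(f"single honest run:   {honest:.1f}")
    print(f"best of 30 variants: {gamed:.1f}")
```

With 30 variants and this noise level, the reported maximum typically lands several points above the true 70% skill, illustrating why a benchmark stops measuring what it was meant to once it becomes the target.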
Also known as
Goodhart's law, metric gaming