Too Much Efficiency Makes Everything Worse: Overfitting and the Strong Version of Goodhart's Law
Jascha Sohl-Dickstein
sohl-dickstein.github.io
Goodhart's law states that, when a measure becomes a target, it ceases to be a good measure [2]. Goodhart proposed this in the context of monetary policy, but it applies far more broadly. In the context of overfitting in machine learning, it describes how the proxy objective we optimize ceases to be a good measure of the objective we care about.
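To make this concrete, here is a minimal sketch (not from the article; the data, model family, and all parameter choices are illustrative assumptions) of overfitting as Goodhart's law in action: we optimize a proxy, the error on a small noisy training set, and watch it stop tracking the objective we care about, the error on the underlying function.

```python
# Sketch: the proxy (training error) decouples from the objective (test
# error) as we apply more optimization pressure. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def true_function(x):
    # The objective we care about is fitting this underlying function.
    return np.sin(x)

# Small, noisy training set: the proxy objective is measured here.
x_train = rng.uniform(-3, 3, size=20)
y_train = true_function(x_train) + rng.normal(scale=0.3, size=20)

# Large, noise-free grid standing in for the objective we care about.
x_test = np.linspace(-3, 3, 500)
y_test = true_function(x_test)

for degree in [1, 3, 5, 9, 13]:
    # "More optimization" here is more model capacity: higher-degree
    # polynomials drive the proxy (training MSE) toward zero.
    coeffs = np.polyfit(x_train, y_train, deg=degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

Run as-is, the training MSE falls monotonically with degree while the test MSE first falls and then rises sharply: past that point, improving the proxy actively worsens the objective, which is exactly the failure the law describes.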