
Goodhart's Law

Goodhart's Law states that when a measure becomes a target, it ceases to be a good measure. Optimizing for a specific metric can distort the very system the metric is meant to monitor, invalidating it as an indicator of the original goal.
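As a toy illustration, assume a hypothetical setting in which response length is used as a proxy for helpfulness (both the proxy and the helpfulness curve below are invented for this sketch, not taken from the course). The Python snippet shows how optimizing the proxy alone drifts away from the goal it was meant to track:

```python
import numpy as np

# Purely illustrative sketch: the length-as-proxy setup and the
# helpfulness curve are hypothetical, chosen only to show the effect.
# We care about "helpfulness" but can only measure a proxy: length.

def true_helpfulness(length):
    # Peaks at a moderate length, then declines as padding hurts quality.
    return length * np.exp(-length / 300.0)

def proxy_metric(length):
    # The measured target: longer responses always score higher.
    return length

lengths = np.arange(50, 2001, 50)

# An optimizer that only sees the proxy picks the longest response,
# even though true helpfulness peaks much earlier.
best_by_proxy = lengths[np.argmax(proxy_metric(lengths))]
best_by_truth = lengths[np.argmax(true_helpfulness(lengths))]

print(f"Length chosen by optimizing the proxy: {best_by_proxy}")   # 2000
print(f"Length maximizing true helpfulness:    {best_by_truth}")   # 300
```

In this sketch the proxy keeps improving as responses grow longer, while the quantity we actually care about peaks and then degrades: the measure stops being informative precisely because it was made the target.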


Updated 2025-10-08


Tags: Ch.4 Alignment - Foundations of Large Language Models, Foundations of Large Language Models, Foundations of Large Language Models Course, Computing Sciences