Explanation
What it is
Goodhart’s Law is a principle stating that when a measure is turned into a target, it loses reliability as a true indicator of progress. In practice, metrics stop reflecting reality once people and institutions begin optimising for them. The law highlights how numbers can become decoupled from the purpose they were designed to serve.
When to use it
- When evaluating performance metrics tied to incentives.
- When you see teams or institutions celebrating numbers without corresponding improvements in outcomes.
- When diagnosing why reforms, audits, or dashboards aren’t producing lasting change.
- When reviewing whether KPIs, OKRs, or benchmarks are still aligned with mission goals.
Why it matters
Goodhart’s Law matters because it exposes the danger of mistaking measurement for meaning. By showing how targets distort behaviour, it helps leaders and teams avoid the trap of “looking good on paper” while systemic issues remain unresolved. Applying the principle ensures metrics drive genuine outcomes rather than surface-level theatre, improving alignment, quality, and trust.
Reference
Definitions
Goodhart’s Law
The principle that when a measure becomes a target, it ceases to be a reliable indicator.
Metric
A quantitative measure used to track performance or progress.
Target
A benchmark or goal tied to incentives or accountability frameworks.
Proxy Measure
An indirect indicator used in place of a harder-to-measure outcome.
Campbell's Law
Related principle stating that the more a quantitative indicator is used for decision-making, the more it corrupts the process it monitors.
Measurement Theatre
Optimising for visible metrics while leaving underlying dysfunction unresolved.
Canonical Sources
- Charles Goodhart, Problems of Monetary Management: The U.K. Experience, 1975 — Papers in Monetary Economics, Reserve Bank of Australia (archival references)
- K. Alec Chrystal & Paul D. Mizen, Goodhart’s Law: Its Origins, Meaning and Implications for Monetary Policy, 1999 — ResearchGate
- Michael F. Stumborg et al., Recognizing and Mitigating the Manipulation of Measures in Analysis, CNA, 2022 — CNA PDF
- Michael Fire & Carlos Guestrin, Over-Optimization of Academic Publishing Metrics: Observing Goodhart’s Law in Action, 2018 — arXiv
- Adrien Majka & El-Mahdi El-Mhamdi, The Strong, Weak and Benign Goodhart’s Law, 2025 — arXiv
- Oliver Braganza, Proxyeconomics: The Inevitable Corruption of Proxy-Based Competition, 2018 — arXiv
- Hilde Weerts, Lambèr Royakkers, Mykola Pechenizkiy, Are There Exceptions to Goodhart’s Law?, 2022 — arXiv
Internal Sources
- Idiocracy Index – Diagnostic Framework (not yet published)
- Broken Journeys, Broken Trust: UX Strategy’s Blind Spot
- Upstream vs Downstream: Diagnosing Systemic Dysfunction
Notes & Caveats
- Scope: Goodhart’s Law is often applied beyond its economic origins into education, healthcare, AI, and management.
- Misread: The law is sometimes overstated as “all metrics are useless” — in reality, it warns of distortion under pressure, not futility.
- Controversy: Related formulations (e.g. Campbell’s Law) blur conceptual boundaries; academic debate continues over distinctions.
- Versioning: Recent work distinguishes “Strong,” “Weak,” and “Benign” Goodhart effects, especially relevant in machine learning contexts.
How-To
Objective
Ensure metrics remain aligned with institutional purpose rather than collapsing into targets that distort behaviour.
This guide helps teams and leaders detect when measures have become self-serving and restore their connection to real outcomes.
Steps
- Map each metric to its North Star: tie every measure back to the institution’s stated mission.
- Audit behaviours around the metric: within a set review cycle (e.g. quarterly), check whether actions serve the number or the purpose.
- Cross-check outcomes with stakeholders: validate results with users, customers, or frontline teams to see if improvement is real.
- Retire or refresh distorted metrics: once evidence of gaming or drift emerges, phase out or replace the measure.
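For teams that manage their measures in code or configuration, the steps above can be sketched as a minimal metric registry. All names, dates, and cadences here are illustrative assumptions, not part of the original guidance.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Metric:
    name: str
    north_star: str          # Step 1: the mission-level purpose this metric serves
    review_cycle_days: int   # Step 2: audit cadence, e.g. quarterly = 90
    last_reviewed: date
    sunset: date             # Tip: no metric lives forever without re-approval

def audit_due(metric: Metric, today: date) -> bool:
    """Step 2: flag metrics whose review cycle has lapsed."""
    return (today - metric.last_reviewed).days >= metric.review_cycle_days

def needs_retirement(metric: Metric, today: date) -> bool:
    """Step 4: a metric past its sunset date must be re-approved or retired."""
    return today >= metric.sunset

# Hypothetical example entry
wait_time = Metric(
    name="ed_four_hour_compliance",
    north_star="safe, timely emergency care",
    review_cycle_days=90,
    last_reviewed=date(2024, 1, 1),
    sunset=date(2025, 1, 1),
)
print(audit_due(wait_time, date(2024, 6, 1)))        # True: quarterly review lapsed
print(needs_retirement(wait_time, date(2024, 6, 1)))  # False: sunset not yet reached
```

The point of the `north_star` field is that the purpose-link travels with the metric itself, so a reviewer never audits a number without seeing what it is supposed to serve.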
Tips
- Use a basket of indicators rather than over-relying on one metric.
- Make the purpose of each measure explicit and visible.
- Pair quantitative measures with qualitative feedback for balance.
- Introduce sunset clauses: no metric should live forever without re-approval.
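The “basket of indicators” tip can be made concrete: rather than reading one number in isolation, compare movements across the basket and flag the characteristic Goodhart signature, where the target metric improves while its paired quality indicators stagnate or decline. The threshold and deltas below are invented for illustration.

```python
def goodhart_drift(target_delta: float, quality_deltas: list[float],
                   tolerance: float = 0.0) -> bool:
    """Flag likely gaming: the target metric improves while every
    paired quality indicator in the basket stagnates or worsens."""
    target_improving = target_delta > tolerance
    quality_flat_or_worse = all(d <= tolerance for d in quality_deltas)
    return target_improving and quality_flat_or_worse

# Hypothetical quarter-on-quarter changes (positive = improvement)
compliance_delta = 0.12          # headline target up 12 points
quality = [-0.03, 0.0]           # readmissions worse, safety incidents flat
print(goodhart_drift(compliance_delta, quality))  # True: investigate
```

A drift flag is a prompt for a qualitative review, not a verdict; divergence can also reflect lag or noise, which is why the tips pair quantitative measures with qualitative feedback.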
- Frame reviews as opportunities to learn, not witch hunts.
Pitfalls
Treating dashboards as reality
Always triangulate metrics with user or frontline experience.
Assuming objectivity of numbers
Remember every metric encodes subjective choices.
Adding more KPIs as a fix
Focus on fewer, purpose-anchored measures instead of KPI overload.
Ignoring unintended consequences
Run “pre-mortems” to explore how metrics might be gamed before rollout.
Acceptance criteria
- Evidence that metrics track real outcomes, not just compliance with targets.
- Dashboard or reporting artefacts updated to include purpose-link notes.
- Stakeholders across teams confirm alignment between measures and mission.
- Clear process exists for retiring or refreshing metrics that drift.
Tutorial
Scenario
A hospital management team is under pressure to reduce waiting times in its emergency department. Government oversight ties funding to a four-hour target for patients to be seen and then admitted, transferred, or discharged. Administrators must balance this metric with the real objective of providing safe, effective care.
Walkthrough
Decision Point
The leadership team decides whether to prioritise “hitting the four-hour target” or addressing underlying process inefficiencies.
Input/Output
Staff begin moving patients into short-stay wards earlier to clear the queue.
Input: waiting-time data.
Output: apparent compliance on dashboards.
Action
A new reporting sheet is logged into the hospital’s performance portal, showing compliance with the four-hour standard.
Error handling
Clinicians flag that some patients are being transferred prematurely, creating risks. Resolution: governance board reviews discrepancy between patient outcomes and reported compliance.
Closure
The board establishes a parallel metric tracking readmission rates and patient safety incidents, anchoring both back to the hospital’s mission statement.
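The board’s closure decision, pairing the time target with safety metrics so neither can be reported in isolation, could be sketched as a simple assessment rule. The thresholds and field names are invented for illustration, not drawn from any real target regime.

```python
def period_assessment(four_hour_rate: float, readmission_rate: float,
                      safety_incidents: int,
                      time_target: float = 0.95,
                      readmission_ceiling: float = 0.10,
                      incident_ceiling: int = 5) -> str:
    """A reporting period only counts as 'on mission' when the speed
    target AND the parallel quality metrics pass together."""
    time_ok = four_hour_rate >= time_target
    quality_ok = (readmission_rate <= readmission_ceiling
                  and safety_incidents <= incident_ceiling)
    if time_ok and quality_ok:
        return "on mission"
    if time_ok:
        return "target hit, quality breach: investigate gaming"
    return "target missed"

# Compliant on paper, but readmissions breach the ceiling
print(period_assessment(0.96, 0.14, 3))
# target hit, quality breach: investigate gaming
```

Coupling the metrics in one rule means the dashboard can no longer show green on speed while safety quietly degrades.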
Result
- Before → After delta:
  - Before: waiting times reported as compliant, but at the cost of patient safety.
  - After: broader metrics adopted, balancing time targets with quality and trust.
- Artefact snapshot:
  New *Performance Dashboard v2.0* stored in the hospital intranet reporting hub.
Variations
- If government regulations tighten, add parallel indicators that measure quality as well as speed.
- If team size is smaller, prioritise qualitative spot-checks with patient interviews instead of complex dashboards.
- If tooling differs (e.g., no integrated portal), use simple shared spreadsheets annotated with outcome notes to maintain transparency.