Calibrate Your Confidence

Intuition is not about being right. It’s about knowing when you are.

Knowledge is not binary. An answer can be correct or incorrect, but what matters more is how sure you are of that answer, and whether you update that belief over time.

The world is changing at a quietly astonishing rate. Mental models update slowly - we carry assumptions from the last year, or the last decade. The data changes every day, and our intuition doesn’t keep up. We’re miscalibrated, and we don’t even know it.

Calibrated thinking isn’t about having the right answer. It’s about holding beliefs with appropriate uncertainty, and updating them when evidence arrives. It’s a skill, and it’s trainable, so I wrote a game to practice it.

calibrates.net

The goal is not to be correct. It’s to be calibrated.

What this game is (and isn’t)

The goal isn’t to absorb trivia - that data will be outdated soon. And it’s not to be right 100% of the time. The goal is to know what you know, to acknowledge uncertainty, and update your assumptions when the facts change. And, most importantly, to have fun doing it.

This matters for anyone who makes decisions under uncertainty - which is everyone. But especially if you work in systems where the cost of overconfidence is high. If you’ve ever been surprised by a production failure that “shouldn’t have happened,” then you’ve felt miscalibration.

How it works

Each day, the game picks a dataset from Our World in Data and other sources. You’re shown two cards and asked to place them in order by value. As the round progresses, the comparisons get harder. After each placement, you rate your confidence: On the fence? Probably? Almost certain?

Over time, you build a calibration curve. The curve plots your stated confidence against your actual accuracy. A perfectly calibrated person is right 80% of the time when they say they’re 80% confident. Most of us aren’t. Most of us are systematically overconfident - especially on questions where we have just enough domain knowledge to be dangerous.
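The curve described above can be computed from nothing more than a list of (stated confidence, was correct) pairs. Here is a minimal, illustrative sketch of that idea — the bin edges and the `calibration_curve` helper are my own choices, not the game’s actual implementation:

```python
# Sketch: bin answers by stated confidence, then compare each bin's
# average stated confidence to the fraction answered correctly.
# Bins and function name are illustrative assumptions, not the game's code.
from collections import defaultdict

def calibration_curve(answers, bins=(0.5, 0.7, 0.9, 1.0)):
    """answers: list of (stated_confidence, was_correct) pairs."""
    grouped = defaultdict(list)
    for conf, correct in answers:
        # assign each answer to the first bin whose upper edge covers it
        for upper in bins:
            if conf <= upper:
                grouped[upper].append((conf, correct))
                break
    curve = []
    for upper in sorted(grouped):
        pts = grouped[upper]
        mean_conf = sum(c for c, _ in pts) / len(pts)
        accuracy = sum(ok for _, ok in pts) / len(pts)
        curve.append((mean_conf, accuracy))
    return curve

# Overconfidence in miniature: 90% stated confidence, only 60% right.
answers = [(0.9, True)] * 6 + [(0.9, False)] * 4
curve = calibration_curve(answers)
print([(round(c, 2), round(a, 2)) for c, a in curve])  # [(0.9, 0.6)]
```

A well-calibrated player’s points sit near the diagonal (accuracy equals confidence); an overconfident player’s points, like the one above, sit below it.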

You can also upload your own data and test your own assumptions. The data never leaves your browser session. Share the link and compare calibration curves with others.


The goal is not to be correct. It’s to be calibrated. When you are, you make better decisions with imperfect information - which is the only kind of information any of us ever have.

This game wouldn’t have been nearly as fun without the work of Our World in Data and Hannah Ritchie. It’s a truly incredible resource, and I encourage you to explore it. Their work extends Hans Rosling’s: Rosling spent his career showing smart, educated people that their model of the world was out of date, and that they were confidently wrong. His book Factfulness remains one of the best things you can read on the gap between what we think we know and what the data says.