
On Wednesday evening, Kelechi Iheanacho stepped up in the 98th minute at Fir Park to take a penalty awarded to Celtic after a VAR review, for a handball that most pundits, watching the same footage, struggled to see. He scored. Celtic won 3-2. The title race goes to a final-day showdown with Hearts. Nobody, in the immediate aftermath, was auditing the process. The decision was correct under the rules as written. The outcome was exactly what Celtic needed. When both those things are true, the review meeting tends not to happen.
Had he missed, the same decision, made by the same process, would be the story today.
THE NOISE AFTER A MISS
The pattern is easier to see when the outcome is bad. In the summer of 2021, England reached a European Championship final for the first time in their history and lost it on penalties. The reaction was familiar: demons, bottle jobs, the usual vocabulary of fate. What made that loss different was invisible in the coverage. Gareth Southgate had spent four years building the most systematic penalty preparation programme in English football. He assembled an 18-month task force, consulted an academic at the London School of Economics who had researched optimal placement and goalkeeper behaviour, and assigned each taker a teammate to meet them at the halfway line on the way back. England scored two. Italy scored three. The methodology was sound. The result said otherwise.
Three years later, at Euro 2024, England beat Switzerland on penalties, scoring all five. Jordan Pickford's water bottle, covered in printed notes on each Swiss taker's tendencies, became a symbol of preparation. Same system. Different result. Both times, the assessment of the process simply followed the outcome.
CORRECT. REVIEW LOST.
Cricket's Decision Review System was formally adopted in 2009 to eliminate clear umpiring errors and, with them, a measure of luck from the game. But embedded within it is a rule called "umpire's call." When ball-tracking shows a delivery merely clipping the stumps, the original on-field decision stands, and until the rule was softened in 2017 the challenging side lost its review as well. A captain who challenged on strong evidence could lose the review anyway. The system was designed to reward process thinking. For years it accidentally preserved outcome-based evaluation of the very decisions it was introduced to correct.
The captain who challenged on instinct and got lucky kept the review. The one who challenged carefully and hit "umpire's call" looked careless. Same quality of reasoning, opposite verdicts. Football's VAR invites the same judgement. Whether the Celtic decision on Wednesday becomes a story about the system's integrity or its vindication depends entirely on the final score.
THE YEAR THE DATA WAS RIGHT
In 2022, the LA Dodgers shifted their infield on 53% of plate appearances, the highest rate in Major League Baseball. The Houston Astros, who had pioneered systematic shift usage years earlier, were not far behind. The logic was straightforward: batted-ball data showed where hitters actually put the ball, so you positioned fielders there. The approach was analytically sound and the evidence was extensive. The following season, MLB banned the shift entirely, in a push to restore offence and more traditional infield play. The teams that had invested most in shift methodology had that investment retrospectively framed as a mistake. The process had been correct. The rule change made the outcome look otherwise.
THE AUDIT NOBODY RUNS
Most organisations review decisions the same way. They look at what happened, work backwards, and draw conclusions about the quality of the thinking that produced it. This feels like learning. It isn't quite. It is pattern-matching against outcomes shaped partly by factors outside anyone's control: a goalkeeper's guess, a rule change in New York, a delivery clipping leg stump.
The discipline almost no organisation builds is the separation of the process audit from the outcome review. Not a philosophical gesture. A structural one. Different meeting, different questions, different evidence. What did we know at the time? Was the reasoning sound? Did we run the process we said we would?
England eventually built that separation, with a task force and an economist's research and a goalkeeper's notebook. They still lose some shootouts. But they are no longer learning the wrong thing when they do.
TRY IT YOURSELF
🔍 Pick a recent decision your team reviewed after a bad outcome. Then ask:
📋 What did we actually know at the time? Separate what was knowable from what you only learned after.
⚖️ Would we have reviewed this process if the result had been good? If not, the review is about the outcome, not the decision.
🔄 What's the structural practice, not just the intention, that separates the two? A good-faith desire to be fair to the process is not enough.
The goal isn't to win every shootout. It's to stop learning the wrong thing when you lose.
FURTHER READING
📚 Thinking in Bets by Annie Duke (Portfolio, 2018). Duke spent years as a professional poker player before becoming a decision-science consultant to businesses and sports organisations. Her central concept, "resulting", names the routine error of judging decision quality by outcome quality; the poker framing earns its place because the variance is explicit and immediate in a way business rarely makes visible.
📰 "England's soccer team used to dread penalty shootouts. Here's why they've come to embrace them" by Washington Post, July 2024 The article above had to summarise England's preparation in a paragraph; this piece shows it in detail, covering the specific routines, the academic consultation, and the cultural shift from fate to methodology. Reading both together makes the contrast between what England actually built and what the public narrative says about them considerably sharper.
🎙️ "Michael Mauboussin: A Decision-Making Jedi" Shane Parrish, The Knowledge Project, Episode 28 (Farnam Street) Mauboussin walks through the mechanics of separating skill from luck in environments where both are always present, which is every competitive context that matters. His framework for identifying where an activity sits on the skill-to-luck spectrum is directly applicable to which organisational decisions deserve process scrutiny and which are genuinely subject to variance.
🦉 If this piece resonated, you might also like Optimised to Fail — which explores what happens when analytically correct strategies get invalidated by the system's collective response to them.
SOME FINAL WISE WORDS
"When luck has little influence, a good process will always have a good outcome. When a measure of luck is involved, a good process will have a good outcome but only over time."
— Michael Mauboussin, The Success Equation (Harvard Business Review Press, 2012)
Until next time

The world's best business lessons, told through the stories of sport.
