What we see gets credit. What we prevent gets forgotten.
Why quiet success deserves louder recognition and what leaders can do to spot it, reward it and build on it.
In Minority Report (2002), a futuristic police unit stops crimes before they occur. Their results are nearly perfect. But public trust begins to erode. If no crimes happen, how can people be sure the system works? How do you justify the cost of something that prevents the invisible?
That’s the paradox of prevention.
Whether it’s crime, crisis or collapse - when things go right because someone acted early, the success is silent. And when we can’t see what was avoided, we start to question whether there was ever a threat at all.
This isn’t just science fiction. It’s how we often judge leadership, planning and risk.
The paradox of prevention.
Even if the crew of the Titanic had managed to bring her to New York despite the collision, some would still have complained about the delay.
- Thomas Rauch
This quote captures something deeply human: we tend to celebrate visible wins and overlook what was quietly prevented. It’s a bias that shapes how we judge leaders, teams, outcomes and even history.
This article takes a deeper look: Why do we overvalue what’s visible and undervalue what’s prevented? What psychological forces drive this? And how can we shift our perception to give prevention the credit it deserves?
Why it matters
Many of the most important achievements in leadership, risk management and everyday life are invisible.
The crisis that didn’t spiral.
The outage that never happened.
The team that stayed calm when others would have panicked.
If we don’t learn to see these quiet wins, we risk rewarding recklessness and overlooking true foresight.
The psychology behind it
Several well-documented cognitive biases explain why we underrate what’s prevented:
1. Outcome Bias
We tend to judge decisions by their results, not by the quality of the decision-making process. If a risky move pays off, we call it smart. If a cautious move avoids disaster, we call it lucky.
It’s like praising the driver who runs a red light and doesn’t crash, while ignoring the one who stops safely.
Well-known work by Baron & Hershey (1988, “Outcome Bias in Decision Evaluation”) illustrates how people favor outcomes over process, even in morally significant decisions.
2. Preparedness Paradox
When prevention works well, people believe the threat was exaggerated. This paradox shows up in public health, disaster planning and cybersecurity: success looks like overreaction.
Flood defenses that seemed like overengineering, until the storm came and damage was minimal.
A company invested heavily in routine patch management and zero-day monitoring, only to be criticized later for “overspending on threats that never materialized.” But nothing materialized precisely because they were ready.
Y2K preparedness avoided major disruptions and was later criticized as “overblown.”
3. Survivorship Bias
We focus on the visible winners and forget the failures that were quietly avoided. A classic example: studying successful companies without analyzing the ones that failed due to poor risk management.
4. Availability Heuristic
We judge importance by what’s most noticeable. A major success story is easier to remember than a slow, stable year with no drama. Visibility beats subtlety.
5. Reward Structures
In many organizations, incentives are tied to visible outcomes: launches, sales, growth. There’s rarely a bonus for the risk you mitigated or the fire that never started.
Bazerman & Tenbrunsel’s behavioral-ethics research shows that incentive and goal structures often push organizations to focus on visible results, sometimes at the expense of careful, ethically sound decisions. In many cases we end up rewarding success, not robustness.
How this plays out in practice
A product manager cancels a feature that would have introduced serious complexity. No applause.
A security lead fixes a vulnerability before it’s exploited. No headlines.
A CEO maintains team morale during a downturn and avoids layoffs. No press release.
Prevention doesn’t feel like progress. But it often is the progress.
Additional real-world examples
Healthcare: Vaccination campaigns that prevent outbreaks are often questioned because the crisis never comes.
Infrastructure: Engineers who maintain bridges and prevent collapses are rarely featured in annual reports.
IT & Cybersecurity: Quietly patching vulnerabilities and maintaining backups prevents chaos and earns silence.
Manufacturing: A team in a mid-sized factory flagged an abnormal motor temperature. A preventive shutdown avoided a three-day outage, saving €250,000. The success was invisible. The failure would’ve been very public.
What to do differently
To rebalance our judgment and incentives, we need to:
1. Ask: What didn’t happen and why?
After every project or quarter, reflect not just on wins, but on disasters avoided.
2. Make the invisible visible
Talk about stability. Celebrate the calm. Publicly recognize teams that averted problems.
How to recognize quiet wins
“You flagged this early - thanks.”
“We avoided escalation - great judgment.”
“This didn’t make noise - because you handled it well.”
“Because of your calm prep, this stayed simple.”
3. Value process over luck
Highlight decisions that were made with care, even if the outcome was neutral. Encourage teams to document how they thought through risk, not just what they shipped.
4. Incentivize Risk Awareness
Reward teams who identify vulnerabilities early. Build a culture where flagging concerns is seen as a strength.
Measuring Prevention (without gaming it)
Number of critical issues de-escalated early
Volume of resolved “Almost Problems”
Participation in risk workshops / pre-mortems
Time to response on weak signals
Clean audit results / zero critical escalations
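As a rough illustration, assuming a team keeps even a minimal log of flagged-and-resolved risks, several of these metrics fall out almost for free. The schema and field names below are hypothetical, not a prescribed format:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PreventionEvent:
    """One logged risk that was spotted and neutralized (hypothetical schema)."""
    flagged: date     # when the weak signal was noticed
    resolved: date    # when it was de-escalated
    severity: str     # "low", "medium", or "critical"

events = [
    PreventionEvent(date(2024, 3, 1), date(2024, 3, 3), "critical"),
    PreventionEvent(date(2024, 3, 10), date(2024, 3, 11), "low"),
    PreventionEvent(date(2024, 4, 2), date(2024, 4, 8), "medium"),
]

# Volume of resolved "Almost Problems"
volume = len(events)

# Critical issues de-escalated early (here: within 3 days of being flagged)
early_criticals = sum(
    1 for e in events
    if e.severity == "critical" and (e.resolved - e.flagged).days <= 3
)

# Average time-to-response on weak signals, in days
avg_response = sum((e.resolved - e.flagged).days for e in events) / volume
```

The point is not the tooling but the habit: once near misses are recorded at all, “what didn’t happen” becomes countable rather than invisible.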
Use Pre-Mortems: As Gary Klein suggests, imagine a future failure and work backward to ask what might cause it. Then act early.
5. Develop a dual lens
Measure success in two ways:
What moved forward?
What didn’t go wrong and why?
Leadership Tip: Add a “Near Miss” column to retrospectives. Ask: “Where did we actively avoid risk this week/month?”
Micro-Ritual: fifteen minutes every Friday, each team member notes one silent success of the week.
The big misunderstanding
People often think leadership is about bold moves and big calls. But real leadership, especially in uncertainty, often means reducing noise, focusing attention and keeping the team on course.
That doesn’t make for dramatic headlines. But it’s often the reason you’re still afloat.
Misjudgment Case: The Challenger Disaster
Engineers at contractor Morton Thiokol warned NASA leadership about the risk of O‑ring failure in cold temperatures before the launch. Despite that (and previous missions showing signs of O‑ring erosion), the launch went ahead. The shuttle broke apart 73 seconds after liftoff. The tragedy exposed how a known, repeatedly documented danger can be dismissed, especially when past successes masked how real the risk actually was.
Getting started (without overthinking it)
Two simple habits can shift the culture:
In reviews, always ask: “What did we avoid?”
When recognizing people, say: “Thank you for keeping things from going off the rails.”
Also useful:
Create “Almost Problems” logs: Document risks identified and neutralized.
Add a “Prevention Lens” to retrospectives: What didn’t escalate? What did we catch early?
Include “silent success stories” in town halls or internal newsletters.
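An “Almost Problems” log needs no special tooling to get started. As one minimal sketch, a plain CSV file works; the file name and columns here are assumptions, and the sample entry reuses the factory example from earlier in the article:

```python
import csv
from pathlib import Path

LOG = Path("almost_problems.csv")
FIELDS = ["date", "risk", "action_taken", "estimated_impact"]

def log_almost_problem(date, risk, action_taken, estimated_impact):
    """Append one neutralized risk; create the file with a header on first use."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(dict(zip(FIELDS, [date, risk, action_taken, estimated_impact])))

# Example entry, based on the factory case described above (values illustrative)
log_almost_problem(
    "2024-05-17",
    "Abnormal motor temperature on line 3",
    "Preventive shutdown, bearing replaced",
    "3-day outage avoided (~EUR 250,000)",
)
```

Reviewing this file in retrospectives is one concrete way to apply the “Prevention Lens” described above.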
For the very precise among us
“This is too theoretical.”
Clarification: Prevention is hard to measure, but the article offers practical tools, examples and leadership prompts to make it actionable.
“You can’t reward things that didn’t happen.”
Clarification: True, we should reward the process that prevented harm, not the absence of a result. Clarity, foresight and calm under pressure matter.
“Nice idea, but hard to apply.”
Clarification: That’s why we suggest small shifts, like adding a prevention lens to reviews or logging almost-problems. Culture change starts with habit change.
“Results still matter more than intentions.”
Clarification: Absolutely, but robust processes prevent bad outcomes. Long-term health depends on resilience, not just performance.
“This excuses mediocrity.”
Clarification: Quite the opposite. It highlights high-quality, thoughtful work that often goes unseen. Quiet excellence deserves recognition.
Final Thoughts
We tend to overestimate what we can see and underestimate what we’ve been spared. But real progress is both:
Building the right things
And protecting them from the wrong outcomes
We judge more wisely when we recognize not only what is present, but what was prevented - allowing real progress to shine beyond the imperfections that remain.
Let’s give prevention the credit it deserves.
Further Reading & References
Baron, J. & Hershey, J. (1988). “Outcome Bias in Decision Evaluation.”
Kahneman, D. (2011). “Thinking, Fast and Slow.”
Bazerman, M. & Tenbrunsel, A. (2011). “Ethical Breakdowns.”
Klein, G. (2007). “Performing a Project Premortem.”
“Preparedness Paradox.” Wikipedia.
Glossary
Outcome Bias: The tendency to judge a decision based on its result rather than the reasoning that led to it. A good outcome doesn’t always mean the decision was wise and a bad outcome doesn’t always mean it was foolish.
Pre-Mortem: A method developed by Gary Klein where a team imagines that a project has failed and then works backward to identify what could lead to that failure. It’s a way to spot risks early and act before things go wrong.
Near Miss: An event that could have become a serious problem, but didn’t, thanks to timely action or sheer luck. In high-reliability industries, near misses are treated as learning opportunities.
Silent Success: A win that leaves no dramatic trace. No outage, no crisis, no incident. It feels like “nothing happened”, but that nothing was earned.
Almost Problems Log: A simple record of issues that were identified and neutralized early before they caused damage. It’s a tool to make invisible wins more visible.



