The Antidote to Biased Decision-Making Under Pressure
I recently spoke at a global security conference about cognitive bias and how it hinders humans from recognizing critical information needed to make good decisions under pressure. The attendees overwhelmingly agreed that we lack training and awareness on this topic, and that this gap undermines our ability to understand threats and respond to them. Poor decisions during crisis events lead to catastrophe. We must improve.
In today’s security landscape, decisions are rarely made with perfect information. Whether handling executive protection, global risks, or crisis management, leaders must make quick judgments in the face of ambiguity and pressure. What sets high-performing security leaders apart isn’t access to more intelligence, but the ability to think critically and reduce cognitive bias in the decision-making process.
Critical thinking is not just an academic exercise; it is a practical skill that enables professionals to filter out noise, evaluate competing narratives, and make informed, confident decisions. In security operations, it supports everything from intelligence analysis to tactical action. The challenge, however, is that our choices are shaped as much by human bias as by concrete data.
Under stress, even experienced professionals fall prey to three dominant biases: anchoring, confirmation bias, and optimism bias.
For example, many security decisions follow some form of the OODA Loop: Observe, Orient, Decide, Act. Biases distort this process at every phase (a simple sketch of a bias-aware version of the loop follows this list):
Observation: We see what we expect and overlook evidence that contradicts our beliefs. We rely on sources that align with our views and dismiss those that challenge our biases.
Orientation: We filter new information through old frameworks and miss context. We interpret information in isolation rather than seeking alternative perspectives.
Decision: We default to comfortable or previously successful courses of action, ignoring data-driven assessments or a fresh (sometimes inexperienced) perspective.
Action: We justify our choices, reinforcing the same bias in future cycles, and struggle to pivot when unknowns emerge.
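To make this concrete, here is a minimal, illustrative sketch of how a team might embed bias checks into each OODA phase as a structured checklist. It is not drawn from any specific doctrine or tool; the check questions and log structure are assumptions chosen for illustration.

```python
from dataclasses import dataclass, field

# Illustrative bias-check questions for each OODA phase.
# The questions are assumptions for this sketch, not a published standard.
BIAS_CHECKS = {
    "Observe": [
        "What evidence contradicts our current picture?",
        "Which sources have we dismissed, and why?",
    ],
    "Orient": [
        "Are we interpreting new facts through an outdated framework?",
        "Have we sought at least one alternative perspective?",
    ],
    "Decide": [
        "Are we defaulting to a previously successful course of action?",
        "What does the data-driven assessment say, independent of preference?",
    ],
    "Act": [
        "What indicator would tell us we are wrong and must pivot?",
    ],
}

@dataclass
class DecisionLog:
    """Records written answers so the reasoning can be red-teamed later."""
    entries: list = field(default_factory=list)

    def run_phase(self, phase: str, answers: dict) -> None:
        # Refuse to advance the loop until every bias check has a written answer.
        missing = [q for q in BIAS_CHECKS[phase] if not answers.get(q)]
        if missing:
            raise ValueError(f"{phase}: unanswered bias checks: {missing}")
        self.entries.append({"phase": phase, "answers": answers})

if __name__ == "__main__":
    log = DecisionLog()
    log.run_phase("Observe", {
        "What evidence contradicts our current picture?": "Field reports of rapid advances.",
        "Which sources have we dismissed, and why?": "NGO warnings; assumed alarmist.",
    })
    print(log.entries)
```

The point is not the code itself but the discipline it encodes: the loop cannot advance until the uncomfortable questions have written answers that can later be challenged.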
The result? Even experienced leaders and teams make predictable mistakes.
A high-profile example is the 2021 evacuation from Kabul's Hamid Karzai International Airport (HKIA), which shows clearly how bias can cascade through strategic and operational decision-making.
Anchoring Bias: Senior officials were fixated on early assessments, which predicted that Kabul would remain stable for months after withdrawal. When new intelligence signaled rapid Taliban advances, leaders were still anchored to outdated assumptions. Evacuation planning lagged reality.
Confirmation Bias: Field reports and NGO warnings highlighting the Afghan government’s fragility were discounted because they conflicted with the preferred narrative: that Afghan forces could sustain control (due to US intervention and nation-building). Leadership believed what it wanted to believe.
Optimism and Normalcy Bias: Decision-makers planned for an orderly withdrawal, similar to the one in Iraq. Few prepared for a collapse, a panicked civilian exodus, or the need to defend an airport perimeter under fire.
Groupthink: Internal dissent was suppressed to maintain political alignment. Without structured red-teaming or challenge sessions, flawed assumptions went untested.
The outcome was chaos: a reactive defense at HKIA, loss of life, and a strategic blow to credibility. This event is not just a geopolitical failure; it's a masterclass in how bias at the senior level can unravel tactical execution on the ground.
At senior levels, situational awareness must evolve beyond tactical observation to strategic comprehension.
There are three layers of strategic situational awareness:
Perception - What are the verified facts versus assumptions?
Comprehension - What do these facts mean in the context of our mission and risk tolerance?
Projection - What second and third-order effects could follow from our choices?
Leaders need to foster an environment of open dialogue and welcome challenges to avoid making decisions in isolation. They must challenge perceptions and encourage their teams to view information from multiple viewpoints. The difference between perception and perspective is significant: a team operating from a single, unchallenged perception will eventually fail.
Leaders must also build structures to challenge assumptions, integrate diverse perspectives (diversity of thought), and translate raw intelligence into strategic foresight. In short: awareness without analysis is observation. Awareness with critical thinking is foresight.
Here are some practical tools to strengthen critical thinking:
Structured Decision Frameworks - Utilize tools such as the OODA Loop, decision matrices, and pre-mortem analysis to mitigate reactive bias (a minimal decision-matrix sketch follows this list).
Intelligence Fusion Platforms - Integrate tactical, operational, and strategic data into a unified common operating picture.
Cross-Functional Collaboration - Break silos between security, operations, finance, legal, and communications for a complete situational view.
Red Teaming and “What If” Sessions - Institutionalize constructive dissent and stress-test scenarios.
Bias Awareness Training - Build teams that can identify cognitive traps under pressure and call them out in real time.
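As one concrete example of a structured decision framework, a weighted decision matrix forces a team to score each course of action against explicit criteria rather than gut feel. The criteria, weights, options, and scores below are hypothetical, a minimal sketch rather than a recommended model.

```python
# Minimal weighted decision-matrix sketch. Criteria, weights, and options
# are hypothetical; scores run from 1 (poor) to 5 (strong).
criteria_weights = {"speed": 0.3, "risk_reduction": 0.4, "cost": 0.3}

options = {
    "Shelter in place": {"speed": 5, "risk_reduction": 2, "cost": 5},
    "Ground evacuation": {"speed": 3, "risk_reduction": 4, "cost": 3},
    "Air evacuation":    {"speed": 2, "risk_reduction": 5, "cost": 1},
}

def weighted_score(scores: dict) -> float:
    # Multiply each criterion score by its weight and sum the results.
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank options from highest to lowest weighted score.
for name, scores in sorted(options.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The value lies less in the arithmetic than in the argument it forces: weights and scores must be stated out loud, where anchoring and optimism can be challenged before a course of action is chosen.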
In security leadership, bias is not a flaw of intelligence; it’s a failure of awareness. The world’s most complex crises rarely stem from a lack of data. Instead, they occur when leaders interpret that data through narrow lenses, outdated assumptions, or the comfort of consensus.
We do not need to eliminate our biases to make good decisions, but we do need to be aware of them. Once we recognize our own bias, we can compensate for it and incorporate alternative views into our data collection and interpretation, building a more complete and accurate picture from which to develop courses of action.
The remedy is disciplined, structured, critical thinking, applied not just on the ground but also in the boardroom, the fusion cell, and every high-stakes decision where uncertainty is the only constant.

