“The U.S. military spends billions trying to predict what its adversaries will do — and almost nothing testing whether its own plans make sense.”
“this asymmetry reflects the decline of red teaming, the practice of systematically challenging plans to expose biases, blind spots, and weak assumptions before they become operational liabilities.”
“Red team methodology primarily targets three failure pathways in defense and military planning.”
“Cognitive biases such as optimism bias and mirror-imaging creep in when planners unconsciously assume the adversary will behave as they would, or that their preferred outcomes are more likely than the evidence supports.”
“Blind spots emerge from organizational or institutional silos. Planners may simply not see how logistics, allies, civilians, or political dynamics could shape the battlefield.”
“The most dangerous failure pathway is unevaluated assumptions — uncertain claims upon which entire plans depend, but which rest on weak arguments, weak evidence, or both.”
“Integrating AI into planning amplifies these risks, as it can encode biases in its training data, create blind spots when algorithms function as black boxes, and generate confident outputs based on unvalidated assumptions.”
“I served on the U.S. Central Command Red Team in 2012–2013, when many combatant commands maintained in-house red teams. Our mission was simple but uncomfortable: expose flawed logic before it hardened into the command’s war plans.”
“After the U.S. invasion of Iraq in 2003, commanders took stock of planning failures, such as overestimating Iraqi security force capacity, failing to anticipate how de-Baathification could fuel insurgency, and underestimating the rise of sectarian violence. To avoid repeating those mistakes, they began embedding structured contrarian analysis into planning through red teaming. Building on older “devil’s advocate” and wargaming traditions, the Army spearheaded formalizing the practice, founding the University of Foreign Military and Cultural Studies in 2004 to teach planners how to recognize cognitive bias and challenge assumptions. Within a decade, many combatant commands had adopted similar methods, making red teaming a routine feature of the joint planning process.”
“Red teaming declined for understandable reasons: time constraints, resource limits, and resistance to being challenged.”
“Checking your work as you go isn’t waste: it’s prudent planning, and the only reliable way to judge whether plans can withstand implementation.”
“Red teams don’t exist to second-guess planners or commanders. Rather, they exist to stress-test logic before it hardens into operational plans — to ensure confidence is based on examined assumptions rather than untested optimism.”
“the complaint that “red teams identify problems, but nothing changes” points to a system problem rather than a red team problem. When a red team’s findings reach commanders only as optional briefings rather than mandatory decision points, and when planners face no requirement to document why they accept or reject alternative planning scenarios, warnings become noise.”
“The U.S. Intelligence Community has faced similar challenges with handling alternative analysis. The 2015 revision of Intelligence Community Directive 203 addressed this problem by requiring intelligence products to identify core assumptions, present credible alternatives, and document why those alternatives were set aside. That model could readily be applied to planning: require red team findings to be presented to commanders and require planners to document their rationale for setting aside red team findings.”
“Another common objection to red teaming runs along the lines of “we already do extensive wargaming — isn’t that enough?” It’s not. Wargaming tests plans against an adversary’s courses of action, revealing vulnerabilities in execution.”
“Red teaming challenges the logical foundations and assumptions underlying those plans, revealing flaws in conception.”
“A plan can perform well in a wargame while resting on faulty assumptions about partner force capacity, logistics, or political constraints. Both forms of challenge are necessary, and neither can substitute for the other.”
“Dissenting views rarely die from bad evidence — they die from lack of process.”
“Without institutional mechanisms to channel alternative analysis into decision cycles, uncomfortable insights get swept aside.”
“If speed is consistently prioritized over validity, the ability to anticipate failure atrophies.”
“Make Red Teaming Everybody’s Business”
“Study and Resource What Works”
“Link Red Teaming to Ongoing Planning Cycles”
“Integrate Forecasting Methodologies”
“Red teaming didn’t fail — it became inconvenient. But plans built on convenient assumptions don’t survive contact with reality.”
“Red teams were never meant to make planning comfortable. Irritation was the point. Better that friction arise in the planning room than on the battlefield.”