Risk mitigation and the question of Chinese anti-stealth radar

Here’s the excellent Chad Millette, the chief learning officer at Space Systems Command, on LinkedIn:

Eric Lofgren, you suggest in reference to the story about China’s anti-stealth radar that even if there’s only a 1% chance they can actually do what they claim, the impact ought to require hedging on the part of the US.

 

Brings up an excellent #riskmanagement discussion. My colleague Richard Sugarman despises the 5×5 risk matrix most often used in DoD (and I would suggest all) project management. One of the reasons is that, as he (rightly) points out, while the likelihood axis is bounded (1-99%), the consequence axis is not.

 

This example provides a practical application of this issue. Let’s say the risk (maybe this is an F-35 program risk) is stated as such: If an adversary is able to detect stealth aircraft, then our capability is degraded (I know, I know ‘degraded’ is terribly ambiguous… but should work to make my point). We would characterize the risk as a 1 in likelihood (1% chance) and a 5 in consequence (say something like ‘severe mission degradation’). On most 5×5 matrices, a 1-5 risk would be ‘Yellow’ and therefore wouldn’t require developing a handling plan (where DoD instructions mandate programs to identify handling plans for all ‘Red’ risks).

Example risk matrix with the location of the risk: Successful Chinese anti-stealth radar. While the conclusion of a “Low” risk may be true for a particular program, like fielding the F-35 on-time and on-cost, it is very consequential from a DoD-wide point of view due to the severity of mission impact.

The question is: should DoD stealth aircraft programs employ some manner of risk handling to counter this risk, or is the likelihood so low as to not require diverting scarce program resources?

 

I recognize this may be more of a Department-wide strategic risk than a programmatic risk, but the tenets implied in the question still apply. Do managers/strategists spend time coming up with ways to deal with risks with extremely low likelihoods of occurring, but correspondingly high consequences if they do?
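To put numbers on the matrix arithmetic in Chad’s example, here’s a minimal sketch of a 5×5 scoring scheme. The color grid and the dollar figure attached to the consequence are my own illustrative assumptions, not values from any DoD instruction:

```python
# Illustrative 5x5 risk matrix, loosely patterned on a DoD-style risk
# reporting matrix. The exact color assignments are assumptions for
# demonstration, not the official grid.

# COLORS[consequence - 1][likelihood - 1], each axis scored 1 (low) to 5 (high)
COLORS = [
    # likelihood:  1         2         3         4         5
    ["Green",  "Green",  "Green",  "Yellow", "Yellow"],   # consequence 1
    ["Green",  "Green",  "Yellow", "Yellow", "Yellow"],   # consequence 2
    ["Green",  "Yellow", "Yellow", "Red",    "Red"],      # consequence 3
    ["Yellow", "Yellow", "Red",    "Red",    "Red"],      # consequence 4
    ["Yellow", "Red",    "Red",    "Red",    "Red"],      # consequence 5
]

def matrix_color(likelihood: int, consequence: int) -> str:
    """Map a (likelihood, consequence) pair, each scored 1-5, to a matrix color."""
    return COLORS[consequence - 1][likelihood - 1]

# Chad's example: ~1% likelihood (bin 1), severe mission degradation (bin 5).
print(matrix_color(1, 5))   # -> "Yellow": no mandatory handling plan

# The same risk viewed as an expected loss, where the consequence axis is
# unbounded. The dollar figure is notional, only to show the shape of the problem.
p = 0.01
loss = 1.5e12   # notional cost of losing the stealth penetration mission set
print(f"Expected loss: ${p * loss:,.0f}")   # ~$15 billion
```

The point of the second calculation is that a bounded consequence scale caps how loud a low-probability, catastrophic risk can get, while an expected-loss view lets the consequence term dominate.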

People with more technical knowledge than I can debate whether adversary radar can detect stealth aircraft, how well, and whether that translates into targeting or anything else. But it’s hard to say there is a zero percent chance of countermeasures to stealth appearing over the next decade or more (F-35s will operate into the 2070s). This problem lays bare some of the shortcomings of DoD’s systems analysis/PPBS process.

Decisions start with a strategic goal, say suppression of enemy air defense. That goal then creates a requirement that needs to be defined through an analysis of alternatives. To husband the Department’s resources, just one program will be created for the requirement, and so the analysis selects for the most likely contingency: a requirement for a stealth aircraft to penetrate and deliver precision munitions. Then all of the requirement’s resources are funneled into that most likely solution, completely neglecting any countermoves the enemy may make.

Here’s Armen Alchian describing the problem when critiquing the systems analysis approach in 1954:

If the aircraft companies all look to the kind of analysis performed at RAND, each firm will, after frustrating itself trying to predict the future state of the world, pick what it thinks is the most likely, or most acceptable to conventional Air Force thinking… They will regard the most likely future state of events as the one on which to place their development money. If the possible future states under Category I were predicted with probabilities .6, .2, and .2, it would be a little difficult to believe that any companies would design for the .2 contingencies. Their interest is in procurement; it is not in providing a balance of developments.

Alchian was writing at a time when firms took on “loss-leaders” to develop most of a system in order to be positioned to respond to military needs. That paradigm was destroyed by the systems analysis/PPBS approach, but the same fundamental contingency problem appears in DoD staff planning for new programs.
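To see Alchian’s point in miniature, here’s a rough sketch using his .6/.2/.2 probabilities. Everything other than those probabilities is my own assumption for illustration:

```python
# Rough illustration of Alchian's .6/.2/.2 example. The coverage framing is
# invented for demonstration; the only numbers from the quote are the probabilities.

probs = {"state_A": 0.6, "state_B": 0.2, "state_C": 0.2}

# If every firm designs for the most likely state, the industrial portfolio
# covers only state_A: a 40% chance that no suitable design exists.
coverage_all_bet_on_A = probs["state_A"]

# If development effort is spread so each state has at least one suitable design,
# some state-specific capability exists no matter which future arrives.
coverage_diversified = sum(probs.values())

print(f"Chance a suitable design exists, everyone designs for A: {coverage_all_bet_on_A:.0%}")
print(f"Chance a suitable design exists, diversified portfolio:  {coverage_diversified:.0%}")
```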
