Radical uncertainty + cost-benefit analysis = danger

Most big decisions that we take are unique. They’re one-offs. And that means you really do need to think through yourself what is happening now.


And the great danger of this precision is that either politicians as decision-makers can defer or deflect responsibility on so-called experts who provide them with a number, or the experts pretend that they know more than they do, in order to have bigger say and expand their own influence. And I think this is extremely dangerous. It means you can often miss big things.


Cost-benefit analysis is a good example where people will come up with a precise number for the value of a particular road building or railway building project. But very often, that’s built on very strong assumptions. It ignores important aspects of the decision, which you might actually do better to ask yourself, ‘So, what is really going on here? What are the big issues we should think about?’ Not just rely on some so-called expert coming up with spurious precision in the numbers.

That was Mervyn King, former Governor of the Bank of England, on the EconTalk episode about Radical Uncertainty. His analysis applies just as well to the acquisition process. Major weapon systems programs are exactly the kind of “big decision” that is fraught with uncertainty, and with far more of it than the social welfare calculation for a routine road or railway project. Uncertainty about future warfighting conditions, user preferences, and technological achievement is fundamental to weapon systems choices. Yet much of the acquisition process exists to put an objective number on a decision, or even an “s-curve” of costs.
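King’s point about spurious precision shows up in how such numbers are typically produced. A cost “s-curve” is usually the output of a Monte Carlo simulation whose shape depends entirely on the analyst’s assumed cost distributions. The sketch below, using made-up cost elements and parameters, shows how a precise-looking percentile falls out of those assumptions:

```python
import random

random.seed(0)

# A minimal sketch of how a cost "s-curve" is built by Monte Carlo
# simulation. The cost elements, distributions, and parameters are
# hypothetical illustrations, not real program data.
def one_program_cost():
    airframe = random.lognormvariate(4.0, 0.3)   # right-skewed, ~$55M median
    software = random.lognormvariate(3.0, 0.6)   # wider spread assumed
    integration = 0.15 * (airframe + software)   # assumed 15% wrap factor
    return airframe + software + integration

draws = sorted(one_program_cost() for _ in range(10_000))

# The s-curve is just the empirical CDF of the draws; every point on
# it inherits the distributional assumptions baked in above.
def percentile(p):
    return draws[int(p / 100 * len(draws)) - 1]

print(f"50th percentile cost: ${percentile(50):.1f}M")
print(f"80th percentile cost: ${percentile(80):.1f}M")
```

Change the assumed spread on the software element and the 80th percentile moves substantially. The precision of the output number says nothing about the validity of the inputs, which is exactly the danger King describes.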

Here’s a bit more from Mervyn:

And our view of radical uncertainty is that it’s uncertainty that you cannot easily quantify.


I mean, the best example, I think is what we’re going through now, COVID-19, in which we knew, well before it happened, that there could be things called pandemics. And, indeed, we say in the book that it was likely that we should expect to be hit by an epidemic of an infectious disease resulting from a virus that doesn’t yet exist.


But, the whole point of that was not to pretend that we, in any sense, could predict when it would happen, but the opposite. To say that: the fact that you knew that pandemics could occur did not mean that you could say there was a probability of 20% or 50% or any other number.
