Nassim Taleb says prepare for the worst case in fat-tailed events like pandemics and war

Russ Roberts: Another way to say it–and, again, to put it into folksy terms that I also got from you–is that you care about–you don’t care about the average depth of the river if you’re going to walk across it and you can’t swim. You care about where it is deepest. And, if the average is three feet high, but there’s a large stretch where it’s 60 feet deep and you’re going to die, that’s what you care about, not the average.

And, here you’re saying that if the average death rate from this pandemic forecast, which is based on a zillion assumptions also, is a million people, that that prediction is essentially meaningless.

Nassim Taleb: … Which is why we should not forecast numbers. That’s the first reason why a single-point forecast is highly unscientific, naive. What I call naive, thinkatory statistical approaches…

And, then there’s a whole science of extrema coming from statistics, applied in insurance, applied in risk management, applied in finance, applied in many fields extremely successfully, that look at what robust claims you can make about maxima.

And, that’s the one I used in my paper with Cirillo. We used exactly the same techniques used for the expected maximum or the average maximum–which is different from the average and different from the maximum–water level. And, of course, we modified it. We had to find robust tricks. It was complicated. But, over the years, we developed a technique to deal with extreme value theory applied to some classes of phenomena–namely, wars and the fattest of fat tails of all, pandemics.

So, when we started warning about pandemics in January, it was not the cry-wolf variety. We told, ‘Don’t worry about anything else. Just worry about these things, things that are multiplicative, particularly pandemics, and here is the evidence.’
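
Taleb’s distinction between the average, the maximum, and the expected maximum is worth making concrete. Here is a minimal simulation sketch in Python (the Pareto tail index and sample sizes are illustrative choices of mine, not the method or data from the Cirillo and Taleb paper): under a fat-tailed severity distribution, the sample mean says almost nothing about the maximum you need to survive.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative only: a Pareto (power-law) event severity with a fat tail.
# The tail index and sample sizes are invented for this sketch; they are not
# the numbers or the method from the Cirillo and Taleb paper.
alpha = 1.1            # tail exponent; the lower it is, the fatter the tail
n_events = 100         # events observed in one "history"
n_histories = 10_000   # simulated alternative histories

# numpy's pareto() is the Lomax form; adding 1 gives the classic Pareto with minimum 1.
samples = 1.0 + rng.pareto(alpha, size=(n_histories, n_events))

sample_means = samples.mean(axis=1)    # what a single-point forecast would report
sample_maxima = samples.max(axis=1)    # what ruin actually depends on

print(f"median sample mean        : {np.median(sample_means):10.1f}")
print(f"median sample maximum     : {np.median(sample_maxima):10.1f}")
print(f"expected maximum (average): {sample_maxima.mean():10.1f}")
```

In a typical run the expected maximum dwarfs the sample mean, which is why a single-point forecast like “a million people” carries so little information.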

Those excerpts are from an interesting EconTalk episode with Nassim Taleb, “The Pandemic.” Applying his precautionary principle, Taleb issued a strong warning about Covid-19 back in January because he understood the systemic risk of pandemics. War is another systemic risk that can result in complete ruin. The United States already invests a great deal in preparedness for war, but it does so with a business-efficiency mindset that has not treated mobilization and surge capacity as a capability in itself. That could lead to failure.

Certainly the ability to quickly ramp up production of the latest systems deters aggression, or achieves victory, just as current force structure does. I’d argue that the US needs to rebalance its investment between force structure and maximum production rates, putting more into the latter. Consider how much of the investment in force structure may be rendered obsolete by enemy advances. For example, in 1941: “Nazi aircraft innovations, such as leak-proof tanks and heavier guns and armor, had rendered about 2,700 of the contemplated 5,500 planes obsolete even before they were ordered.”

Today, much of the US weapons inventory is older than ever before, with fighter aircraft averaging over a quarter-century in age and tankers over 50 years. Could these systems be rapidly rendered obsolete in a slowly escalating conflict with peer nations?

There are several other reasons to believe industrial mobilization preparedness should be part of the protocol for dealing with a systemic risk. Preparedness should take the form of stockpiling critical materials and investing in production capital as programs in themselves.

Sometimes progress is made faster by building better tools rather than end items. For example, a program in biology wouldn’t have gotten very far without investments in the electron microscope. Similarly, a massive program of investment in 3D printing or space manufacturing may enable wholly new systems concepts that can be rapidly assembled from existing subsystems.

Investment in industrial mobilization should take care of the most important risk to national security: a protracted peer conflict. That should allow the United States to rest assured when taking risks elsewhere in the acquisition process, such as experimentation and delegated authority. Here’s Taleb:

We are paranoid for large-scale risks and certain classes of risk. And, what I’ve tried to do with my collaborator is figure out what are the systemic risks we should be avoiding, which is liberating because it allows us to take a lot of risks elsewhere.

And here’s another great part:

So, this is what we said in the paper. And, we said one thing. We said, ‘Science is not about accounting. It’s not bean counting. Science is about understanding a phenomenon.’

That’s because bean counting only helps you predict outcomes for things that are normally distributed. We do an enormous amount of bean counting on weapon systems, baselining them to a specific cost number. But weapon system developments follow fat-tailed distributions: they will very rarely cost much less than the average, yet could cost orders of magnitude more. We’ve seen plenty of S-curve cost estimates shift to the right, where the updated minimum cost far exceeds the baseline maximum. Weapons development choice (pre-Milestone C) is *not* about cost-effectiveness analyses, but about “understanding a phenomenon.”
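
As a rough illustration of that asymmetry, here is a small sketch with invented parameters (a Pareto-tailed cost outcome measured against a single-point baseline; none of these numbers come from actual program data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration only -- invented parameters, not real program data.
# Model a development's realized cost as a Pareto-tailed multiple of its
# $1.0B point-estimate baseline: there is a hard floor a bit below the
# baseline, but no comparable ceiling above it.
baseline = 1.0                      # $B, the single-point cost baseline
floor = 0.8 * baseline              # best realistic outcome
alpha = 1.8                         # tail index; lower = fatter right tail
costs = floor * (1.0 + rng.pareto(alpha, size=100_000))

print(f"minimum possible cost : {floor:.2f} $B")
print(f"median cost           : {np.median(costs):.2f} $B")
print(f"mean cost             : {costs.mean():.2f} $B")
print(f"95th percentile       : {np.percentile(costs, 95):.2f} $B")
print(f"worst simulated draw  : {costs.max():.2f} $B")
print(f"P(cost > 2x baseline) : {np.mean(costs > 2 * baseline):.1%}")
```

A baseline pinned to the median or even the mean systematically understates the right tail, and the right tail is the part of the distribution that actually breaks budgets.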
