Tuesday, January 15, 2019

RISK: We Are Terrible, Horrible, No Good, Very Bad at Talking About the One That Matters Most

For citizens to meaningfully participate in eliminating nuclear weapons, there are a number of conversations we need to be prepared to have -- with our associates in the community, and with our representatives in government.

It seems to me that at the top of the list is a conversation about risk.

We tend not to be very good at thinking about risk. Perhaps the canonical example of our difficulty is contained in the question, "What is more dangerous -- flying or driving?" People tend to think of flying as more dangerous, but statistically, per mile traveled, driving is far more dangerous. (Coincidentally, today's New York Times summarizes an updated guide to everyday risks: "Opioids, Car Crashes and Falling: The Odds of Dying in the U.S.")
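
A back-of-the-envelope calculation shows how lopsided the comparison is. The figures below are rough, order-of-magnitude numbers I am plugging in for illustration (they are in the neighborhood of published US transportation statistics, but check current data before quoting them); the point is the ratio, not the decimals.

```python
# Rough, illustrative per-mile fatality rates (order-of-magnitude figures only,
# not authoritative statistics -- substitute current numbers to check the ratio).
DRIVING_DEATHS_PER_MILE = 1.1 / 100_000_000    # ~1.1 deaths per 100 million vehicle-miles
FLYING_DEATHS_PER_MILE = 0.07 / 1_000_000_000  # ~0.07 deaths per billion passenger-miles

ratio = DRIVING_DEATHS_PER_MILE / FLYING_DEATHS_PER_MILE
print(f"Per mile traveled, driving is roughly {ratio:.0f} times as deadly as flying.")
```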

Two observations:

(1) Many risks -- such as those just mentioned -- represent frequently occurring events. There are specialists (actuaries) who tabulate past occurrences and calculate odds of future occurrences; a minimal sketch of that kind of calculation follows these observations. Their calculations provide the basis for the insurance industry.

(2) Even such well-documented and well-described risks are generally met by ordinary people with a very human response: "Great, now can I put this out of my mind?"
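
To make observation (1) concrete, here is a minimal sketch of the frequency-based reasoning an actuary relies on: count past occurrences, divide by exposure, and treat the resulting rate as the odds of a future occurrence. Every number below is invented for illustration.

```python
# Minimal sketch of frequency-based (actuarial) estimation; the claims
# history and exposure below are invented purely for illustration.
import math

claims_per_year = [3, 5, 2, 4, 6, 3, 4, 5, 2, 4]  # hypothetical yearly claim counts
policies = 10_000                                  # hypothetical policies in force each year

# Historical rate: total occurrences divided by total exposure (policy-years).
rate = sum(claims_per_year) / (len(claims_per_year) * policies)

# Treating occurrences as roughly Poisson, the chance that a given policy
# produces at least one claim next year is:
p_claim = 1 - math.exp(-rate)

print(f"Estimated rate: {rate:.5f} claims per policy-year")
print(f"Chance of at least one claim on a given policy next year: {p_claim:.3%}")
```

It is precisely this kind of tabulation that has nothing to stand on for the rare events discussed next.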

In the spectrum of risks, there is another kind of risk: the non-frequently occurring event. To tear an example from today's headlines, what is the risk that a non-politician will be elected president of the US, and then force a government shutdown in order to obtain appropriations for building a wall on the US border with Mexico?

You can venture guesstimates of the likelihood, but there is very little prior information upon which to base them.

(The term "black swan" was recently popularized for the extreme form of such a non-frequently occurring event: something that nobody saw coming.)

Two more observations:

(3) Without actuarial analysis, it is not possible to insure against these kinds of risks in the ordinary way. Still, there are people (risk managers) who do their best to guesstimate the odds and to come up with (cost-justified) ways of avoiding such risks; a sketch of that cost-justification test follows these observations.

(4) These kinds of events are particularly susceptible to the natural human response: "I don't have to think about this, do I?"
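
To make observation (3) concrete, the "cost-justified" test a risk manager applies usually amounts to comparing a guesstimated expected annual loss (probability times consequence) against the annual cost of a safeguard. Every number in this sketch is invented.

```python
# Sketch of a risk manager's cost-justification test; every number is invented.
annual_probability = 0.002        # guesstimated chance of the event in a given year
loss_if_it_happens = 50_000_000   # guesstimated loss (dollars) if the event occurs
mitigation_cost = 75_000          # annual cost of the proposed safeguard

expected_annual_loss = annual_probability * loss_if_it_happens
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
if mitigation_cost < expected_annual_loss:
    print("On these guesstimates, the safeguard is cost-justified.")
else:
    print("On these guesstimates, the safeguard is not cost-justified.")
```

Notice how much weight the guesstimated probability carries in that comparison -- which is exactly where rare events leave us guessing.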

It was this second type of risk that I focused on when I wrote about the example of a piece of a church facade falling and striking a pedestrian. That example helped me understand how, in advance of an event, it may seem hard to justify devoting a lot of resources to worrying about it; but after the event occurs, its importance becomes all too clear and unavoidable.

How might this be helpful in informing conversations about nuclear weapons risk? Well, it does seem helpful to recognize that nuclear weapons risk belongs with the second category of non-frequently occurring events, not with the first category of frequently occurring, actuarial, insurable risks.

But even that distinction does not go far enough. The point came home to me the other night when I watched Errol Morris's documentary about Robert McNamara, The Fog of War. In it, McNamara stresses the point: "With nuclear weapons, there's no learning period." In other words, if and when nuclear weapons are used, there will be no second chance.

This led to an epiphany for me: nuclear weapons risk does not occupy a place within the spectrum of all other risks. It occupies its own unique place. Yes, it is like the other non-frequently occurring events, but it is also different in an important way: the consequences are world-ending. (There was a time when it was fashionable to talk about surviving nuclear war -- contemplating, in Herman Kahn's phrase, a range of "tragic but distinguishable postwar states" -- but most people have now shed that illusion.)

So: what to do about this unique risk?


*   *   *


It seems to me that the peculiar feature of this risk is that the consequences are so outsized that they obviate any value in trying to suss out the likelihood. For once, we can all agree that something is unknowable.

For instance, I may believe that, under the current circumstances, nuclear war could happen in the next ten years. Another person may believe it could very well happen within one year. Yet a third might say, "The best estimate is that there will be one occurrence in 1,000 years."
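
Just to see how far apart those positions really are, here is a small, purely illustrative calculation of what each guesstimate implies over a 30-year horizon, assuming a constant annual probability (the percentages I assign to the first two positions are my own readings of them). The point is not that any of these numbers is right; it is how wildly they diverge.

```python
# What divergent guesstimates imply over a 30-year horizon (illustrative only;
# the annual probabilities are my own readings of the three positions above).
horizon_years = 30
annual_probabilities = {
    "could very well happen within one year (read as ~50%/yr)": 0.5,
    "could happen in the next ten years (read as ~10%/yr)": 0.1,
    "one occurrence in 1,000 years (~0.1%/yr)": 0.001,
}

for label, p in annual_probabilities.items():
    # Chance of at least one occurrence over the horizon, given a constant annual probability.
    cumulative = 1 - (1 - p) ** horizon_years
    print(f"{label}: {cumulative:.1%} over {horizon_years} years")
```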

But the magnitude of the consequences should make it possible for us to set aside our different guesstimates and focus on the intolerability of the outcome.

(And -- funny thing -- once we set aside our guesstimates it becomes possible to admit to ourselves how little confidence we can have in anyone's assurances about the likelihood/unlikelihood of nuclear war.)

Truly, this is unlike any other risk.


*   *   *


By the way, it might help if this unique risk had a name. For now, I will call it Kappa Risk (after the Greek letter kappa, Κ).

Think of it as the apex or "cap" -- the most outstanding of all risks.

Kappa also happens to be the first letter of Καιρός (Kairos, or Caerus), the Greek god of opportunity -- the fleeting, decisive moment that must be seized.

(Greek letters are used heavily in risk management, though K does not (yet) have a prominent role.)


*   *   *


I guess there is the problem of what to do about people who think there is zero probability of nuclear weapons actually being used -- "It will never happen." In other words, people who believe Kappa Risk doesn't exist.

Perhaps that is a topic for another day.


*   *   *


Even better than a name for this type of risk, I suppose, would be a picture or symbol. If 2018 has taught me anything, it is that an emoji is worth a thousand words.

How to sum up visually the idea of a one-of-a-kind risk, one whose consequences truly threaten to end our world, and whose likelihood is practically unknowable (but certainly real)? Something short and sweet -- representing the need to put this risk squarely on the table and then move forward to eliminate it?

Herewith, a proposal:


Design for an emoji: Kappa Risk -- the one-of-a-kind risk that characterizes nuclear weapons: consequences that truly threaten to end our world, and a likelihood that is practically unknowable (but certainly real). (Image: Joe Scarry)


(With apologies to The Emoji Movie.)

Now: in what ways might we be able to better accomplish our work as citizens once we can converse clearly about the singular risk of nuclear weapons?
