Worst Case Thinking

From this article by Bruce Schneier, a famous name in the security community:

At a security conference recently, the moderator asked the panel of distinguished cybersecurity leaders what their nightmare scenario was. The answers were the predictable array of large-scale attacks: against our communications infrastructure, against the power grid, against the financial system, in combination with a physical attack.

I didn’t get to give my answer until the afternoon, which was: “My nightmare scenario is that people keep talking about their nightmare scenarios.”

It’s an interesting article. His point, which he makes often and which I agree with, is that people tend to be blind, stupid, panicky creatures who are incredibly bad at evaluating risk.

14 Responses to “Worst Case Thinking”

  1. Dan says:

    I had my first university-wide network security meeting a couple weeks ago (Whoo-hoo! I’m a real IT guy now…I just need a goatee). At one point the boss-man went over a couple things he called “extreme circumstances” and he named a couple actual, practical problems (power outages with generator failure and nasty worms more than anything) but then his underlings began getting ridiculous.

    I fought the urge, but I really, really wanted to raise my hand and say, “Personally I think a Resonance Cascade Scenario would be the most devastating thing to hit a campus.” Sigh…they say you only regret what you do not do.

  2. Kevin says:

    Resonance Cascade Scenario? Do we have coverage for that? How much would that run us Dan?

  3. Pitt says:

    On a similar note, there was a Frontline exposé the other night on commuter airlines. While a lot of what they “exposed” was pretty questionable (lack of sleep, poor pilot training, maintenance issues), the fact remains that there are something like 25,000 flights every day, each one with an average of 50 to 100 people on board, and there’s a deadly crash about once every two years or so. That’s odds of around 1 in 18 million of being on that flight. Your odds of dying in many other ways (many of which we accept without a second blink) are far, far greater.
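    (The arithmetic behind those odds does check out; a quick sketch in Python, using the figures quoted above:)

```python
# Rough check of the commuter-airline figures above: ~25,000 flights a day
# and one fatal crash roughly every two years.
flights_per_day = 25_000
days_between_fatal_crashes = 2 * 365  # "about once every two years or so"

flights_per_fatal_crash = flights_per_day * days_between_fatal_crashes
print(f"about 1 in {flights_per_fatal_crash:,} flights")  # about 1 in 18,250,000 flights
```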

  4. Dan says:


    I think of the flu. 30 million cases per year, roughly 36,000 deaths per year. Even if you don’t take into account which demographics die of the flu (mostly people who are already sick with something else and/or very elderly), you’re still talking about a 0.12% mortality rate. Yet we see the fear mongering every year.
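    (The rate follows directly from those two numbers; a one-line check in Python:)

```python
# Checking the flu figures quoted above: deaths divided by cases.
cases_per_year = 30_000_000
deaths_per_year = 36_000

mortality_rate = deaths_per_year / cases_per_year
print(f"{mortality_rate:.2%}")  # 0.12%
```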


    Just give Doctor Freeman a crowbar and it will cost us exactly one crowbar.

  5. matt says:


    Twice as many people die from misuse of home appliances every year as die in commercial aviation disasters.

    That’s why I insist on running people through a metal detector before they can use my toaster.

  6. BrianN says:

    OK, just so we have an argument here, I’ll agree with you as long as you have never said any of the following:

    “The government should have foreseen the banking meltdown”

    “The FBI should have caught Atta” (or any other 9/11-prevention argument)

    “Toyota should have known about its braking problems, and the government should have been watching them.”

    “There shouldn’t be any offshore oil drilling…”

    Etc. The problem is that there is a tendency to argue about things in hindsight. OK, the flu wasn’t that bad, but the data supporting that didn’t come out until a response was well underway. If it had turned out badly, and no response was made, people would be shouting for someone’s head.

    The other fallacy is to compare completely different types of risk. This is often compounded by taking statistics that apply to whole groups, and not considering what your actual risk is:

    “You have a better chance of getting hit by a car than falling off a mountain”

    That may be true, but A) it doesn’t mean you shouldn’t consider the risks of mountain climbing, and B) you don’t know whether that statistic compares your chances of getting hit by a car with your chances of falling off a mountain if you never climb a mountain, are an avid mountain climber, or are trying mountain climbing for the first time on your own without proper instruction.

    I think people overreact to risk they don’t know how to manage. You have a sense of what your chances of getting hit by a car are because every day millions of people cross the street and some get hit by cars; there is no reason to think that is going to change today. Single catastrophic events are harder to understand because they are rare, and there is simply less data to help you predict.

    What makes it worse is when there is a perception that a new situation has arisen (such as the power grid now being accessible from the internet, a new group of crazies making threats, etc.). I’ve walked through Seoul with no fear of being hit by artillery, but if I were over there today, I might have a little irrational fear of that.

  7. matt says:


    For some reason, security people have this weird one-upmanship ritual where they try to come up with more and more obscure, improbable, and paranoid scenarios to prove how “thorough” they are being. I think you may have gotten a glimpse of that.

    I worry about things like fires, theft, and some idjit putting a ditch witch through one of our inter-building fiber conduits. I don’t worry about meteors, terrorists, or the coming zombie apocalypse. I’m not going to worry about whether people can get their email in a disaster scenario that wipes out every building on campus – frankly, if the College doesn’t exist any more, I don’t think that getting your Daily Dilbert is really that important.

  8. matt says:


    There’s a phenomenon (which I think is called the “recency effect”) where people tend to evaluate the likelihood of a risk based on how recently it happened or how recently they learned about it. It’s the reason people throw out all of their oven cleaner or whatever after one of those late night news “exposes”.

    It seems to throw a hell of a wrench in the works.

    Frankly, our risk evaluation mechanisms are optimized for the world that we evolved in as a species, not this modern one we’ve built.

  9. Dan says:


    I won’t disagree with you that a lot of skepticism is in hindsight. Stuff like the oil spill and the market crash are, in fact, worst case scenarios that have happened.

    I guess my point is that there should be a select few who worry about it, and they’re the ones that have uber-failed, and because of *them* the public has a reason to be worried. There is an upper echelon of people whose entire job, entire life is to worry about the intricate workings and when they don’t do their jobs *at all*.

    The entire point of the SEC was to watchdog against the worst case scenario so that the everyday person doesn’t have to lose sleep about it. What good is a watchdog that is too busy licking its balls to bark about an intruder?

  10. Dan says:

    *grammar fail

  11. BrianN says:


    See, I think what really happens in the ‘recency effect’ is that something happens to make you re-evaluate your assumption of risk. An exposé might bring forward cases of some unlikely event you never thought of, and force you to re-establish your understanding of the risk. If I didn’t know you could die of exposure to oven cleaner, and then heard about several people who did, it might take some time for me to re-establish that this is a rare event. Furthermore, the characteristic of an exposé is to concentrate many rare events into a seemingly small amount of time in order to increase the perceived risk.

    I would disagree that our innate risk management was evolved for a different time. I think rather that our cognitive ability gives us extra tools for risk management, which are the product of education and culture. However these tools operate much more slowly and are at the level of large groups rather than instinctively and at an individual level. Without the latter, I don’t think you’d find it easy to go about your daily life.


    Yeah, apparently letting other people manage risk for you seems like a bad idea. Maybe it’s better when more people do think about risk, rather than assuming that governments have it all covered.

  12. matt says:


    Exactly. Here’s an example – a month or two ago, Dean’s crib was recalled. Out of the hundreds of thousands of them that had been sold, a dozen or so had collapsed.

    Intellectually, it is obvious that the crib was no less safe than it was the night before the recall was announced. But because this reevaluation took place, Sue immediately returned it – that very day – and didn’t even let Dean nap in it that afternoon.

    Our cognitive tools for risk management are fantastic. Unfortunately, there’s an instinctual, visceral side to it that often conquers the rational aspect and makes us fear things like terrorism, plane crashes, and meteor showers while ignoring statistically more likely causes of death like smoking and chili fries.

  13. BrianN says:

    Yeah, with children it’s a lot worse; you and your wife alone are responsible for everything that happens to another human being. I mean, whatever he does to himself for the next 14-17 years is pretty much your fault.

  14. Pitt says:

    mmm…chili fries.

    Good thing it’s not Lent.
