Humans are Tone Deaf to Probabilistic Reasoning

In his book The Blank Slate, Steven Pinker makes an interesting point: our brains haven’t evolved to intuitively grasp probabilities. Probabilistic reasoning is an acquired skill like reading, not something innate like walking.

This can lead to scenarios like the following:

Some experts are describing the risk of an accident at a nuclear waste storage site. They describe various conceivable sequences of events that could lead to failure: accidental drilling in the wrong place, erosion, or undetected cracks in the rock, any of which could lead to groundwater contamination. Then natural water movement, volcanic activity, or a large meteorite impact could damage the site and release radioactive materials into the biosphere. The experts estimate the probability of each event, or each chain of events, and numbers like 1 in X million come up.

To the experts, this is reassuring. They are basically saying: “This is pretty damn safe.” But they are speaking a different language from most people.

As Pinker says:

“When people hear these analyses, however, they are not reassured but become more fearful than ever — they hadn’t realized there are so many ways for something to go wrong! They mentally tabulate the number of disaster scenarios, rather than mentally aggregating the probabilities of the disaster scenarios.”
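The difference between tabulating scenarios and aggregating their probabilities can be sketched numerically. The per-scenario odds below are hypothetical, chosen only for illustration; the point is that several one-in-a-million-scale scenarios still combine into a small overall probability, not a large one:

```python
# Hypothetical per-scenario failure probabilities (drilling, erosion,
# undetected cracks, meteorite impact). Numbers are illustrative only.
scenario_probs = [1e-6, 5e-7, 2e-6, 1e-7]

# People tend to "count scenarios": four ways to fail sounds scary.
# The experts instead aggregate: P(at least one failure), assuming
# the scenarios are independent.
p_none = 1.0
for p in scenario_probs:
    p_none *= (1.0 - p)
p_any = 1.0 - p_none

print(f"{p_any:.2e}")  # 3.60e-06, i.e. roughly 1 in 280,000
```

Four frightening-sounding scenarios, aggregated, still amount to odds on the order of one in hundreds of thousands, which is the reassurance the experts are trying to convey.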

How to make it better
How can we improve the situation, in the hope that more rational decision-making will take place? The only realistic short-term solution seems to be education. Just as we have no innate cognitive organs for abstract mathematics or reading but can still learn to use our brains to acquire these skills, we should try to make probabilistic thinking more widespread.

We don’t expect people who have never studied physics to be any good at it, so why do we expect that decision-makers (and the voters/shareholders who support them) will be able to understand probabilities – especially in complex situations – when they weigh the pros and cons of some proposal?

11 Responses to “Humans are Tone Deaf to Probabilistic Reasoning”

  1. Gene Says:

    What I find interesting about this particular description is that the analysis of risk by these experts, as described, seems to be based on next to no actual data, merely a finger in the air. A lifetime of experience may make them good judges of risk in situations with which they are familiar, but leaves them as unsuited to judge risk in any other situation as your average, non-physics-studying individual.

    As with the executives and risk managers at most of the top financial firms – who could not conceive of the series of “unlikely” events that would need to hit to land us in the situation we are in now – expert opinion should be questioned, and should not necessarily be trusted any more than the average person’s estimation of risk.

  2. Michael Graham Richard Says:


    Of course nobody can predict the future, but depending on the field, experts can often make informed guesses which are superior to “finger in the air” guessing.

    E.g., a geologist might figure out the average frequency of earthquakes and volcanic eruptions in certain areas, study erosion rates, etc.

  3. ekzept Says:

    The problem is that, because extreme events are by definition on the far tails of distributions, it’s impossible to calibrate them using empirical data. The required events (almost) never happen.

    Thus, there’s no choice but to construct the probability of the extreme event from a conceptual model of the apparatus involved.

    So, if you consider this to be “based on next to no actual data, merely a finger in the air”, it is necessarily so.

  4. Rob Says:

    While I agree with the message of the article, I think Gene has a point about the example given. When an ‘expert’ quotes the odds of a particular place being hit by a meteorite, the figure he gives is meaningless; it is unquantifiable. He may as well just be honest and say ‘as close to zero as makes no difference’. As for earthquakes, anybody can say that New York is less likely to experience an earthquake than San Francisco, but a nuclear plant is likely to be built in an area with no history of seismic trouble at all, and that puts us back in ‘meteor strike’ territory, i.e. nobody – not the best seismologist team in existence – has a clue, but they do know the risk is not zero. Basically, any time an expert starts quoting odds in the multi-millions he has entered the realm of bullshit; he is just unwilling to say what he really feels, i.e. ‘zero’.

  5. Michael Graham Richard Says:

    I find it interesting that everybody seems to focus on the specific example given instead of on the general principle. I wonder if that would have happened if the example had been about something with higher odds (a car accident, murder, falling down the stairs, etc.).

    In any case, the fact that odds for something that happens very rarely are less precise than odds for something that happens often is beside the point; even those fuzzy odds (and we can often quantify the margin of error too) are better than no odds at all if the goal is to make the most informed decision possible.

    In a way, this might actually reinforce the point of this post: some people seem to see things in a binary way – either the experts are right or they are wrong – while in fact it amounts more to a probabilistic estimate of the quality of your information: somewhat unreliable information is still better than very unreliable information.

  6. Ken Trough Says:

    I think there is definitely something to this assertion. I have the ability to see lots of probabilities and possibilities and interconnections and this seems to be a skill that those around me tend to lack. In fact, this is a big reason why people label me “smart” (whatever that really means), so I think it is generally perceived as a rare-ish skill by the general population.

    Very interesting stuff. Thanks for drawing attention to this.

  7. jes Says:

    As much of our intuitive ability is based on embodied cognition (Lakoff), it makes sense that we would have no more intuition about probabilities than can literally be gained from simple child’s play and experience.

    It would be interesting to trace out which learned, embodied metaphors are the basis for the types of probability we *are* good at, and then examine how we might either 1) map our badly intuited cases onto existing metaphors (essentially forming a systematic teaching method for important but hard-to-intuit probabilities) or 2) construct alterations to childhood environments that could generate the metaphors that would aid our badly intuited probabilities.

    I’ll invoke the names “Nassim Taleb” and “Black Swan” only this once, but he gives a simple example of why we might be bad at estimating rare probabilities: we may rely on intuitions based on embodied metaphors of area (R = p * C, after all), which break down when the area forms a long, skinny rectangle, because we are bad at estimating physical dimensions that vastly exceed the scope of our own bodies. We can estimate “yards”, “feet” and “cubits” well because these are based on our body lengths, but longer distances are not.

    The probability of most interest is probably a power-law probability. This can be “reduced” to a straight line with a logarithm, but we don’t have a natural intuition for logarithms – there is no embodied metaphor that captures them. In fact, there is no simple embodied metaphor for multiplication or division either (Lakoff co-wrote a book on mathematics and embodied metaphors – I don’t have my copy handy). Part of the problem is the simple visualization of “powers of ten”, but it also needs a linkage to “cause-and-effect” and “time arrow” metaphors in some way. Again, our intuitions of time are constrained to scales as narrow as those of length: realistically, between about 1 second on the low end and a few minutes for most people, at most an hour on the population tail – everything outside this range is “tool-aided”, abstracted, learned understanding.
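    A toy calculation makes the risk-as-area metaphor above concrete. The numbers are hypothetical (powers of two, so the floating-point arithmetic is exact): two risks with identical expected harm R = p * C, one a “square” and one a long, skinny rectangle.

```python
# Risk as "area": R = p * C (probability times consequence).
# Hypothetical numbers, chosen as powers of two so products are exact.
p1, c1 = 0.5, 2.0           # "square" risk: coin-flip odds, modest harm
p2, c2 = 2.0**-20, 2.0**20  # long, skinny rectangle: ~1-in-a-million odds, huge harm

r1 = p1 * c1
r2 = p2 * c2
print(r1 == r2)  # True: equal areas, yet intuition treats the two very differently
```

    The two areas are identical, but the second rectangle is so elongated that intuition, calibrated to body-scale dimensions, fails to judge it.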

  8. Bottlerocket Says:

    I agree. I don’t agree with one aspect of your thesis, however: namely, that “training in physics” somehow immunizes a person from errors in probabilistic reasoning. Scientific training helps with, but in no way assures, correct reasoning skills. You should refer to the work of Daniel Kahneman and Amos Tversky for more on this topic. Or check the research on medical errors: you’ll find that medical training in no way immunizes doctors from errors in probability estimation.
    They are psychologists whose work not only points out the difficulties people have with probabilistic reasoning, but also the reasons why people have those difficulties. My favorite is the availability heuristic – the more vivid an event is, the greater the weight we give it, because its vividness makes the event highly available in memory. Kahneman won the Nobel Prize in Economics for his work. I think it would add to your argument.

  9. Michael Graham Richard Says:

    “I agree. I don’t agree with one aspect of your thesis, however: namely, that ‘training in physics’ somehow immunizes a person from errors in probabilistic reasoning.”

    Sorry if I wasn’t clear. I meant that we don’t expect someone who hasn’t studied physics to be any good at PHYSICS, so why do we expect someone who has never studied probabilistic thinking to be any good at that.

    I’ll check out the works you suggested, thanks. I’m very interested in cognitive biases and heuristics. I actually have “Judgment Under Uncertainty” (the Kahneman/Tversky/Slovic book) on my desk, but I haven’t read it yet. I also have “Rational Choice in an Uncertain World” by Dawes.

  10. Max N. Says:

    Given the possible consequences, evaluating nuclear energy’s safety purely on the basis of the probabilities of an accident is not “rational”, it’s *foolish*.

    For starters, you are not factoring in things like human greed (watch “The China Syndrome” again). That aside, a low probability of extremely dire consequences is still not a good deal in the mind of a truly rational person.

  11. Michael Graham Richard Says:

    Hi Max,

    Thanks for commenting. This post wasn’t about nuclear power, that was just an illustration, but anyway…

    “Given the possible consequences, evaluating nuclear energy’s safety purely on the basis of the probabilities of an accident is not “rational”, it’s *foolish*.”

    It’s all about opportunity cost. Nothing has no consequences. No nuclear might mean lots of coal for baseload generation, which leads to global warming, mercury poisoning of the food chain, smog, pollution caused by mining, etc.

    “That aside, a low probability of extremely dire consequences is still not a good deal in the mind of a truly rational person.”

    Depends. It might be better to take that small chance, and work at making it even smaller, than to take a much higher chance of hurting and killing people.

    For example, global warming, smog and mercury poisoning of the food chain caused by coal plants might very well hurt tens if not hundreds of millions of people over the next century, yet we’re building coal plants every week around the world. Is that more rational?
