Default to safe
Decision scientists study rare events by bringing people into the lab and asking them to make choices. For example, in the work for which Daniel Kahneman later won the Nobel Prize, he and Amos Tversky had people choose between two options: one safe, one risky.
A typical choice might involve a safe option where you'd walk away with $5, guaranteed. Alternatively, you could choose to take a gamble and receive $15 with 90 per cent probability. However, if you lost the gamble, you would have to pay $35.
If you'd just take the $5, you're not alone. Even though the gamble is clearly better in terms of what you would win on average (0.9 × $15 − 0.1 × $35 = $10), the possible loss of $35 looms so large in the mind that many of us choose the safe option.
In this scenario, the loss of $35 is a relatively rare event: it will only occur 10 per cent of the time. Yet we treat the rare event as if it were much more likely to occur than in reality. Kahneman and Tversky termed this the "overweighting" of small probabilities.
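The arithmetic behind the gamble can be checked in a few lines of Python. The amounts and probabilities below come straight from the example above; this is just a minimal sketch of the expected-value comparison:

```python
# Expected value of the gamble described above:
# win $15 with probability 0.9, lose $35 with probability 0.1.
p_win, win_amount = 0.9, 15
p_lose, loss_amount = 0.1, 35

expected_gamble = p_win * win_amount - p_lose * loss_amount  # 13.5 - 3.5 = 10.0
safe_option = 5

print(f"Expected value of gamble: ${expected_gamble:.2f}")  # $10.00
print(f"Guaranteed option:        ${safe_option:.2f}")      # $5.00
# On average the gamble pays twice as much, yet many people take the $5.
```

The rare loss only occurs 10 per cent of the time, but in people's choices it is weighted as if it were far more likely.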
Of course, real-world rare events, such as disease outbreaks, shark attacks and terrorist attacks, are much more complex than this fictitious gamble. But from a purely statistical point of view, we may be disproportionately worried about such events, given their rarity.
For example, a poll conducted by Chapman University in the United States found that 38.5 per cent of respondents were "afraid" or "very afraid" of being a victim of terrorism.
This is despite the fact that only 71 people in the US were killed by terrorism between 2005 and 2015. To put that into perspective, PolitiFact reports that 301,797 people have died from gun violence in the US during a similar period.
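Using only the two figures quoted above, the size of the gap is easy to compute (a quick sketch; the numbers are the article's, not fresh data):

```python
# Ratio of US gun-violence deaths to terrorism deaths over roughly the
# same decade (2005-2015), using the figures quoted above.
terrorism_deaths = 71
gun_deaths = 301_797

ratio = gun_deaths / terrorism_deaths
print(f"Gun deaths per terrorism death: {ratio:,.0f}")  # roughly 4,000-to-1
```

A difference of more than three orders of magnitude, yet it is terrorism that ranks among people's most common fears.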
So is it fear that drives us to believe that rare events are likely to happen?
According to David Landy, a researcher at Indiana University who spoke on this issue at the 2016 meeting of the Society for Judgment and Decision Making, the answer is no.
One question in Landy's survey asked people to estimate the proportion of the US population that was Muslim. The true proportion is slightly less than 1 per cent. People's estimates tended to be higher, around 10 per cent.
This overestimate is often interpreted in terms of fear.
The idea is that people are more likely to pay attention to things that scare them, and this leads them to believe they are more common than they really are.
The "fear" explanation is intuitively plausible, but it may not be true.
In a critical comparison, Landy also asked people to estimate other quantities that were similarly small, but unlikely to provoke fear (such as the proportion of the US population that had served in the military).
It turned out that people overestimated these rare but unthreatening quantities too. In fact, the degree of overestimation was practically identical to that for the Muslim population.
Landy's result suggests that we simply have trouble thinking about small probabilities, regardless of the topic. It may not be that people overestimate the proportion of Muslims out of fear; rather, it seems we overestimate the incidence of any rare event.
How to think about rare events
So how should we think about and respond to rare events?
One remedy might be to use what some researchers refer to as "meta-cognitive awareness". This is being aware of how cognitive processes, like memory, work when we try to think about and estimate the frequency with which things happen.
One meta-cognitive cue you might use is how easily a particular event comes to mind, such as hearing about shark attacks. But simply reading off the ease of recall is likely to mislead, because memory is biased towards positive instances: going swimming and not being attacked by a shark is unsurprising, and so not particularly memorable.
This failure of memory to deliver representative samples of evidence suggests a need to think carefully, not only about the bias in memory retrieval, but also in the samples available to us in the world.
Perversely, it suggests that when you want to work out how rare an event is (and an appropriate response), you should try to think about all the times it didn't happen (negative instances) rather than those when it did.
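The point about negative instances can be made concrete with a toy calculation. All the figures below are illustrative assumptions, not real shark-attack statistics; the sketch only shows how counting the uneventful swims changes the estimate:

```python
# Hypothetical illustration: estimating risk from a sample that
# includes negative instances (uneventful swims), not just the
# vivid positive instances that memory serves up.

attacks_per_year = 5           # assumption: attacks in some region
swims_per_year = 50_000_000    # assumption: total swims in that region

rate = attacks_per_year / swims_per_year
print(f"Attacks per swim: {rate:.8f}")
print(f"Roughly 1 attack per {swims_per_year // attacks_per_year:,} swims")
# Memory surfaces the 5 attack stories, not the tens of millions of
# swims where nothing happened -- so the felt rate far exceeds this one.
```

Dividing the handful of memorable events by the vast count of non-events is exactly the step that availability-based recall skips.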
So next time you are at the beach and contemplating taking a dip, just think of the millions of swimmers who have never been attacked by a shark, and not the relatively few who have.
-The Conversation