How We Confuse Real Risks with Exaggerated Ones

Cass Sunstein earns his living researching how misplaced fears skew our ability to assess risk, so he figured he would be the last person to fall into the same trap. But when his teenage daughter planned a long-distance swim last summer, Sunstein found himself dwelling on the remote possibility that she would drown. "It's crazy," says Sunstein, a University of Chicago law professor specializing in risk regulation. "But I couldn't counteract my brain's rapid, intuitive emotional system for evaluating risk."

Few of us can, and that's a dangerous problem. When our emotions overtake our reasoning, we worry about sensational events that are statistically unlikely to harm us, such as airline disasters, shark attacks, or terrorism, rather than everyday dangers that kill thousands. John Graham, who spent four years as administrator of the federal Office of Information and Regulatory Affairs, says news of SUV tire failures left him besieged with demands for tire-pressure warning systems, even though government reports attributed 41 car-crash deaths per year to under-inflated tires, versus 9,800 deaths from side-impact crashes. "People's capacity to visualize a risk is an important part of the attention they give to it," says Graham. "If you're within six months of a Three Mile Island, a Love Canal, or a 9/11, the policymakers and the public don't have the patience for the kind of cerebral risk analysis we need."

That falls in line with what Princeton professor Daniel Kahneman termed the "availability heuristic": the idea that if people can recall an incident in which a risk came to fruition, they will exaggerate its likelihood. "Somehow the probability of an accident increases [in one's mind] after you see a car turned over on the side of the road," says Kahneman, who won the 2002 Nobel Prize in economics for his work. "That's what availability does to you: it plants an image that comes readily to mind, and that image is associated with an emotion: fear."

But our experiences also sway us, goading our brains into assessing risks based on rapid whispers of positive or negative emotion. "If you look at genocide, we just don't react," says Paul Slovic, a psychology professor at the University of Oregon. "With 9/11 we lost 3,000 people in one day, but during 1994 in Rwanda 800,000 people were killed in 100 days — that's 8,000 a day for 100 days — and the world didn't react at all. Now you see the same thing with Darfur."

Nassim Taleb, a probability expert at the University of Massachusetts, says the first step toward better risk assessment is understanding that most dramatic news images represent the exception rather than the rule. "Television," he says, "messes up the probabilistic mapping you have of the world." Our shaky math skills don't help either; most people have trouble grasping the difference between one chance in 1,000 and one chance in 10 million. "Both sound small," says Graham, "but one is ten-thousand-fold more likely." Understanding those numbers, rather than taking what Sunstein calls a "risk-of-the-month" approach, will save lives. "Right now we've got a lot of concern about vivid events," Sunstein says. "We'd do much better with a more disciplined approach."
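To make Graham's arithmetic concrete, here is a quick back-of-the-envelope check (a minimal Python sketch, not part of the original article) comparing the two probabilities he cites:

```python
# Compare the two risk levels Graham cites:
# one chance in 1,000 versus one chance in 10 million.
p_high = 1 / 1_000        # one chance in a thousand
p_low = 1 / 10_000_000    # one chance in ten million

ratio = p_high / p_low
print(f"The first risk is {ratio:,.0f} times more likely.")
# -> The first risk is 10,000 times more likely.
```

Both denominators "sound small," as Graham puts it, yet the ratio between the two risks spans four orders of magnitude.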