Thursday, December 30, 2004

The Slings and Arrows of Outrageous Fortune

My friend Doug sent me this article by law professor Cass Sunstein on risk analysis.

The truth is that, when it comes to risk, people often think poorly. Research shows that much of the time we fixate on bad outcomes without stopping to assess the probability that we will actually be harmed.

Sure, we want to be "safe" and "protected," but safety and protection are inevitably matters of degree. Often, we neglect the size of the risk altogether.

Consider the astonishing finding, from University of Pennsylvania economist Howard Kunreuther and his colleagues, that many people will pay the same amount for insurance against risks of 1 in 100,000, 1 in 1 million and 1 in 10 million. We don't have much experience in thinking about low probabilities like these, and so we pay no attention to differences that really should matter.

When our emotions are engaged, our judgment gets even more muddled. We focus on what looks like the worst case, giving no thought to the likelihood that it will occur. Vivid, dramatic images of harm -- hazardous waste sites, nuclear accidents, terrorist attacks -- can lead us to excessive fear of highly improbable risks. Social scientists find that when people discuss such risks, their concern usually rises, even if the discussion consists mostly of trustworthy assurances that the likelihood of harm is tiny.

But when we lack vivid images -- as in the case, say, of obesity or sun exposure -- we often treat the risk as if it were zero. The result is that we badly overestimate some risks and underestimate others.

Studies by psychologist Paul Slovic prove the point. Because grisly accidents are more dramatic than deaths from disease, most people think that accidents kill more people than disease. But the opposite is true. (emphasis added) |Link|
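To see why those differences "really should matter," here's a quick back-of-the-envelope calculation. The $100,000 loss figure is my own assumption for illustration, not a number from Kunreuther's study; the point is just that an actuarially fair premium scales linearly with the probability, so these three risks differ by a factor of 100.

# Sanity check on the Kunreuther finding quoted above: what an actuarially
# fair premium would be for a hypothetical $100,000 loss at each probability.
# (The $100,000 figure is my own assumption, purely for illustration.)

loss = 100000.0  # hypothetical insured loss, in dollars

for label, p in [("1 in 100,000", 1e-5),
                 ("1 in 1 million", 1e-6),
                 ("1 in 10 million", 1e-7)]:
    fair_premium = p * loss  # expected loss = probability x loss
    print("%s: $%.2f" % (label, fair_premium))

# Prints:
# 1 in 100,000: $1.00
# 1 in 1 million: $0.10
# 1 in 10 million: $0.01

Paying the same premium for all three is like paying the same for a $1 risk and a one-cent risk.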


Risk analysis is an important topic, and I think this article points out how often we get it wrong as a society. Because we live in a democracy, popular fears and misconceptions are often codified into law.

Canadian sociologist David Lyon in his book Surveillance After September 11 puts risk management into a larger societal context:
Contemporary societies produce risks on a large scale, just because they intervene so decisively in natural and social life, using a range of technologies to do so. Managing risk is now central to government activity. Since the Cold War era in the 1950s and 1960s, the dominant view was that security against the risk of foreign aggression (of Soviet power against the USA) could be guaranteed by technical and military means. Security technologies have proliferated, and with them two central beliefs: one, the idea that "maximum security" is a desirable goal; and, two, that it can be pursued using these increasingly available [security and surveillance] techniques that are on the market. (Lyon 2003, p. 46)


I think Lyon has a provocative thesis that we are a risk-management society. The question is how successful these efforts will be, and what we are willing to trade for security (or the illusion of security) in terms of capital, liberty, privacy, and human dignity.
