6.17.2010

Risk Denialism and the Costs of Prevention


Sol's posts about the BP oil spill (here and here) got me thinking a little bit about the interplay between denialism and risk and how deeply related that is to the sprawling mess of concepts we put under the umbrella of "sustainable development."

One of the major themes in the coverage of the spill has been how poorly equipped BP was to deal with a spill of this magnitude. Now, regardless of your opinion of BP as a company, ex post this is a bad thing. BP would much rather, right now, be known for its quick, competent and effective response to a major catastrophe than be roundly (and, for that matter, rightly) vilified. They've lost about half their market cap since the spill happened, and now they have to set up a $20 billion cleanup escrow account. Why weren't they better prepared?

There are a few potential answers to that. The spill's magnitude may simply have been completely unforeseeable, a "Black Swan"-style event that BP can be forgiven for not anticipating, the same way New Yorkers can be forgiven for not buying tornado insurance. Or perhaps, net of prior cost-benefit analysis, the probability of a spill this big was so low compared to the cost of maintaining intervention equipment that BP decided to skimp on it, akin to how most New Yorkers spurn flood or wind insurance despite the fact that hurricanes intermittently hammer the city. But the answer that now seems most likely is that there was a fundamental disconnect between what the rig workers told upper management (internal BP documents referred to the well pre-spill as a "nightmare") and what upper management told them to do. This is akin to New Yorkers not buying renter's insurance after they've been told the burglary rate in their neighborhood is quite high.
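As an aside, that second story is just an expected-value comparison, and it's easy to make concrete. Here's a minimal sketch in Python; every number in it is a hypothetical placeholder picked for illustration, not anything from BP's actual books.

```python
# Hypothetical expected-value comparison for a low-probability catastrophe.
# All figures are made-up illustrations; none come from BP.

annual_spill_probability = 1e-4        # assumed chance of a major blowout per year
spill_cost = 20_000_000_000            # assumed total liability if it happens ($)
mitigation_cost_per_year = 50_000_000  # assumed cost of standby response capacity ($)

expected_annual_loss = annual_spill_probability * spill_cost

# Under these assumptions, skimping looks "rational" ex ante:
# $2M in expected losses vs. $50M/year in prevention spending.
print(f"Expected annual loss:   ${expected_annual_loss:,.0f}")
print(f"Annual mitigation cost: ${mitigation_cost_per_year:,.0f}")
print("Skip mitigation" if expected_annual_loss < mitigation_cost_per_year
      else "Buy mitigation")
```

Of course, the point of everything that follows is that the probability plugged into a comparison like this isn't an objective given; it's precisely the quantity that cost-bearing seems to distort.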

What I find interesting about this is how closely the framework of this story jibes with so many of the other narratives in environment and development. A group of technically trained experts warns of a potentially catastrophic risk (climate change, overfishing, pandemic flu) only to have their warnings discarded by cost-bearing decision makers (politicians, corporate executives, voters) who deny that the risk is as great as claimed, or that it exists at all. In these scenarios, it's not that the decision makers wouldn't face massive costs should they turn out to be wrong; it's that there's a big gap between the probability of facing those costs that their behavior implies and the probability their technical advisors report.

Why? Well, cost-bearing seems to have a very strong influence on how someone interprets difficult-to-verify information about risks, especially social or shared risks. In psychology this is known as the defensive denial hypothesis, and there's a fair bit of empirical evidence to support it. The BP managers tasked with running the platform knew the immediate costs of reducing flow, or stopping drilling, or increasing safeguards, and it seems highly likely that this influenced how they interpreted warnings from the rig workers. The same sort of phenomenon seems to occur in a lot of other areas: fishermen are more sanguine about the risks of overfishing, oil executives downplay the risk of climate change, and derivatives traders claim that their activities are nowhere near dangerous enough to warrant regulation. Now, many other factors are clearly at work in all of these, from discounting to strategic maneuvering to cheap talk, but given the genuineness with which deniers of, say, climate change argue their case, it seems difficult to say that they are not at least somewhat personally convinced that their interpretation of the evidence is correct.

Now, that on its own isn't terribly revelatory, but when you combine it with the notion that perceptions of costs can be manipulated as well, you get an interesting result. The more (or less) salient a risk's mitigation cost is made, the lower (or higher) people come to view the probability of that risk. Witness, for example, the repeated attempts to link efforts to combat climate change to personal tax burden by those who think (or claim to think) that it's a bogus risk. The cost may not even need to be real: the Jenny McCarthy-led trend of parents refusing to vaccinate their children (creating the very real risk of polio, measles, etc.) seems to be almost entirely a function of parents being led to believe that there is a potential cost to vaccination (possible autism) that is not supported by scientific evidence. BP's managers faced the immediate and salient costs of risk-mitigation steps that they'd need to justify to higher-ups, and behaved in a way that seems to indicate that they didn't think the risk of a major industrial accident was worth fretting over.

So what? I think the lesson here is that while risk denial is often depicted as stemming from short-sightedness, or ignorance, or political zealotry, it's actually pretty common human behavior. People have preferences and like to align their behavior accordingly, and if that means they have to subconsciously alter their assessments of how dangerous some far-off activity may be, they'll do so. If we are concerned about arresting climate change, or preserving biodiversity, or managing natural resources, then it's important to keep in mind that the way people perceive the incidence of mitigation costs will not only affect their preferences in terms of raw cost-benefit analysis, but also genuinely move their perception of the riskiness of their behavior. If we want political support for efforts to deal with these sorts of risks, it thus seems as important to find, and then emphasize, ways in which the costs can be made low and painless as it is to stress the potential for future damages.
