Having made it this far in the book, you might rightfully surmise that we are not all that optimistic about the ability of people to make good protective decisions when faced with low-probability, high-consequence events. As decision makers, we look too little to the future when thinking about our choices, are too quick to forget the past, and try to overcome these limitations by imitating the behavior of other people who are just as prone to these flaws as we are. In addition, our tendency to be overly optimistic and impulsive, and to choose the status quo when we are unsure about what action to take, creates what could be called a perfect storm of potential decision errors. Our brains, we suggest, are simply not designed to think effectively about how to prepare for rare events that are beyond our domain of experience.
What makes matters worse is that if one looks at the landscape of modern approaches to preparedness, one sees little evidence that attempts are made to acknowledge, much less accommodate, these cognitive limitations. People residing in hazard-prone areas are provided with multipage checklists of preparedness measures they should consider, when in fact they are likely to adopt only one or two. People are urged to invest large amounts in building safer homes, when in fact they are likely to be discouraged by the time, cost, and resources required to get the project started; their mental time horizons for payback rarely extend beyond a year or two. Emergency management offices conduct training exercises designed to simulate disaster response, but these exercises tend to replicate how people will act when they have all their System 2 (deliberative) resources at their mental disposal—not the System 1 (instinctive) processes that will likely rule when a disaster is real. It is thus not surprising that well-intentioned preparedness plans often fail when put to the acid test.
All is not lost, however. While we may not be able to alter how we think, knowledge of the decision biases discussed in this book might nevertheless be leveraged to anticipate how people and organizations may err when deciding how best to prepare for low-probability, high-consequence events. We can then formulate steps to mitigate these errors. We term this approach the behavioral risk audit. Like a financial audit, the behavioral risk audit is designed to provide communities and individuals with a systematic framework for characterizing their state of preparedness for different potential disasters, identifying weak links, and suggesting remedial solutions.
For novel disasters and hazards for which no preparedness plans yet exist (e.g., new pandemics or cyberterrorism), the audit is a tool for anticipating the biases that can arise when people think about the personal risks these disasters and hazards pose to them, their community, and other stakeholders. These biases, in turn, can then become the focal point of planning when preparedness tactics are designed. For well-known disasters and hazards (e.g., flood and storm risk), the audit provides a tool for identifying the tactical weak points of existing preparedness programs when they are put into practice.
The audit departs from existing practice in that it focuses on those who will be preparing for or responding to the hazard rather than on the hazard itself. Standard approaches start by analyzing the objective nature of the risk faced by individuals or communities and the vulnerability of buildings and infrastructure, and then consider protective measures that people might take to mitigate that specific risk. The behavioral audit, in contrast, proceeds in reverse order: It starts by encouraging planners to think first about how individuals in hazard-prone areas are likely to perceive risks and why they might not adopt different preparedness measures. With these constraints in mind, planners can then design preparedness plans that work with rather than against people’s natural decision biases. In this way, the behavioral risk audit draws heavily on the principles of choice architecture, a term coined by Thaler and Sunstein93 that highlights how people can be “nudged” into undertaking behaviors that benefit them by creating decision environments where better (in our case, safer) choices are the ones that come most naturally.
A behavioral risk audit consists of four steps of analysis for a given hazard context.
The outcome of the behavioral risk audit is a problem-solution matrix that gives planners an account of the biases that can lead to distorted perceptions of risk, the ways those misperceptions may manifest as preparedness errors, and possible remedies. The general form of the problem-solution matrix is given in Table 8.1.
In chapter 9, we will give a detailed example of how the audit might be applied in a given context. Note that the analysis is structured as a series of guided questions to be considered by a planning team. For example, consider the bias of simplification (chapter 6), the inherent tendency for people to process only a small subset of the information available about a risk when making preparedness decisions. Here planners would be encouraged first to think about what a simplified view of a hazard might look like from a homeowner’s perspective. A homeowner faced with a hurricane threat, for instance, might focus only on the wind threat posed by the storm. Planners would then be encouraged to think about the kinds of preparedness mistakes such a homeowner might make—such as forgetting to prepare for rain and flood risks. Finally, the discussion would shift to how to overcome such oversights without complicating the decision environment—for example, by recommending single essential actions that vary by location.
Implicit in the behavioral risk audit is the assumption that the biases in the at-risk population will vary by context. Everyone has his own Achilles’ heel when it comes to making decisions about protection. For some people, it is myopia: They live in the moment and struggle to see wisdom in making protective investments whose payback lies in the future, no matter how compelling the appeal or how economically sensible it might seem to others. For others, it is unbridled optimism: No matter how urgent the warnings are, they see risks from hazards as something that will happen to others, not themselves. As such, the output of a behavioral risk audit will not be the recommendation of a single remedy for enhancing preparedness, but rather a suite of measures designed to target the different biases of a population of individuals, each with his own psychic flaws.
Finally, it is important to emphasize that the audit is not envisioned as a one-time exercise, but rather one that is continually revisited and revised as protection plans are developed. In the early stages of planning, for example, the audit provides a tool for envisioning hazards and existing preparedness measures through the eyes of stakeholders, while in later stages, it would be used to assess the sufficiency of existing policies.