Preparedness Paradox


The preparedness paradox is the proposition that when a society or individual acts effectively to mitigate a potential disaster, such as a pandemic or other catastrophe, so that it causes less harm, the averted danger is perceived as having been far less serious precisely because the damage actually caused was limited.

The paradox is the incorrect perception that careful preparation was unnecessary because little harm occurred, when in reality the harm was limited precisely because of that preparation. Several cognitive biases can consequently hamper proper preparation for future risks.

Preparing for a pandemic is a particularly evident example of the preparedness paradox. Because adequate preparation means that no mass deaths or visible consequences will occur, there is no evidence that the preparation for the pandemic was necessary.

Historical perspective can also contribute to the preparedness paradox. In hindsight, some historians have described the preventative action taken before the Year 2000 problem as an ‘overreaction’ rather than a successful effort to prepare for an impending problem. For disaster management professionals, this is an example of a no-win situation.

Organisms with faster life histories and shorter lives are disproportionately affected by chaotic or hostile environments, and innately show a greater fear of environmental disasters or emergencies. Organisms with slower life histories, such as humans, may feel less urgency about such events, even though they have more time and ability to prepare for them.

The term ‘preparedness paradox’ was used in 2017 by management consultancy Roland Berger regarding executives in the aerospace and defense industry: almost two thirds of those surveyed reported that they were well-prepared for geopolitical changes, about which they could do nothing, while feeling unprepared in areas such as changes in technology and innovation, to which they should be much more able to respond. In contrast, other surveys found that boards and financial professionals were increasingly concerned about geopolitical risk. Berger concluded that there was an urgent need for more and better business strategies throughout industry to close this gap in preparedness.

Cognitive biases play a large role in the lack of urgency in preparation, hampering efforts to prevent disasters. These include over-optimism, in which the severity of a disaster is underestimated, and the fact that many disasters do not reach their breaking point until it is too late to take action. Under over-optimism and normalcy bias, people believe that disasters will happen elsewhere, and that even if one does happen locally, only their neighbors will be affected.

Another obstacle to preparedness is the interval between disasters. When there is a long time between disasters, there is less urgency to prepare. This is due to fewer people remembering the last disaster, which reduces its emotional impact on the group. This effect is heightened when some measure of action is taken to prevent the disaster, which further reduces the memory of the original danger and consequences.

Financial concerns can also contribute to the preparedness paradox. There is a tendency to over-value known short-term costs and to under-value unknown long-term rewards. Because preparation is expensive in the short term and its long-term value cannot be determined in advance, the choice not to prepare can seem rational yet lead to catastrophic consequences.
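The cost trade-off above can be made concrete with a simple expected-value calculation. The probabilities and dollar figures below are entirely hypothetical, chosen only to illustrate the reasoning:

```python
# Hypothetical expected-cost comparison: prepare vs. do not prepare.
# All figures are illustrative, not drawn from any real disaster data.

annual_disaster_probability = 0.02     # a 1-in-50-year event
unmitigated_loss = 500_000_000         # damage if unprepared
mitigated_loss = 50_000_000            # damage if prepared
annual_preparation_cost = 2_000_000    # visible, certain short-term cost

def expected_annual_cost(prepare: bool) -> float:
    """Expected yearly cost under each policy."""
    if prepare:
        return annual_preparation_cost + annual_disaster_probability * mitigated_loss
    return annual_disaster_probability * unmitigated_loss

print(expected_annual_cost(prepare=True))   # → 3000000.0
print(expected_annual_cost(prepare=False))  # → 10000000.0
```

In this sketch, preparation is the cheaper policy in expectation, yet in most years the only visible outcome is the certain $2 million cost, which is exactly the perception the paradox describes.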

A specific application of the preparedness paradox is the ‘Levee paradox.’ Levees are structures which run parallel to rivers and are meant to offer protection from flooding. Paradoxically, their construction leads to a reduced awareness of and preparation for floods or breaches. The perception of safety also leads to unsafe land development in the floodplain which is supposed to be protected by the levee. Consequently, when a flood does occur or the levee breaches, the effects of that disaster will be greater than if the levee had not been built.

The 2011 Tōhoku earthquake and tsunami and the resulting Fukushima Daiichi nuclear disaster is another example of the preparedness paradox. Even though the Onagawa Nuclear Power Plant was closer to the epicenter of the earthquake, it withstood the cataclysm because of the preparations made by the facility’s owners. On the other hand, the Fukushima Daiichi Nuclear Power Plant suffered heavy damage because of a lack of preparation due to the perception of less risk.
