The 2020 coronavirus pandemic, for all its horror and challenge, has highlighted certain uncomfortable truths about the human condition. One of these has been the impact of our cognitive shortcomings: our difficulty understanding the non-linear and non-binary, and our susceptibility to cognitive biases.
Many of these shortcomings led to missteps at the beginning of the pandemic response, and they continue to impede our decisions now, still early in the fight. By better understanding these cognitive traps we can at least be more alert to our blind spots and alter our actions in response.
As early data seeped out of China in January, quickly followed by cases appearing in global travel hubs, many national governments, along with their populations, refused to acknowledge the pandemic threat. Even as Northern Italy's health system first bent, then broke – surpassing China's COVID-19 death count a mere 47 days after Italy's first confirmed case – world governments continued to water down the threat.
Humans struggle with exponential growth
We perceive our world in linear terms. Although we live each day with many examples of exponential growth and decay – population growth, the spread of a forest fire, compound interest, student debt, the growth of a single fertilised egg into a human baby – we seem to ignore their exponential nature.1 We are instead surprised at the magnitude and complexity of these phenomena, wilfully ignorant of the exponential journey that brought us to that point.
Governments and populations across the world struggled with the same blind spot in their responses to the 2019 coronavirus. As the number of infected doubled every few days, it should have been obvious that this would quickly lead to an unprecedented health crisis. Exponential processes are force multipliers: on the downside they accentuate bad outcomes, but critically, on the upside they allow early intervention to create disproportionately greater future benefit.
When one person may infect two, three or even more people, each of whom then seeds their own cascading flow of infected contacts, preventing a single early infection may avoid thousands of cases only a few weeks later.
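To make the arithmetic concrete, here is a minimal sketch, assuming an illustrative reproduction number of 3 and a five-day interval between generations of infection (both chosen purely for the example, not fitted epidemiological estimates):

```python
# Illustrative sketch only: how one early infection cascades over a few weeks.
# R and SERIAL_INTERVAL_DAYS are assumed example values, not real estimates.
R = 3                      # assumed: each case infects ~3 others
SERIAL_INTERVAL_DAYS = 5   # assumed: days between generations of infection

def downstream_cases(days: int, r: float = R) -> int:
    """Total cases seeded by one infection after `days`, summing every generation."""
    generations = days // SERIAL_INTERVAL_DAYS
    return sum(int(r ** g) for g in range(generations + 1))

for weeks in (2, 4, 6):
    print(f"After {weeks} weeks: ~{downstream_cases(weeks * 7):,} cases from one infection")
# After 2 weeks: ~13; after 4 weeks: ~364; after 6 weeks: ~9,841.
# Preventing that first infection removes the entire cascade.
```

Whatever the true parameters, the shape of the result is the same: the earlier the prevented infection, the larger the cascade that never happens.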
As deaths in New York alone crossed 10,000, it was estimated that beginning social distancing measures a mere week earlier would have avoided half the total deaths. The UK suffered a similarly catastrophic fate due to testing delays and inaction, and today continues to struggle to control the result.
Because populations do not easily understand exponential consequences – and most governments have struggled to communicate them – there is little impetus to act while the sick are few and death is rare. Instead they wait.
In contrast to the European and North American experience, Australia's earlier intervention appears, so far, to have avoided the expected disaster, exceeding even the most optimistic projections. This appears to be the result of both the non-linear nature of disease transmission and some degree of hysteresis2 arising from rapid behavioural changes in the Australian population.
One of the problems with exponential processes3 is that outcomes are a lagging indicator of what has already happened, particularly when a disease has a long incubation time and a slow severity progression. If you wait for scores of dying patients to arrive in your hospital before acting, a much larger number have already been infected and begun their own downward spiral in the community. This lag between infection and the signal to act is what brings health systems to their knees.
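A rough sketch of this lag, assuming an illustrative three-day doubling time and a three-week interval from infection to death (again, example values only):

```python
# Illustrative sketch only: when deaths first register, how far has the
# epidemic already run? Both constants are assumed example values.
DOUBLING_TIME_DAYS = 3        # assumed early-epidemic doubling time
INFECTION_TO_DEATH_DAYS = 21  # assumed lag from infection to death

# Today's deaths reflect infections acquired ~21 days ago, and infections
# have doubled every 3 days since then.
doublings = INFECTION_TO_DEATH_DAYS / DOUBLING_TIME_DAYS
print(f"Infections today are ~{2 ** doublings:.0f}x those signalled by today's deaths")
# -> ~128x: the dying patients arriving now are a three-week-old signal.
```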
Ignorance of exponential growth and misunderstanding of the nature of a lagging signal led many governments to squander the disproportionately lower cost of early intervention. Most notably, the United States failed to act when the problem appeared superficially small – precisely when small interventions would have avoided asymmetrically greater suffering and death. This became Trump's infamous missed month of February, contributing to otherwise avoidable deaths. At the time these interventions may have appeared excessive, but we now look back at those missed opportunities with quaint innocence.
"Everything we do before a pandemic will seem alarmist. Everything we do after a pandemic will seem inadequate." – Michael O. Leavitt (2007) then Secretary of the U.S. Department of Health and Human Services.
Humans are terrible with uncertainty
Perhaps the most jarring effect of the pandemic has been the uncertainty: the reassertion of reality, cutting through the veneer of our optimistic perception that we live predictable, constant, stable lives.
Discomfort with uncertainty encourages us to embrace biased patterns of thinking, comforted by the false certainty they offer. We tend to believe that things will continue the way they always have. This normalcy bias4 is particularly unhelpful during disaster preparation and response, slowing our acceptance of new information and leading us to prefer inaction over action. Yet a posture toward action is the very thing we need when responding to a planet-sized catastrophe.
Once we are experiencing an emergency or disaster, we can be biased toward wishful thinking, holding an unrealistic optimism that further undermines our response.
Most critical care doctors – anaesthesiologists, intensivists, emergency physicians – have experienced this at least once, early in their careers, irrationally hoping optimism and sheer force of will might avert a bad outcome. We grow out of this thinking very quickly.
"One day it’s like a miracle, [the coronavirus] will disappear.” – Trump, 27 Feb 2020.
We may wish this were true, but the conflict between our wants and objective reality will inevitably reveal itself. The danger of wishful thinking is that, like normalcy bias, it encourages inaction over action. As the reality of the situation diverges from our perception, the longer we cling to unrealistic optimism, the longer we delay the needed response.
Add a dash of confirmation bias5 and motivated reasoning6 – prioritising information and emotional arguments that support our optimism – plus less common traps like the boomerang7 and backfire effects8, where persuasive arguments lead to the adoption of opposing views or contradictory evidence is rejected in a way that strengthens false belief, and it's easy to see how cognitive biases have impeded a cohesive response to this pandemic.
For COVID-19, it's cognitive failings and biases all the way down.
And in a pandemic, biases might just get us killed.
1. Which relates to a question you might have: why the lily pad image heading this article? There's a classic thought experiment used to demonstrate our struggle with the non-linear: lily pads are growing across a pond. On the first day there is only one lily pad, covering only a tiny fraction of the pond's surface. Every day the area of the pond covered by lily pads doubles. How much of the pond is covered the day before it is completely covered? The answer? Only half the pond... ↩
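A minimal sketch of that answer, assuming for illustration that the pond is fully covered on day 30:

```python
# Illustrative sketch only: coverage doubles daily, so stepping backwards halves it.
# FULL_COVERAGE_DAY is an assumed value for the example.
FULL_COVERAGE_DAY = 30

coverage = 1.0  # fraction of the pond covered on the final day
for days_before in range(1, 4):
    coverage /= 2  # one day earlier means half the coverage
    print(f"Day {FULL_COVERAGE_DAY - days_before}: {coverage:.1%} of the pond covered")
# Day 29: 50.0%; Day 28: 25.0%; Day 27: 12.5% -- nearly all the growth
# happens in the final few days.
```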
2. Hysteresis describes situations where the state of a system depends on the history of that system – on what has come before – literally meaning 'lagging behind' in Greek. It is described in many domains, including chemistry, physics, engineering, biology and even economics. In the critical care specialties it is most commonly observed in respiratory physiology as lung hysteresis, where the lung demonstrates different compliance during inspiration versus expiration as a consequence of surface tension. Anaesthesiologists also observe it when monitoring depth of anaesthesia, where the drug concentration required to reach and then maintain a given depth often differs from that required to re-establish the same depth once it has begun to lighten. Nor is hysteresis unknown to epidemiologists, where it is thought to result from population behavioural changes. ↩
3. Note that early in a pandemic, growth is exponential, and this is what we are most interested in when discussing the costs and benefits of delaying action versus acting early. However, the resulting graph of infected population versus time is not an exponential curve: at some point few susceptible people remain to be infected, so the curve begins to flatten, becoming more linear than exponential, and ultimately approaches a maximum value asymptotically. This gives a sigmoid, or logistic, curve. ↩
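As a rough illustration of this shape (the population size and growth rate below are arbitrary example values, not a fitted model):

```python
# Illustrative sketch only: logistic growth looks exponential early, then
# flattens as the susceptible pool empties. Both constants are arbitrary.
POPULATION = 1_000_000
GROWTH_RATE = 0.25  # assumed per-day growth rate

infected = 10.0
for day in range(81):
    if day % 10 == 0:
        print(f"Day {day:2d}: {infected:>9,.0f} infected")
    # New infections are throttled by the shrinking susceptible fraction.
    infected += GROWTH_RATE * infected * (1 - infected / POPULATION)
```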
4. Omer H, Alon N. The Continuity Principle: A Unified Approach to Disaster and Trauma. Am J Community Psychol. 1994 Apr;22(2):273–87. ↩
5. Confirmation bias describes our preference for information that confirms and strengthens our existing beliefs when we search for, recall or interpret information – that is, we are biased toward confirming our existing beliefs even when they are wrong. It also relates to belief perseverance (our tendency for beliefs to persist even after we have been shown contradictory evidence) and the backfire effect. ↩
6. Motivated reasoning is similar to, yet distinct from, confirmation bias: instead of preferring evidence that supports a belief, we employ emotional, non-critical reasoning to reach a desired outcome or decision – we find the conclusions we want rather than those that are most correct or rational. For example, arguing that a nation's exceptionalism means it will avoid a coronavirus disaster, and so deciding against economically damaging social distancing or the cost of increased testing. ↩
7. The boomerang effect, first described more than 70 years ago, occurs when a persuasion attempt instead leads a subject to adopt the opposite position. It may be due to psychological reactance or cognitive dissonance, and is an important psychological consideration in modern public health interventions. ↩
8. The backfire effect describes how an argument or evidence may 'backfire', leading a subject to strengthen their beliefs even though (in fact, because) the presented evidence corrects or contradicts those beliefs. Although popularised in the aftermath of the 2016 US election, the backfire effect appears to be rare at best and has proved difficult to replicate. ↩