Consider this to be my first attempt at defining the Normalcy Bias. This time I’ll put my back into it.
The Normalcy Bias, to me, explains everything in the universe. The Hitchhiker’s Guide to the Galaxy got it all wrong – the Answer to Life, the Universe, and Everything isn’t “42” at all – it’s the Normalcy Bias. I’ll explain – first with words, then with a couple of videos.
My definition (for Part 1): “Even when confronted with overwhelming physical evidence to the contrary, the belief that today will be exactly like yesterday – ‘normal.’” (or that tomorrow will be the same as today)
Part 2 looks like this: “The belief that tomorrow will be exactly like today.” (very similar, but bear with me for a minute)
So even when the entire world is telling you X, you continue to believe in Y (“Y” being normal, “X” being anything else). Here is how thesurvivalistblog.net defines it.
Classic example: on 9/11 there were thousands of amazing stories of heroism, terror, love, loss… and of the Normalcy Bias. Some people who saw the first plane hit the first tower, who felt the heat from the fire, who felt the building shake, paused for a brief moment… and then went back to their desks to check their email and make plans for lunch.
In that instant, there was a mountain of physical evidence screaming “AAHH!! Something horrible just happened!! Do something!!” But since many were incapable of understanding what was happening, through no fault of their own, their only course of action was to get back into normal life and reject all of those inputs. Simply put: it didn’t fit their expected narrative, they couldn’t process it, so it didn’t happen.
Thankfully, many of these people were still able to escape the situation, but only after multiple other inputs (smoke filling the building, some other people leaving, actually seeing fire, etc.) and brave individuals literally taking them by the hand and leading them to the stairs. Amanda Ripley talks about several of these stories in her book, The Unthinkable: Who Survives When Disaster Strikes – and Why. (great book, by the way)
Another classic example that has happened countless times: someone kicks in your front door in a home invasion and your brain immediately goes to “oh, the Girl Scouts must’ve tripped on their way to ringing the doorbell.” Or even better: “those Mormon boys have really been working out.”
So the Normalcy Bias has two parts: Part 1 is when disaster is underway; Part 2 is any potential future disaster. Note: I use the word “disaster” here only to make it easier to explain. But this could also mean “anything out of the ordinary” and doesn’t necessarily have to imply massive casualties or even injury at all. When our new dog jumped on our bed for the first time, made himself at home, and then consumed an entire sock – like it was his job – my first reaction was not a reaction at all. It was a blank stare with no words as my brain tried to process the information (and failed miserably). That’s the Normalcy Bias. I was expecting a normal, sweet dog (like our yellow lab), so when the black lab emerged as the spawn of Satan, I was simply unable to comprehend what was happening.
Part 1 of the Normalcy Bias is generally straightforward. So I’ll expand a little on what I’m calling Part 2. Part 2 happens when we’re faced with evidence that we choose to ignore (or, more accurately, when we choose to just accept the risk). For example, do home invasions and burglaries happen in Colorado? Absolutely they do. Would my family be safer if I had a 14-foot security wall around my house? Absolutely! But for many reasons (cost, what my neighbors would think, whether it would actually raise my profile and attract other crime, the stupid HOA wouldn’t approve it anyway, etc.), my 14-foot security wall idea isn’t going to happen. So, we’ve decided to accept the risk. We haven’t necessarily ignored the inputs, but risk acceptance is a part of this.
That risk acceptance gets to Part 2 of the Normalcy Bias. At the core of that risk acceptance, we’re assuming that it’ll never really happen (that tomorrow will be just like today). We’ve decided that since it’s not worth building the wall, “oh it’ll probably never happen anyway” is the rational conclusion. But if we DID know it was going to happen…
If I did actually KNOW that my house was going to get hit by a tornado on May 14, 2014, as a reasonable person I would act on that knowledge – move out before then, build a big tornado shelter, jack up my homeowner’s insurance, put everything in a storage locker, whatever.
But here’s the problem: we all choose to ignore the possibility of reasonably-likely events (“X”) because accepting the possibility of X would cause us to change our behavior in some way. Like my 14-foot security wall example.
So we’re constantly making these decisions about what’s reasonably-likely to happen and what’s insane. We only have limited resources and limited ability to process doom-and-gloom scenarios.
Here’s another way to look at Part 2 of the Normalcy Bias: if we knew for sure that X was going to happen, a reasonable person would alter their behavior in some way as a result of accepting this information. So if I knew that on Tuesday I was going to get into a car wreck, whether major or minor, as a reasonable person, I’m going to alter my behavior – I’ll either stay home on Tuesday, rent a Hummer for a day, wear a football helmet while I’m driving, something.
[pay attention to this part]
But if I deem those altered behaviors to be too much to handle — if I refuse to let anything get in the way of my perfect hair-do — then I’m likely to just reject those inputs outright.
[letting that last part sink in]
So if I’m too stuck in my ways (or limited by desire or resources), it’s easier to just go through life assuming that tomorrow will always be exactly like today – and that something abnormal will never happen to me. If I assume that nothing will ever cause me to change my behavior, I’m all set!!
And let me be clear: we all fall victim to this… every single day of our lives.
If I refuse to spend the money on a tornado shelter and refuse to waste time thinking about a tornado ever hitting my house, then whatever information I have pointing to May 14, 2014 gets filed in the cylindrical file cabinet in my brain. That generally manifests itself in one of 4 ways – and all in an attempt to avoid changing your behavior:
- “Oh that will never happen.”
- “That will never happen here.”
- “If it does happen, and it does happen here, it won’t be as bad as you say it will be.” See how this is going? We’re rationalizing the inputs away so that we can wake up tomorrow expecting everything to be exactly as it is today.
- “If it does happen, and it does happen here, and it is as bad as you say, I don’t want to live through it.” This appears extreme on the surface, but I’ve heard exactly this in conversation before – and this philosophy is actually not uncommon.
Since the Normalcy Bias is really only an attempt at explaining human behavior, here’s another pattern that Amanda Ripley talks about in her book: if you’ve been through an abnormal event that ended up being not-such-a-big-deal, you immediately assume that similar future events will turn out the same way. So if you lived through a lesser (but similar) event, you may actually be worse off than those who didn’t.
- There are many stories of people who were in the World Trade Center both in 1993 when it was bombed (injuring over a thousand but killing only six – not-such-a-big-deal for the tens of thousands who could have been affected) and then again on 9/11/2001. Many of those who had lived through the 1993 bombing flashed immediately back to that event on 9/11… and then quickly transitioned into not-such-a-big-deal-mode because of how the 1993 bombing turned out. You’d probably expect them to have been the first ones in the stairwells and out of the buildings – but for many, that wasn’t the case at all. You see, “normal” for some of them was a horrible and tragic bombing… that yielded results you could live with. Therefore, similar physical evidence (something blowing up in the building), even when coupled with various other inputs (seeing a plane hit the building, feeling the fireball, etc.), got filed in the same place in their brains – and so it wasn’t worth changing their behavior over.
- Amanda Ripley also tells the story of Meaher Patrick Turner and his family in New Orleans. Turner had lived through many hurricanes, including a big one in 1965 and Hurricane Camille in 1969. In the lead-up to Hurricane Katrina, those who had survived other hurricanes (and had no desire to pack up and leave) chalked Katrina up to “oh, it’s not going to be that bad.” One quote from the book sums it up best:
- As it turned out, the victims of Katrina were not disproportionately poor; they were disproportionately old. Three-quarters of the dead were over sixty, according to the Knight Ridder analysis. Half were over seventy-five. They had been middle-aged when Hurricane Camille [which was a category 5 and technically stronger than Katrina] struck. “I think Camille killed more people during Katrina than it did in 1969,” says Max Mayfield, director of the National Hurricane Center. “Experience is not always a good teacher.” (The Unthinkable, p. 28)
That’s the Normalcy Bias.
A physical manifestation of the Normalcy Bias also could be a “freeze” response. Marc MacYoung has a great way of explaining that response (this is from his website – NoNonSenseSelfDefense.com, and to understand the monkey and lizard, read more of Marc’s website):
- First type is you just totally skip a groove. Boy I really am dating myself, using vinyl record references (but a CD skip just doesn’t have the same sound). You are confronted with a problem that your amygdala just doesn’t have an answer for.
- The stimuli comes in, the amygdala goes ‘F**k if I know how to handle this’ and sends the problem back to the higher brain for clarification. But the higher brain is at a loss too. ALL parts of the brain are stuck going ‘homina, homina, homina!!!!!’
- Another type is that you have NO experience letting your lizard brain drive. The monkey often steers the lizard, but few people really have experience letting the lizard brain truly drive. (Actually that’s not true, women who have given birth do). They’ve spent so much of their lives disassociated from this aspect of themselves that they have NO experience giving it control to let it do what it needs to do (e.g. run like hell). This is ESPECIALLY common in people who pride themselves on how ‘smart they are.’ This to the point of arrogance about them being ‘better’ than dumb violent types. I tell them, “You’re smarter than a dog, but you can still get bit … especially if you don’t apply that intelligence to understanding dog psychology.”
He’s right, of course. But to me, it’s all the Normalcy Bias. Kathy Jackson also has her own take on the Freeze Response. Why it happens and how to prevent it are the subject of countless books, movies, essays, blogs… and the biological and chemical factors involved are fascinating. But I’ll save that stuff for smarter people than me. All I know is that it’s all part of the Normalcy Bias.
At about the :32 mark, did you see the woman at the desk with her child? Did you see what she did after the shots started flying? Not much. At :43, the female employee practically dragged her behind the desk to a safer area. Normalcy Bias.
Notice the woman in the black t-shirt at about the :16 mark. We have no idea what the bad guys were saying at that instant, and they could have been ordering her to another place – so presumably she was just complying with their demands. Fair enough. But more likely, it seems that she was headed somewhere when the bad guys came in, and continued her journey even as she walked right past a robber with a gun. Normalcy Bias.
And I’ll leave you with Case 3, which also makes me sick and reminds me why I carry a gun: