There are many known psychological processes that cause individuals and organizations to miss the signs of a coming crisis – even when the signs are noticeable.
One reason is the "optimism bias": people believe their own prospects are better than average and are overly optimistic about their own future.
One possible explanation for the optimism bias lies in how we learn new information: people update their beliefs more readily when the news is better than expected than when it is worse than expected.
Outcome bias is another factor: because things turned out reasonably well in the past, we underestimate how close they came to going wrong.
In the past 20 years, there have been two outbreaks of diseases caused by new coronaviruses. The SARS outbreak of 2003 killed 774 people before it was contained, and the MERS outbreak that began in 2012 has killed 858. The new virus has far surpassed both.
Even if people are given clear evidence that a crisis is unfolding, they may deny the reality of it.
If people want to believe something, they may only look for evidence to support that point of view, and ignore or dismiss anything that contradicts it.
In uncertain conditions, we look to each other for guidance, even when those around us are not the best guides. People tend to do whatever they perceive to be the social norm, which may explain panic buying.
In governments and other large organizations, the tendency to conform can unconsciously lead intelligent and experienced decision-makers to stop discussing options and uncritically accept whatever plan they think everyone else is settling on.
Organizations often hire smart and talented people, but then create cultures and decision-making processes that do not encourage them to raise concerns or make suggestions.
When everyone is encouraged to favour the positive interpretation, the result is "self-reinforcing stupidity."
Five characteristics of the best-prepared “high-reliability” organizations: