People who are aware of their own biases are not better able to overcome them. Our intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy.
According to researchers, more intelligent people have a larger bias blind spot: they can spot systematic flaws in others' thinking, but not in their own. They excuse their own minds while harshly judging the minds of other people.
There are two thinking systems, each with distinct characteristics.
System 1, or intuition. We think in this way most of the time. We respond to the world in ways that we're not conscious of and don't control. System 1 operations are fast, effortless, associative, and emotionally charged. They're governed by habit, so they're more difficult to modify or control.
System 2, or reasoning. It is a deliberate reasoning system. It's slower, serial, effortful, and controlled.
Even professional statisticians fall prey to the "law of small numbers": they intuitively expect the distribution in a small sample to closely resemble the distribution in the overall population, even though small samples are far more variable. In other words, even people who should know better make these mistakes. When they're not deliberately computing in System 2 mode (reasoning), they rely on their intuitions for seemingly simple problems.
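A quick simulation makes the point. The 50/50 coin flip and the sample sizes here are arbitrary choices for illustration:

```python
import random

random.seed(42)  # fixed seed so the simulation is reproducible

def sample_proportions(sample_size, n_samples=1000):
    """Proportion of heads observed in each of n_samples coin-flip samples."""
    return [
        sum(random.random() < 0.5 for _ in range(sample_size)) / sample_size
        for _ in range(n_samples)
    ]

small = sample_proportions(10)     # many small samples
large = sample_proportions(1000)   # many large samples

# "Extreme" results (70%+ heads, or 30% or fewer) are common in small
# samples but essentially never occur in large ones.
extreme_small = sum(p >= 0.7 or p <= 0.3 for p in small) / len(small)
extreme_large = sum(p >= 0.7 or p <= 0.3 for p in large) / len(large)
```

Roughly a third of the ten-flip samples come out lopsided, while the thousand-flip samples almost never do. That gap between what small samples actually do and what our intuition expects them to do is the law of small numbers at work.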
You put a lot of time and effort into succeeding in your job, education, and relationships. Since you dedicate so much time to these endeavors, you want full ownership of any success related to them. But when it comes to failures, you turn on your heel and run away from them at the speed of light.
This is the self-serving bias: you claim your successes and ignore your failures. When something good happens, you take the credit; when something bad happens, you blame it on external factors.
Self-serving bias may manifest at work when you receive critical feedback. Instead of keeping an open mind, you may put up a defense when your manager or team member is sharing feedback or constructive criticism.
When certain events need to take place to achieve a desired outcome, we're overly optimistic that those events will happen. Here's why we should temper those expectations.

***

Why are we so optimistic when estimating a project's cost and schedule? Why are we so surprised when something inevitably goes wrong?
A broader category is always at least as probable as any of its subsets. It's more likely that someone has a pet than that they have a cat. It's more likely that someone likes coffee than that they like cappuccinos. The extension rule in probability theory thus states that if B is a subset of A, B cannot be more probable than A.
Likewise, the probability of "A and B" can never exceed the probability of A alone (or of B alone). It is therefore more probable that Linda is a bank teller than that she is a bank teller who is active in the feminist movement.
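The conjunction rule can be checked by brute-force counting over a toy population. The head counts below are invented purely for illustration:

```python
# Toy population: each person is a (is_bank_teller, is_feminist) pair.
population = (
    [(True, True)] * 5       # feminist bank tellers
    + [(True, False)] * 15   # other bank tellers
    + [(False, True)] * 30   # feminists who are not tellers
    + [(False, False)] * 50  # everyone else
)

total = len(population)
p_teller = sum(teller for teller, _ in population) / total
p_teller_and_feminist = sum(
    teller and feminist for teller, feminist in population
) / total
```

The subset ("teller and feminist", 5 people) can never outnumber the broader category ("teller", 20 people), so its probability is necessarily lower or equal, no matter how the invented counts are changed. Judging the conjunction as more likely, as most people do with Linda, violates this rule.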
A plan is like a system. A change in one component of a system will likely impact the functionality of other parts of the system.
The more steps involved in a plan, the higher the chance that something will go wrong and cause delays and setbacks. For this reason, home remodeling and new product ventures seldom finish on time.
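If each step succeeds independently, the chance that every step goes smoothly shrinks multiplicatively as steps accumulate. A minimal sketch, where the 95% per-step reliability is an assumed figure for illustration:

```python
def on_time_probability(p_step: float, n_steps: int) -> float:
    """Probability that all n independent steps finish without a hitch."""
    return p_step ** n_steps

# Even with 95% reliability per step, longer plans rarely sail through:
for n in (1, 5, 10, 20):
    print(f"{n:2d} steps -> {on_time_probability(0.95, n):.2f}")
# With 20 steps, the chance of zero setbacks falls to roughly 36%.
```

A plan that feels safe step by step is, taken as a whole, more likely than not to hit at least one snag, which is exactly why remodels and product launches run late.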