Apart from people, the observer effect is famously highlighted in a thought experiment by the physicist Erwin Schrödinger.
He posited that if a cat is sealed in a box with a radioactive atom that may or may not kill it within an hour, the cat remains in a state of limbo, neither alive nor dead, until someone observes it by opening the box. The final outcome is not settled until someone observes it.
When a disaster or calamity strikes, we work towards ensuring that the same calamity can be dealt with in a better way the next time it happens. The pain or loss we suffer motivates us to do so.
But in devoting our preparation and resources to the ‘last’ disaster, we forget that we are neglecting many other things that are more likely to happen.
Not everything we do with the aim of making ourselves safer has that effect. Sometimes, knowing there are measures in place to protect us from harm leads us to take greater risks, cancelling out the benefits. This is known as risk compensation. Understanding how it affects our behavior can help us make better decisions in an uncertain world.
It means being able to break a big system down into its sections and put it back together. The goal is to identify the strong and weak links: how the sections work, don’t work, or could potentially work, and to apply this knowledge to engineer useful outcomes.
There is no single engineering method, so modular systems thinking varies with context.