Ideas from books, articles & podcasts.
"Thinking, Fast and Slow" is an extensive introduction to the biases and inner mechanisms of the human brain.
While it is primarily a behavioral economics book, it will also teach you about your own limitations, how to persuade people more effectively, and why people make decisions that feel irrational.
All the conclusions are validated through experiments that Kahneman performed over the years.
When it comes to decision-making, there is a distinction between two idealized "species" of people: Econs, who are perfectly rational and consistent, and Humans, who are subject to the biases described throughout the book.
People can be led to judge a strictly worse experience as better than another, based purely on the impression it leaves on the remembering self.
Experiments with a small sample size (N) are much more likely to produce extreme results than ones with a large N; put differently, large samples yield more precise estimates than small ones.
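A minimal simulation (my own sketch, not from the book) illustrates this: in repeated coin-flip "experiments", small samples produce extreme head-rates far more often than large ones.

```python
import random

random.seed(0)

def extreme_share(n, trials=10_000, threshold=0.7):
    """Fraction of coin-flip experiments of size n whose observed
    heads-rate is 'extreme' (>= threshold or <= 1 - threshold)."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(n))
        rate = heads / n
        if rate >= threshold or rate <= 1 - threshold:
            extreme += 1
    return extreme / trials

small = extreme_share(n=10)    # small sample: extremes are common
large = extreme_share(n=1000)  # large sample: extremes almost never occur
print(small, large)
```

The underlying coin is identical in both cases; only the sample size changes the rate of "surprising" results.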
Depending on the situation, people can be either risk averse or risk seeking. It is all about the probability of events happening and whether the end result is a gain or a loss. For instance, when the chances of a gain are high, people become risk averse and lock in the sure thing; when a large loss is highly likely, they become risk seeking and gamble on avoiding it.
People tend to overestimate the probability of rare events and then overweight them in their decisions.
Acting differently from the default option has the downside of bringing regret if the decision turned out to be wrong and defaulting was the best course of action.
Not all domains are equal in nature. Some rely more on skill (e.g. surgery), while others rely more on luck (e.g. stock trading). People can be tricked into thinking that their success is actually based on skill despite the nature of their activity.
Think about the different mental accounts that are not, in fact, separate (e.g. gains on stocks vs. crypto) and imagine they were merged.
An answer to a numerical question can be influenced by previously seen numbers (even if they are completely unrelated to it). Exploiting this to influence decisions is known as anchoring.
Extreme success or failure typically involves a large dose of luck, and people and things regress to their usual performance (the mean, or the prior).
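Regression to the mean can be sketched with a simple model of my own (not from the book): if an observed score is stable skill plus random luck, then the top performers on one day are partly lucky, and on the next day their average falls back toward the mean while remaining above it.

```python
import random

random.seed(1)

# Assumed model: observed score = stable skill + luck (noise).
skill = [random.gauss(0, 1) for _ in range(100_000)]
day1 = [s + random.gauss(0, 1) for s in skill]
day2 = [s + random.gauss(0, 1) for s in skill]

# Select day-1 "top performers" (outliers, partly due to luck)...
top = [i for i, x in enumerate(day1) if x > 2.0]
avg_day1 = sum(day1[i] for i in top) / len(top)
avg_day2 = sum(day2[i] for i in top) / len(top)
# ...their day-2 average regresses toward the mean.
print(avg_day1, avg_day2)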
The neglect of experiencing self and the appeal of remembering self (even though both selves constitute us) leads people to like stories more than anything else.
The remembering self leverages System 1 and, when it comes to experiences, it cares about:
The previous result makes sense because people tend to answer harder questions like "Which would you rather repeat?" with easier questions like "Which did you like more?"
The brain by default prefers using System 1 to answer questions. When the actual question can't be answered this way, it aims to answer a proxy question that can be.
People change their opinion about something based on the ease that they retrieve arguments and examples for it.
An easy way of making predictions by taking into account uncertainty and the natures of the 2 systems:
Given all the previous information, you can guide people to a decision based on the framing of the problem.
People tend to like or dislike everything about a person, thing, event or idea because it is easier for System 1 to work like that. Careful analysis of good and bad sides is much more mentally taxing.
The degree of "utility" as perceived by people decreases gradually. $1,000 is perceived wildly different for somebody with a net worth of $10,000 vs. $1,000,000. Similarly, an increase from $1,000 to $2,000 is perceived as much better than one from $10,000 to $11,000 despite the ...
People become more unwilling to let go of the things or benefits they already own. They will only sell them for more than they would be willing to buy them for, even if some time ago they were indifferent to them. In short, they grow attached.
When faced with a gamble or chance event, before declining it because of other biases (e.g. loss aversion), think not about the single instance where you win or lose, but rather about your whole life as a series of such gambles and whether the total number will lead to consistent gains. Think lik...
Predicting the future from current circumstances, heavily influenced by emotion (e.g. how your marriage will go based on your wedding day) is not ideal.
When answering a question, the brain tends to only focus on the available information, disregarding the one missing.
People will regress to the mean and the happiness they experience will always turn back to an average value. For most of us, this average value is 7 on a scale from 1 to 10.
People (through System 1) are more sensitive to formulations like "12 out of 1,000" and will sometimes think it's more likely than a risk of 2%. Why? The image is more vivid and it is easier for System 1 to think in terms of individual cases as opposed to percentages.
Experimentally speaking, intuition is less likely to come to a better conclusion than simple formulas, that are not influenced by human nature.
Due to the halo effect and the fact that System 1 deals in averages, not sums, people tend to view things that have higher overall quality as better than ones with lower overall quality, even though the latter may be, in fact, strictly better.
❤️ Brainstash Inc.