Thinking, Fast and Slow

An executive summary of Thinking, Fast and Slow by Daniel Kahneman
Part 1: Two Systems

1. The Characters of the Story

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. 
  • System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
  • System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions.
  • System 1 is easily biased and runs in automatic mode.
  • System 2 can detect the bias but needs effort and can make mistakes.
  • Even if System 2 knows you are looking at an illusion, System 1 cannot be prevented from seeing the illusion.

2. Attention and Effort

  • Pupil dilation is a sensitive indicator of mental effort.
  • When System 2 is engaged in difficult mental effort, the mind can become blind to otherwise salient stimuli (the Invisible Gorilla experiment).
  • When we face danger, it is System 1 that reacts automatically, even before System 2 has become fully conscious of the threat.
  • Under time pressure, switching from one mental task to another is effortful.

3. The Lazy Controller

  • Self-control is required when engaged in effortful thinking.
  • Flow: “a state of effortless concentration so deep that people lose their sense of time, of themselves, of their problems”.
  • When System 2 is busy, System 1 has more influence on behavior (e.g., choosing sweet food over healthy food).
  • Ego depletion: after exerting a high level of self-control, people have a hard time sustaining further self-control.
  • Eating some glucose can replenish mental energy.
  • When System 2 does not make the cognitive effort, people readily believe their intuition, which may be wrong.
  • Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed. System 2 needs time to search memory and check the information.
  • People who have a weaker System 2 tend to be impulsive, impatient, and keen to receive immediate gratification.

4. The Associative Machine

  • Associative activation: ideas that have been evoked trigger many other ideas, in a spreading cascade of activity in your brain.
  • Example: “bananas vomit”: System 1 automatically and effortlessly connects the two words with related ideas and emotions.
  • We generate many ideas that never register in consciousness.
  • The priming effect influences our thinking and actions unconsciously (walking more slowly after reading words related to old age, feeling happier when your face is made to smile, acting more selfishly after seeing pictures related to money).
  • Priming phenomena arise in System 1.
  • Our decisions are influenced by System 1 more than we realize, without conscious awareness of its activity.


5. Cognitive Ease

  • Cognitive ease: System 1 does not recognize any threat. The mind feels the situation is familiar.
  • Cognitive strain: System 2 is mobilized when facing a problem. The mind is vigilant and suspicious.
  • You experience cognitive ease when you see a word you have already seen; because it looks familiar, you are more inclined to believe it is true.
  • Frequent repetition can make people believe falsehoods.
  • Making a message easy to understand and read reduces cognitive strain.
  • Cognitive strain mobilizes System 2, which then rejects the intuitive answers suggested by System 1.
  • Mere exposure effect: people develop good feelings toward a word or picture they see repeatedly, even when the exposure is unconscious.
  • Mood affects intuitive performance.
  • In a good mood, System 1 performs better at connecting ideas (creativity, intuition).

6. Norms, Surprises, and Causes

  • System 1 connects the ideas and defines what is normal.
  • Repetition of an event reduces the surprise.
  • System 1 automatically sees causality and intentions.
  • System 2 interprets the causality and accepts or denies it.
  • Impression of causality is generated by System 1.

7. A Machine for Jumping to Conclusions

  • System 1 does not consider alternative interpretations and does not keep track of the alternatives it rejected.
  • System 2 needs mental effort to doubt and to unbelieve.
  • When System 2 is tired, it will easily believe System 1.
  • Halo effect: the tendency to like (or dislike) everything about a person, including things you have not observed.
  • The measure of success for System 1 is the coherence of the story it manages to create, even with limited information.

8. How Judgments Happen

  • System 1 uses basic assessments to rapidly discriminate friend from foe.
  • A politician whose face inspires competence is more likely to be elected.
  • System 1 can match intensity across diverse dimensions.
  • System 1 computes more than intended and generates conflicts.

9. Answering an Easier Question

  • To answer a hard question, the lazy System 2 substitutes a simpler heuristic question for it.
  • Heuristic: a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.
  • A simpler question is easier to answer for System 1.
  • System 2 searches for information consistent with existing beliefs, endorsing the emotional responses of System 1.

Part 2: Heuristics and Biases

10. The Law of Small Numbers

  • Small samples yield extreme results more often than large samples do (see the simulation below).
  • Even statisticians make the mistake of using samples that are too small, relying on intuition instead of computation.
  • Even when a poll is unreliable because of its small sample size, System 1 finds consistency and coherence and jumps to a conclusion.
  • Many facts of the world are due to chance, but our minds tend to find causal explanations for them.
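The point is easy to verify by simulation. Here is a minimal sketch in Python; the 80% threshold and the hospital framing are illustrative assumptions, not from the book:

```python
import random

# Draw samples from a fair 50/50 population (e.g., boy/girl births at a
# hospital) and count how often a sample is at least 80% one outcome.
def extreme_rate(sample_size, trials=100_000, threshold=0.8):
    extreme = 0
    for _ in range(trials):
        boys = sum(random.random() < 0.5 for _ in range(sample_size))
        share = boys / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

print(extreme_rate(4))    # ~0.125: extreme days are common at a tiny hospital
print(extreme_rate(100))  # ~0.0: and essentially never happen at a large one
```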

11. Anchors 

  • Anchoring effect: when people consider a particular value for an unknown quantity before estimating that quantity.
  • Anchoring as adjustment: System 2 deliberately adjusts away from the anchor, but the adjustment typically stops too early.
  • Anchoring as priming: System 1 unconsciously uses the suggested number as a reference.

12. The Science of Availability

  • Availability heuristic: the process of judging frequency by “the ease with which instances come to mind.”
  • People in a state of higher vigilance, with System 2 engaged, are less susceptible to the availability bias.

13. Availability, Emotion, and Risk

  • As an example of the availability heuristic, estimates of the causes of death are warped by media coverage.
  • The affect heuristic simplifies our lives by conjuring a world without tradeoffs: what we like seems all benefit and low risk, what we dislike all cost and high risk.
  • Availability cascade: a self-sustaining chain in which media coverage and public emotion inflate a minor event, or even a nonevent, out of proportion.

14. Tom W’s Specialty

  • Prediction by representativeness is not statistically optimal.
  • Intuitive impressions of representativeness are often better than chance, but relying on them alone neglects other relevant information.
  • Base-rate information is more relevant to judging probability than representativeness is.
  • Yet when people are presented with specific information about a case, they tend to ignore the base rate because of the laziness of System 2.

15. Linda: Less is More

  • Fallacy: when people fail to apply a logical rule that is obviously relevant.
  • Conjunction fallacy: when people judge the conjunction of two events to be more probable than one of the events in a direct comparison (see the numeric check below).
  • People confuse probability and plausibility when they are presented with information that influences their judgment of representativeness.
  • Less is more: System 1 averages values instead of adding them.
  • When a few intact and broken dishes are added to a set of good dishes “A”, people value the enlarged set less than “A” alone.
  • When asked for an opinion, System 2 is not alert and tends to endorse the intuitive judgment without checking its logic.
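The logic the fallacy violates fits in a few lines. A minimal sketch in Python, with probabilities that are made up purely for illustration:

```python
# The conjunction rule: P(A and B) <= P(A), whatever the numbers.
p_teller = 0.05                    # hypothetical P(Linda is a bank teller)
p_feminist_given_teller = 0.30     # hypothetical conditional probability
p_both = p_teller * p_feminist_given_teller

assert p_both <= p_teller          # always holds, for any choice of numbers
print(p_teller, round(p_both, 3))  # 0.05 and 0.015
```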

16. Causes Trump Statistics

  • Statistical base rates are facts about the population to which a case belongs; they are generally underweighted or neglected when specific information about the case is available.
  • Causal base rates change your view of how the individual case came to be. This kind of rate is easily combined with other case-specific information (see the cab-problem calculation below).
  • Stereotyping: System 1 represents categories by norms and prototypical exemplars.
  • People do not adjust their thinking in response to general statistical facts such as base rates.
  • To make people learn, surprise them with individual cases: System 1 readily infers the general from the particular, but not the particular from the general.
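The chapter's cab problem shows how a base rate should be combined with case evidence. A minimal sketch in Python, using the numbers as the problem is usually stated (85% of cabs are Green, 15% Blue; a witness identifies the cab as Blue and is correct 80% of the time):

```python
# Bayes' rule combines the base rate with the witness's testimony.
p_blue = 0.15                 # base rate of Blue cabs
p_say_blue_given_blue = 0.80  # witness accuracy
p_say_blue_given_green = 0.20 # witness error rate

p_say_blue = (p_blue * p_say_blue_given_blue
              + (1 - p_blue) * p_say_blue_given_green)
p_blue_given_say_blue = p_blue * p_say_blue_given_blue / p_say_blue
print(round(p_blue_given_say_blue, 2))  # 0.41, far below the witness's 80%
```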

17. Regression to the Mean

  • Attaching a causal interpretation to fluctuations of a random process is a mistake.
  • If one sample of a random variable is extreme, the next sample of the same variable is likely to be closer to its mean.
  • Luck plays a role in extreme results.
  • Whenever the correlation between two scores is imperfect, there will be regression to the mean (see the simulation below).
  • Correlation is not the same as causation.
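A minimal simulation makes this concrete. The talent-plus-luck setup below is my own illustration, not the book's: score the same people twice, select the day-one stars, and watch their day-two average fall back toward the mean.

```python
import random

# Each score is talent plus independent luck; top performers on day 1
# were partly lucky, and that luck is unlikely to repeat on day 2.
random.seed(0)
n = 10_000
talent = [random.gauss(0, 1) for _ in range(n)]
day1 = [t + random.gauss(0, 1) for t in talent]
day2 = [t + random.gauss(0, 1) for t in talent]

# Average scores of the day-1 top 10% on both days.
top = sorted(range(n), key=lambda i: day1[i], reverse=True)[: n // 10]
print(sum(day1[i] for i in top) / len(top))  # ~2.5 (extreme)
print(sum(day2[i] for i in top) / len(top))  # ~1.2 (back toward the mean of 0)
```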

18. Taming Intuitive Predictions

  • To get a more accurate prediction, start from the baseline (the average outcome) and move toward your intuitive prediction in proportion to the correlation between your evidence and the outcome (see the one-line calculation below).
  • Intuitive predictions from System 1 tend to be overconfident and overly extreme.
  • Even System 2 has difficulty grasping the concept of regression.
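The corrective recipe reduces to one line of arithmetic. A sketch with illustrative numbers of my own (predicting a student's GPA, with an assumed evidence-outcome correlation of 0.3):

```python
baseline = 3.0       # the average GPA (the baseline prediction)
intuitive = 3.8      # prediction obtained by matching the impressive evidence
correlation = 0.3    # assumed correlation between the evidence and GPA

# Move from the baseline toward the intuition, in proportion to correlation.
corrected = baseline + correlation * (intuitive - baseline)
print(round(corrected, 2))  # 3.24: regressed back toward the mean
```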
     

Part 3: Overconfidence

19. The Illusion of Understanding

  • Good stories give us the illusion of inevitability (Google success story).
  • Luck and the halo effect are underestimated in the interpretation of success.
  • Hindsight bias: Once you adopt a new view of the world, you immediately lose much of your ability to recall what you used to believe before your mind changed.
  • Outcome bias: Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.
  • Managers who take a risk and succeed are credited with flair; those who fail are dismissed as gamblers.
  • The combination of the halo effect, outcome bias and hindsight bias makes success stories compelling.

20. The Illusion of Validity

  • Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it.
  • The story is coherent in the person’s mind but it is not necessarily true.
  • In stock-picking, year-to-year performance is essentially uncorrelated, which suggests that results owe more to luck than to skill.
  • These traders are ignorant of their ignorance.
  • Experts are overconfident in their forecasts.

21. Intuition vs. Formulas

  • Statistical predictions made by combining a few scores or ratings according to a rule are more reliable than professional intuitions.
  • A simple formula based on weather data predicts the value of fine Bordeaux wines better than experts do.
  • Experts add too much complexity and are inconsistent when judging complex information.
  • They may contradict themselves when they evaluate the same facts twice; their judgments can fall victim to priming.
  • During an interview, standardized factual questions, with a separate score for each trait that matters, help to fight the halo effect. And resist the temptation to invent a “broken leg”, a supposedly decisive special case, to change the ranking.

22. Expert Intuition: When Can We Trust It?

  • Intuition is recognition: a cue gives access to information stored in memory, and memory provides the answer.
  • The information comes from learning through experience.
  • Confidence is not a criterion for the validity of an intuition; it merely reflects cognitive ease and coherence.
  • Developing a skill requires a sufficiently predictable environment and the opportunity to learn it through prolonged practice.
  • Learning skills depends on the quality and speed of the feedback.
  • Intuition is reliable when the expert has practiced for a long time in a predictable environment.

23. The Outside View

  • The inside view focuses on specific circumstances and personal experience to evaluate a project but ignores the unknown unknowns.
  • Baseline prediction should be the anchor for further adjustments.
  • People who have information about an individual case tend to ignore the statistics of similar cases; they ignore the outside view.
  • To have a more accurate prediction, determine the baseline prediction and then use specific information to adjust the baseline prediction.

24. The Engine of Capitalism

  • Optimistic people tend to be healthier, happier, and more resilient. They are the inventors, the entrepreneurs, the political and military leaders.
  • Optimism bias: because they are confident, they take more risks than they realize.
  • Optimistic entrepreneurs overestimate their chances of success and ignore the competition in their plans.
  • Social and economic pressures favor confidence; expressed uncertainty is often unacceptable.
  • Premortem: before launching a project, the team imagines that it has failed and explains the causes of that failure. It is an occasion to voice doubts, and it reduces the optimism bias.

Part 4: Choices

25. Bernoulli’s Errors

  • For economists, “the agent of economic theory is rational, selfish, and his tastes do not change”, while for psychologists, people are not fully rational.
  • People’s choices are based not on dollar values but on the psychological values of outcomes, their utilities.
  • Bernoulli’s insight was that a decision-maker with diminishing marginal utility for wealth will be risk-averse (checked numerically below).
  • Bernoulli did not take into account the reference point from which the agent evaluates the options: the same outcome can be good for one person and bad for another.
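Bernoulli's point can be verified directly. A minimal sketch using log utility, u(w) = log(w); the gamble and the amounts are my own illustration:

```python
import math

def u(wealth):
    return math.log(wealth)  # diminishing marginal utility of wealth

# Gamble: 50% chance of 1,000,000 and 50% chance of 100,000.
expected_utility = 0.5 * u(1_000_000) + 0.5 * u(100_000)
sure_thing = u(550_000)  # the gamble's expected value, taken for sure

print(expected_utility < sure_thing)       # True: the sure thing is preferred
print(round(math.exp(expected_utility)))   # ~316228, the certainty equivalent
```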

26. Prospect Theory

  • People become risk-seeking when all their options are bad.
  • Loss aversion: losses loom larger than gains; people dislike losing more than they like winning (see the value-function sketch below).
  • The loss-aversion coefficient tends to increase when the stakes rise.
  • Both prospect theory and utility theory fail to account for regret and disappointment.
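A minimal sketch of a prospect-theory value function. The functional form and the parameter estimates (alpha = 0.88, lambda = 2.25) come from the wider prospect-theory literature, not from this book's text:

```python
ALPHA = 0.88    # diminishing sensitivity to larger amounts
LAMBDA = 2.25   # loss-aversion coefficient: losses hurt roughly twice as much

def value(x):
    # x is a gain (positive) or loss (negative) relative to a reference point.
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

print(round(value(100), 1))   # ~57.5: subjective value of a 100 gain
print(round(value(-100), 1))  # ~-129.5: the matching loss hurts far more
```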

27. The Endowment Effect

  • Endowment effect: Owning a good appears to increase its value.
  • The response to a loss is stronger than the response to a corresponding gain.
  • The effect applies to goods held for use (a bottle of wine), not to goods held for exchange (money).

28. Bad Events

  • The brain processes a picture of an angry face faster than one of a happy face, even when the picture is not consciously seen.
  • System 1 reacts in the same way to mere ideas of threats, such as threatening words.
  • Humans react faster to threats, for survival purposes.
  • The aversion to the failure of not reaching a goal is much stronger than the desire to exceed it.
  • Loss aversion favors the status quo.
  • When a company exploits market power to raise a price, buyers perceive the hike as unfair, a loss imposed on them.

29. The Fourfold Pattern

  • When evaluating an option, System 1 assigns weights to its characteristics, partly consciously and partly unconsciously.
  • The decision weights that people assign to outcomes are not identical to the probabilities of those outcomes.
  • Possibility effect (0% to 5%) and certainty effect (95% to 100%): a change in probability carries far more weight near the extremes than in the middle of the range (see the weighting-function sketch below).
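One common way to model this curvature is a probability-weighting function. The functional form and the gamma = 0.61 estimate below come from the wider prospect-theory literature, not from this book's text:

```python
# Small probabilities are overweighted; near-certainties are underweighted.
def weight(p, gamma=0.61):
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.05, 0.50, 0.95, 0.99):
    print(p, round(weight(p), 3))
# 0.01 -> 0.055: a 1% chance gets over five times its objective weight
# 0.99 -> 0.912: a 99% chance still feels noticeably short of certain
```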

30. Rare Events

  • Emotion and vividness influence fluency, availability, and judgments of probability.
  • People overestimate the probabilities of unlikely events. 
  • People overweight unlikely events in their decisions.

31. Risk Policies

  • Broad framing blunts the emotional reaction to losses and increases the willingness to take risks.
  • A risk policy is a broad frame that embeds a particularly risky choice in a set of similar choices.

32. Keeping Scores

  • Sunk-cost fallacy: because we have already invested in a project, we tend to double down instead of quitting, even when the project is failing.
  • Regret is stronger when one has acted out of character, and stronger for actions taken than for inaction.

33. Reversals

  • The emotional reactions of System 1 are much more likely to determine a single evaluation; the comparison that occurs in joint evaluation always involves a more careful and effortful assessment, which calls for System 2.

34. Frames and Reality

  • Framing: System 1 reacts more negatively to losses than to costs. Even when the consequences are identical, System 1 avoids the option framed as a loss and accepts the one framed as a cost.
  • When outcomes are framed as gains, people are risk-averse and choose the sure option.
  • When outcomes are framed as losses, people become risk-seeking and gamble for a chance to avoid the loss.

Part 5: Two Selves

35. Two Selves

  • System 1 remembers the most intense moment of an episode of pain or pleasure (the peak) and the feeling at its end: the peak-end rule (sketched below).
  • Duration neglect: System 1 ignores the duration of pain or pleasure when evaluating an episode.
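A minimal sketch of the peak-end rule; the pain ratings are my own illustration. Remembered pain tracks the average of the worst moment and the final moment, so a longer episode that tapers off gently is remembered as less painful:

```python
def remembered_pain(ratings):
    # Peak-end rule: average of the worst moment and the last moment.
    return (max(ratings) + ratings[-1]) / 2

short_procedure = [2, 5, 8]          # ends at its painful peak
long_procedure = [2, 5, 8, 6, 4, 2]  # longer, but tapers off gently

print(remembered_pain(short_procedure))  # 8.0
print(remembered_pain(long_procedure))   # 5.0: remembered as less painful
```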

36. Life as a Story

  • Duration neglect and the peak-end rule apply also to the evaluation of entire lives.
  • People choose by memory when they decide whether or not to repeat an experience.
  • People are their remembering selves; experiences that are not remembered do not matter to them.

37. Experienced Well-Being

  • Statistically, about 50% of people experience few unpleasant episodes, while a significant minority endures most of the suffering.
  • Money increases life satisfaction but does not, on average, improve experienced well-being.
  • Misfortunes of life such as illness hit the poorest harder than those who are comfortable.
  • Beyond an income of about 75,000 dollars a year, additional income brings no further improvement in experienced well-being.

38. Thinking About Life

  • The capacity to experience well-being is largely determined by genetics.
  • People who set goals and achieve them report higher satisfaction.
  • Focusing illusion: any aspect of life to which attention is directed looms large in a global evaluation (e.g., the effect of pleasant weather on well-being). We focus on a selected moment and neglect what happens at other times.

