Thinking, Fast and Slow
by Daniel Kahneman

Human Fallibility and Overconfidence

After introducing the two modes of thinking he calls “System 1” and “System 2,” Kahneman illuminates some of the systematic biases that shape how people process information. In addition to humans’ natural tendency toward mental laziness, people also tend to be overconfident in their ability to answer questions and make calculations correctly. This overconfidence leads not only to conclusions skewed by a person’s subjective experience but often to outright error.

To make their lives easier, people streamline their thinking when answering questions, often ignoring relevant outside data. This makes them highly confident in their answers, but only because they have simplified the problem. Kahneman describes how people fall prey to “confirmation bias”: believing themselves to be generally right, they look for evidence that supports a belief they already hold rather than evidence that might disprove it. For example, people asked “Is Sam friendly?” search for instances of Sam being friendly rather than instances of Sam being unfriendly, so the wording of the question itself biases them toward agreeing with it.

The related “halo effect” works in a similar way: a positive predisposition toward a person, place, or thing makes people want to like everything about it, while a negative predisposition does the opposite.

Kahneman uses the principle “what you see is all there is” (WYSIATI) to explain how people base their judgments only on the information presented to them, which also makes them overconfident in their predictive abilities. For example, when Kahneman worked in the Israeli Army, he evaluated soldiers for officer training by watching a team-building exercise for a few hours and then predicting who would make the best candidates. Ultimately, his predictions were only slightly better than blind guesses: he used only what he was able to see, assumed that a few hours would be representative of a soldier’s performance more generally, and still had great confidence in his predictions. People likewise place too much confidence in what they know from their own experiences and ignore potentially relevant outside data, because they inherently feel more secure in their own knowledge than in anyone else’s.

People also overestimate the frequency of events they can readily call to mind. Someone who has recently been in or witnessed a car crash, for example, will estimate the rate of car crashes as higher than it is. Kahneman gives another example in which each of two partners estimates that they do 70 percent of the housework, a combined 140 percent, because each remembers the occasions when they did the chores far more vividly than the occasions when their partner did. The same happens with news stories: people fear terrorism because it dominates the news cycle, yet fatal car crashes are far more common. As Kahneman writes, “Emotion and vividness influence fluency, availability and judgments of probability—and thus account for our excessive response to the few rare events that we do not ignore.” The more readily people can think of examples, the more they overestimate frequency and probability.

In addition to being overconfident about their own knowledge and experiences, people are overconfident about their personal attributes and abilities. Kahneman notes that chief financial officers (CFOs) have been shown to be grossly overconfident in their ability to predict the stock market, and that physicians who described themselves as “completely certain” of a diagnosis were nonetheless wrong 40 percent of the time. The problem compounds itself because overconfidence is rewarded: confident experts more easily gain the trust of clients.

People also become more confident in hindsight. They think they understand the past, which suggests to them that the future, too, should be knowable; in reality, they understand the past far less well than they believe. In a survey conducted in 1972, just before President Nixon traveled to China to meet with Mao Zedong, respondents assigned probabilities to fifteen possible outcomes of the meeting. After Nixon’s return, the respondents were asked to recall the probabilities they had originally assigned, and for outcomes that had actually occurred, they exaggerated the probability they had given beforehand. This hindsight bias leads people to evaluate decisions in a faulty way: by their outcomes, rather than by whether the decision was sound at the time it was made. Kahneman explains, “We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact.”

People also assume that statistical rules do not apply to them: they imagine the best-case scenario for themselves and rarely consider the worst. Kahneman experienced this himself when he assembled a team to draft a textbook for the Israeli Ministry of Education. The team worked well together, and when he asked its members to estimate how long the project would take, their answers averaged about two years. But when he asked his colleague Seymour to estimate instead from his knowledge of comparable teams, Seymour realized that roughly 40 percent of such teams never finished at all, and that those that did took seven to ten years. Kahneman recognized in hindsight that they should have abandoned the project, but the team assumed it would be the exception, even though it was not: the book ultimately took eight years to complete.

It makes sense that people rely on their own experiences to answer questions; after all, their personal experiences are the only ones they have direct access to. But this habit becomes a problem when people lean too heavily on experiential knowledge without understanding how their experiences fit into larger patterns, failing to notice how much they have simplified their thinking and exaggerated their judgments along the way. If Kahneman’s primary goal is to help readers recognize how they make mistakes, then revealing the situations in which people are typically overconfident is the first step in raising that awareness.


Human Fallibility and Overconfidence Quotes in Thinking, Fast and Slow

Below you will find the important quotes in Thinking, Fast and Slow related to the theme of Human Fallibility and Overconfidence.
Part 1, Chapter 1 Quotes

The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.

Related Characters: Daniel Kahneman (speaker)
Page Number: 24
Part 1, Chapter 3 Quotes

The bat-and-ball problem is our first encounter with an observation that will be a recurrent theme of this book: many people are overconfident, prone to place too much faith in their intuitions.

Related Characters: Daniel Kahneman (speaker)
Page Number: 45
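For readers who do not remember the puzzle, the bat-and-ball problem as Kahneman poses it in Chapter 3 states that a bat and a ball cost $1.10 together, and the bat costs one dollar more than the ball; the question is the price of the ball. The intuitive answer of 10 cents fails a one-line algebra check (letting b be the ball’s price in dollars):

\[
(b + 1.00) + b = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = 0.05
\]

The ball costs 5 cents; a 10-cent ball would make the bat $1.10 and the total $1.20. The error illustrates the book’s recurring pattern: System 1 volunteers a plausible number, and a lazy System 2 endorses it without checking.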
Part 1, Chapter 4 Quotes

The results are not made up, nor are they statistical flukes. You have no choice but to accept that the major conclusions of these studies are true. More important, you must accept that they are true about you.

Related Characters: Daniel Kahneman (speaker)
Page Number: 57
Part 1, Chapter 7 Quotes

Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold.

Related Characters: Daniel Kahneman (speaker)
Page Number: 81

We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is.

Related Characters: Daniel Kahneman (speaker)
Page Number: 87
Part 2, Chapter 12 Quotes

The explanation is a simple availability bias: both spouses remember their own individual efforts and contributions much more clearly than those of the other, and the difference in availability leads to a difference in judged frequency.

Related Characters: Daniel Kahneman (speaker)
Page Number: 131
Part 2, Chapter 13 Quotes

The lesson is clear: estimates of causes of death are warped by media coverage. The coverage is itself biased toward novelty and poignancy.

Related Characters: Daniel Kahneman (speaker), Paul Slovic
Page Number: 138
Part 3, Chapter 19 Quotes

A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.

Related Characters: Daniel Kahneman (speaker)
Page Number: 202
Part 3, Chapter 20 Quotes

The illusion of skill is not only an individual aberration; it is deeply ingrained in the culture of the industry. Facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed.

Related Characters: Daniel Kahneman (speaker)
Related Symbols: Müller-Lyer Illusion
Page Number: 216
Part 3, Chapter 21 Quotes

Applying Apgar’s score, the staff in delivery rooms finally had consistent standards for determining which babies were in trouble, and the formula is credited for an important contribution to reducing infant mortality.

Related Characters: Daniel Kahneman (speaker)
Page Number: 227
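The Apgar test is exactly the kind of simple formula Kahneman champions in this chapter: a handful of cues (he lists heart rate, respiration, reflex, muscle tone, and color), each rated 0, 1, or 2, combined by plain addition into a score out of 10. A minimal sketch of that structure follows; the function and variable names are illustrative, not taken from the book.

# A minimal sketch of the Apgar score's structure as Kahneman
# describes it: five cues, each rated 0, 1, or 2, summed to 0-10.
# Names below are illustrative, not from the book.

APGAR_CUES = ("heart_rate", "respiration", "reflex", "muscle_tone", "color")

def apgar_score(ratings: dict[str, int]) -> int:
    """Combine the five 0-2 ratings by simple addition."""
    for cue in APGAR_CUES:
        if not 0 <= ratings[cue] <= 2:
            raise ValueError(f"{cue} must be rated 0, 1, or 2")
    return sum(ratings[cue] for cue in APGAR_CUES)

# Example: a vigorous newborn. A high total signals a baby in good
# shape; a low one signals trouble needing immediate attention.
print(apgar_score({"heart_rate": 2, "respiration": 2, "reflex": 1,
                   "muscle_tone": 2, "color": 2}))  # prints 9

The chapter’s point is that such a transparent, consistently applied rule outperformed the inconsistent intuitive judgments it replaced.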
Part 3, Chapter 23 Quotes

In this view, people often (but not always) take on risky projects because they are overly optimistic about the odds they face. I will return to this idea several times in this book—it probably contributes to an explanation of why people litigate, why they start wars, and why they open small businesses.

Related Characters: Daniel Kahneman (speaker)
Page Number: 253
Part 3, Chapter 24 Quotes

Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients.

Related Characters: Daniel Kahneman (speaker)
Page Number: 263
Part 4, Chapter 30 Quotes

You read that “a vaccine that protects children from a fatal disease carries a 0.001% risk of permanent disability.” The risk appears small. Now consider another description of the same risk: “One of 100,000 vaccinated children will be permanently disabled.” The second statement does something to your mind that the first does not.

Related Characters: Daniel Kahneman (speaker)
Page Number: 329
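The two phrasings are arithmetically identical, as a quick conversion of the percentage shows:

\[
0.001\% = \frac{0.001}{100} = 0.00001 = \frac{1}{100{,}000}
\]

What differs is the image each evokes: the second phrasing conjures a particular disabled child where the abstract percentage conjures nothing, an instance of what Kahneman calls denominator neglect.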