Thinking, Fast and Slow

by

Daniel Kahneman

Thinking, Fast and Slow: Conclusions Summary & Analysis

Kahneman concludes by reexamining some of the larger principles in the book, beginning with the experiencing and remembering selves. The remembering self is a construction of System 2, but duration neglect and the peak-end rule originate in System 1. We do not treat all moments the same—some are more memorable, and some are more important.
Kahneman’s conclusions highlight some of his most important points. In the case of the experiencing and remembering selves, the point is that people rarely favor objectivity over subjectivity, especially when evaluating their own experiences.
Themes: Stories and Subjectivity vs. Statistics and Objectivity
The issue of which of the two selves matters more is important to both medicine and welfare. Through a series of rhetorical questions, Kahneman wonders whether investments should be based on the actual suffering that people experience, or on how much, holistically, people want to be relieved of their condition.
Kahneman questions the broader implications that subjectivity has for policy, but perhaps best illustrates the limits of our knowledge by having no answer to his own questions about the best way to set policy.
Themes: Stories and Subjectivity vs. Statistics and Objectivity
Kahneman then returns to the idea of Econs and Humans, as well as basic economic theory. He argues that defining rationality as consistency and coherence excludes reasonable people from counting as rational. Humans are not irrational, but they need help to make accurate judgments and better decisions. In a nation of Econs, government should stay out of the way, but Humans require more guidance.
Returning to standard economic theory and prospect theory, Kahneman also reiterates that Humans do not always make decisions based on the intrinsic values of money and probability. Although informing people of their biases is important, Kahneman also realizes that in some cases, government action and guidance may be even more crucial to counter people’s faulty intuitions.
Themes: Choices, Losses, and Gains
In their book Nudge, Richard Thaler and Cass Sunstein address the dilemma of how to help people make good decisions without curtailing freedom. Their answer is libertarian paternalism, which has great appeal across a broad political spectrum. One example of a “nudge” is making enrollment in a pension plan the default option. Another example lies in a policy that many firms now offer employees: those who sign on allow the employer to increase their contribution to their savings plan by a fixed proportion whenever they receive a raise. This policy improved the savings rate and brightened the future prospects of millions of workers.
In a way, government “nudge” policies do not overcome people’s biases, but instead simply use frames and the inherent laziness of our brains to their advantage. As with the example of organ donation, making pension enrollment or saving money the default greatly increases how much people save—an objectively positive outcome that people barely have to think about.
Themes: Intuition, Deliberation, and Laziness
Additional applications of libertarian paternalism introduced by Sunstein include the gas mileage example from earlier, a new version of dietary guidelines that replaced the Food Pyramid with a Food Plate, and the inclusion of both frames on labels, such as “90% fat-free” alongside “10% fat.”
In these scenarios, the policies don’t play into our inherent laziness, as with the pension plan, but they do illuminate how we are affected by frames and try to help us combat them by including both the positive and negative phrasings.
Themes: Intuition, Deliberation, and Laziness
Kahneman ends by returning to the two systems: the automatic System 1 and the effortful System 2. System 1 is the origin of much of what we do wrong, but also much of what we do right. Our thoughts and actions are generally on the mark, but System 1 sometimes becomes unreliable. The way to block errors that originate in System 1 is simple: “recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2.” This is what occurs when we encounter the Müller-Lyer illusion after we have learned that our intuition is incorrect.
Kahneman acknowledges that our intuitions can often be right, but still highlights the necessity for calling in backup when we are presented with cognitive illusions. Reiterating the comparison with the Müller-Lyer illusion reminds readers that even after we understand concepts, we may not be able to apply them unless we explicitly recognize the heuristics upon which we are relying.
Themes: Intuition, Deliberation, and Laziness
Organizations are better than individuals at avoiding errors, because they can institute and enforce the application of checklists, reference-class forecasting, and the premortem.
Instituting policies that force people to rely on objective data allows them to avoid making costly mistakes.
Themes: Human Fallibility and Overconfidence
Ultimately, having a vocabulary for the different heuristics is also important in avoiding their errors. Labels like “anchoring effects,” “narrow framing,” or “excessive coherence” remind us of our potential biases, their causes and effects, and what can be done about them.
Kahneman returns to his aim in writing the book: by giving people the tools to name the heuristics (and often fallacies) they use, he enables them to recognize when these mental shortcuts are helpful and when they invite lazy mistakes.
Themes: Intuition, Deliberation, and Laziness