Glancing down the chapter list of Daniel Kahneman’s Thinking, Fast and Slow, I saw a list of heuristics and biases that behavioural scientists have discovered over the last few decades. I am often critical of behavioural science because it tends to be presented as a list of biases with no linking framework, so the chapter list played to my fears. I was not confident that the book would offer me many new ideas.
Thankfully, the chapter list was deceptive. Thinking, Fast and Slow is a magnificent book. Kahneman writes so clearly, and has such a gift for simple examples, that my grasp of many of the heuristics and biases has improved. I am sure I will be borrowing many of Kahneman’s examples when explaining these biases to others in the future. Thinking, Fast and Slow is my first recommendation to anyone wanting to come to grips with behavioural economics, regardless of their background or level of technical skill.
More important (for me) was Kahneman’s ability to link many of these heuristics and biases. Kahneman’s use of the dual process model of the brain allows him to thread a coherent story through each bias and to give them something resembling a framework. The basic concept is that the brain has two systems: System 1, which is fast and delivers quick, efficient but occasionally wrong intuitive judgements; and System 2, which is more analytical but lazy and prone to its own biases.
When placed into an evolutionary context, the dual model framing makes intuitive sense. System 1 takes over in emergencies. That quick, intuitive reaction allowed our ancestors to survive. System 2 is called upon only when there is time to decide and the energy spent deploying it is worthwhile. This model allows Kahneman’s story to be less about “irrationality” and more about the costs and benefits of different forms of decision-making, particularly when the human brain, shaped by evolution over many generations in an environment vastly different from today’s, is placed in the modern world.
Kahneman’s discussion of prospect theory adds to the story’s coherence. Under prospect theory, people evaluate losses and gains relative to their current reference point, not as changes to their total wealth. People weight losses more heavily than equivalent gains and, thanks to diminishing sensitivity to larger outcomes, tend to be risk seeking in the domain of losses and risk averse in the domain of gains. Together with the overweighting of small probabilities, this produces the fourfold pattern of risk preferences: risk-averse behaviour towards a high probability of gains (fear of disappointment) and towards a small probability of losses (fear of a large loss, hence insurance); and risk-seeking behaviour towards a high probability of losses (hope to avoid the loss) and towards a small probability of gains (hope of a large gain, hence lotteries). Many of the biases discussed sit within this framework.
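The fourfold pattern can be seen in a short sketch of the prospect theory value and probability-weighting functions. The parameter values below are the median estimates Tversky and Kahneman published in 1992, not figures from this book, and the code is only an illustration of the shape of the theory:

```python
# A minimal sketch of prospect theory's value and decision-weight
# functions, using Tversky and Kahneman's (1992) median parameter
# estimates: alpha = 0.88, lambda = 2.25, gamma = 0.61 (gains),
# delta = 0.69 (losses). Illustrative only.

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of an outcome relative to the reference point (0):
    concave for gains, convex and steeper for losses (loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma):
    """Decision weight: overweights small probabilities and
    underweights large ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect(p, x):
    """Subjective value of a p chance of outcome x (and nothing otherwise)."""
    gamma = 0.61 if x >= 0 else 0.69
    return weight(p, gamma) * value(x)

# The fourfold pattern: compare each gamble against its expected value
# taken for sure.
assert prospect(0.01, 1000) > value(10)    # low-prob gain: gamble (lotteries)
assert prospect(0.01, -1000) < value(-10)  # low-prob loss: sure loss (insurance)
assert prospect(0.95, 1000) < value(950)   # high-prob gain: take the sure gain
assert prospect(0.95, -1000) > value(-950) # high-prob loss: gamble to avoid it
```

Each of the four assertions corresponds to one cell of the fourfold pattern: the curvature of the value function and the distortion of small probabilities jointly flip the preference between the gamble and its expected value.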
There is not much to dislike about the book, but Kahneman’s endorsement of libertarian paternalism to nudge people away from their behavioural biases is one area where I would tread more carefully. As other chapters in the book show, groups of experts are also prone to biases. As Kahneman described his experience on a curriculum committee, which vastly underestimated both the time the project would take and the probability of failure, I kept picturing government committees committing the same planning fallacy.
Similarly, Kahneman describes a debate between himself and Gary Klein over the usefulness of expert prediction. Part of their disagreement stemmed from the types of experts they were considering. Kahneman saw little usefulness for experts in fields where the outcome is inherently unpredictable. Klein saw more utility in fields where consistent feedback and practice allow a real degree of expertise to be achieved. Government largely sits in the former category. However, that small complaint should not detract from one of the best books I have read.