Nudging citizens to be risk savvy

I should start this review of Gerd Gigerenzer’s least satisfactory but still interesting book, Risk Savvy: How to Make Good Decisions, by saying that I am a huge Gigerenzer fan and that this book is still worth reading. But there was something about this book that grated at times, especially against the backdrop of his other fantastic work.

In part, I continue to be perplexed by Gigerenzer’s ongoing war against nudges (as I have posted about before), despite his recommendations falling into the nudge category themselves. Nudges are all about presenting information and choices in different ways – which is the staple of Gigerenzer’s proposal to make citizens “risk savvy”. Gigerenzer’s use of evidence and examples throughout the book also falls well short of his other work, and this is ultimately the element of the book that left me somewhat disappointed.

The need to make citizens risk savvy comes from Gigerenzer’s observation (which matches that of most of Gigerenzer’s faux adversaries – the behavioural scientists) that people misinterpret risks when they are presented in certain ways. If I say that screening reduces the risk of dying from breast cancer by roughly 20 per cent (a relative risk reduction), most people will interpret it to mean that 200 of every 1,000 people screened will be saved, rather than understanding that screening reduces the risk of death from around 6 in 1,000 to 5 in 1,000 – an absolute reduction of about one in 1,000.

Gigerenzer’s contribution to this area is to show that if risks are presented as natural frequencies (i.e. as counts out of, say, 1,000 people), people are much better able to understand them. This includes doctors, who are as confused by statistics as everyone else, and who Gigerenzer suggests need training to communicate risks in ways that their patients can understand.
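To make the translation concrete, here is a minimal sketch (in Python, my own illustration rather than anything from the book) of the kind of restatement Gigerenzer advocates: take the two absolute risks and report them as counts out of a reference group of 1,000. The 6-in-1,000 and 5-in-1,000 figures are the illustrative numbers used above, not precise figures from the book.

```python
def natural_frequency_statement(baseline_risk, treated_risk, group_size=1000):
    """Restate two absolute risks as counts out of a reference group of people."""
    baseline_deaths = baseline_risk * group_size
    treated_deaths = treated_risk * group_size
    relative_reduction = (baseline_risk - treated_risk) / baseline_risk
    absolute_reduction = baseline_deaths - treated_deaths
    return (
        f"Out of {group_size:,} people, about {baseline_deaths:.0f} die without screening "
        f"and about {treated_deaths:.0f} die with screening: roughly {absolute_reduction:.0f} "
        f"in {group_size:,} saved, even though the relative reduction is {relative_reduction:.0%}."
    )

# The 6-in-1,000 and 5-in-1,000 figures are the illustrative numbers from the text above.
print(natural_frequency_statement(baseline_risk=6 / 1000, treated_risk=5 / 1000))
```

With these inputs it reports about one person in 1,000 saved, while the relative reduction comes out near 17 per cent; the gap between those two framings is exactly the confusion Gigerenzer wants to remove.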

This ability to make citizens and experts risk savvy leads Gigerenzer to argue that people do not always need to be at the mercy of their biases. People can be educated to understand risks, and experts can present them in ways that others understand. He advocates risk literacy programs in school, showing that simple decision tools can dramatically increase understanding of probability and statistics, although he spends little time discussing how well this education sticks. In making his point, Gigerenzer takes aim at the behavioural science crowd by arguing that if people were truly subject to cognitive illusions, natural frequencies would not have helped – a strawman argument. As he does at semi-regular intervals through the book, Gigerenzer clouds an interesting argument with an attempt to engage in a battle that doesn’t really exist.

That said, I did enjoy this part of the book and have found myself quoting a lot of the examples. His arguments about how to present risk are compelling. Further, it is enjoyable to read Gigerenzer’s evisceration of the presentation of risk by various high-profile cancer organisations.

There are parts of the book where Gigerenzer is more pessimistic about the ability to educate the masses, such as when he channels Nassim Taleb and berates the finance industry for not understanding the difference between risk and uncertainty. In a world of uncertainty – where we do not know the probability of events – simple rules often outperform more complex models that are overfitted to past data. This provides a natural entry point to Gigerenzer’s well-established work (and the subject of some of his better books) on the accuracy of heuristics. Risk Savvy has plenty of additional advocacy for their use, with Gigerenzer arguing that we can be trained to use heuristics to make better decisions. Gigerenzer covers areas from marriage (set your aspiration level and choose the first person who meets it) to business to the stability of financial institutions, building on decades of evidence he has accumulated on the accuracy of simple rules.

Gigerenzer’s heuristics don’t always match up with his optimism that we can make people risk savvy. One heuristic he suggests is: “If reason conflicts with a strong emotion, don’t try to argue. Enlist a conflicting and stronger emotion.” He also recognises the limits to education, with heuristics such as “don’t buy financial products you don’t understand.” But given that a lot of people don’t understand compound interest, we might need to rely on the Dunning-Kruger effect to allow people to follow this rule and still make any investments.

One interesting point made by Gigerenzer is that there is still a role for experts (and even consultants) in a world where we use simple heuristics. Suppose we replace our complex asset allocation models with a 1/N rule – allocate our assets equally across N choices. This still leaves questions such as the size of N, what to include in N, and when to rebalance. For many heuristics, there may be more complex underlying choices – although I imagine heuristics could be developed for many of these too.
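As a concrete illustration (my own sketch in Python, not anything from the book), the 1/N rule itself is almost trivial to write down, which makes it clear that the remaining work sits in the surrounding choices. The asset names and the five per cent drift tolerance below are illustrative assumptions.

```python
def one_over_n_targets(assets):
    """Equal-weight target allocation across the chosen assets (the 1/N rule)."""
    weight = 1.0 / len(assets)
    return {asset: weight for asset in assets}

def rebalance_trades(current_values, tolerance=0.05):
    """Trades (in currency units) needed to restore equal weights once any
    asset drifts more than `tolerance` from its 1/N target."""
    total = sum(current_values.values())
    targets = one_over_n_targets(list(current_values))
    drifted = any(
        abs(value / total - targets[asset]) > tolerance
        for asset, value in current_values.items()
    )
    if not drifted:
        return {}
    return {
        asset: targets[asset] * total - value
        for asset, value in current_values.items()
    }

# Example: a portfolio that has drifted after a good year for shares.
portfolio = {"domestic shares": 60_000, "international shares": 55_000, "bonds": 35_000}
print(rebalance_trades(portfolio))  # sell shares, buy bonds, back to equal thirds
```

The heuristic is one line; deciding what counts as an asset class and how much drift to tolerate before rebalancing is where an adviser could still earn their keep.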

Gigerenzer is also a stout defender of gut instinct – again, as covered in his other books. Gigerenzer suggests (and I agree) that data is often gathered because of a culture of defensive decision-making, not because the data actually drives the decision. This is, however, the weakest area of the book, as Gigerenzer’s stories reek of survivorship bias. Gigerenzer notes that leading figures in business reveal in surveys that they rely on gut instinct rather than data in making major decisions. But how many corpses of those who relied on gut instinct are strewn along the road of entrepreneurship?

As another example, Gigerenzer talks of a corporate headhunter who had put a thousand senior managers and CEOs into their positions. The headhunter said that nearly all the time he based his selection on a gut decision. He was now being replaced by tests administered by psychologists. Gigerenzer puts this down to a negative error culture, with the procedures designed to protect the decision makers. But what is the evidence that the headhunter was good at his job and could outperform the psychologists armed with tests? Similarly, Gigerenzer suggests listening to those with good track records in business. Again, survivorship bias could make this a useless exercise. When talking of predictions of exchange rates in other parts of the book, Gigerenzer effectively makes this very same point – the successful people you see in front of you could simply be the lucky survivors.

However, the evidence that Gigerenzer has developed in the past would make it folly for anyone in business to throw gut instinct out the window – or to completely discard Gigerenzer’s arguments. But the way he makes the case through Risk Savvy feels built on anecdote and weak examples.

There is one rule I am going to take away from the book – an extension of my usual habit of flipping a coin for decisions about which I’m indifferent. Gigerenzer suggests flipping a coin and as it spins, considering what side you don’t want to come up. He used this example in the context of choosing a partner, but it’s not a bad way to elicit that gut instinct that you can’t otherwise hear.

3 comments

  1. I’ve read four of Gerd’s books, and although much of Risk Savvy is work he has previously covered, I quite enjoyed it. Calculated Risks was the first book of Gerd’s that I read, and my favourite.

    I’m not sure I agree that Gerd’s recommendations would fall into the category of “nudging”. The tools he suggests (natural frequencies, fact boxes, heuristics, etc) are about presenting the information in a manner that makes it easier to comprehend, which allows the individual to be better informed when they make a decision. Unlike nudging, it does not try to sway the decision one way or another. Nudging, on the other hand, is a top-down method used by the nudger to try to persuade the nudgee into making a decision that the nudger wants them to make.

    I think Gerd’s grudge with “nudging” and the “irrational” view of human behaviour is mainly due to two reasons:
    1. “Nudging” and the “irrational view of human behaviour” side of behavioural science has received so much attention, both in the media and in popular psych books aimed at laypersons. The wisdom of gut instinct and heuristics are being drowned out, and Risk Savvy is probably Gerd’s response to this. It’s his version of getting the message out to the masses, and probably explains why so much of the book’s content is rehashed from his earlier work.
    2. The premise that the nudger knows what’s better for the nudgee than the nudgee himself does. Gerd believes it is better to empower individuals to be informed and make their own decisions rather than relying on an expert nudging them in the direction that the nudger believes is best. In other work Gerd has also criticised the “irrational” view of human behaviour because of its narrow definition of a rational optimiser, whereas Gerd argues we are “ecologically rational” (see Rationality For Mortals).

    1. Hi Gav, thanks for your thoughts.

      I’m not convinced that Gigerenzer isn’t interested in swaying the decision. Through the book, he suggests we’re having too much cancer screening, trusting financial charlatans whom we shouldn’t, not maximising our financial returns and driving when we should be flying. To recognise that people could make better decisions, you need at least some conception of whether they are currently making the right one. It also requires a reasonable degree of hair splitting to differentiate the philosophy behind presenting risk in natural frequencies from the work behavioural scientists have done on presenting credit card fees and the like (although it can be done).

      As a result, I still feel that the issue of “the nudger knows best” is more a matter of framing than substance. This comes through relatively starkly at times – Gigerenzer is more than happy to highlight the second-best decision making of defensive decision makers in business.

      That said, I agree with much of your point 1 and the second half of point 2 – particularly the point “The wisdom of gut instinct and heuristics are being drowned out”. But how about this for an alternative framing: “The behavioural scientists have shown a set of domains where people seem to make poor decisions. Many of these poor decisions can be explained by their use of heuristics in environments to which they are unsuited. However, those heuristics are amazingly useful and accurate tools in many other environments, as are our gut instincts – in some circumstances we are great decision makers. By analysing these tools and the environments they work well in, we can greatly expand our understanding of what the behavioural scientists are finding, and it gives us a new range of mechanisms by which we can help people make better decisions.”

      As an end note, I sense part of Gigerenzer’s dislike of nudging comes from the way nudging gets used in practice and the types of people putting it into action – the interventions end up looking nothing like nudges – and I’m certainly with him there.

  2. Hi Jason, thanks for taking the time to reply.

    I completely agree that Gigerenzer could probably frame his message a bit better (like your brilliantly written example above – I wish words came to me so easily), so as not to sound so aggressive towards the “irrational behaviour” and “nudging” crowds. I also think a few who write about irrational behaviour are just as harsh in their criticisms of System 1 type thinking. You only need to look at some of the book titles as examples: “Predictably Irrational” and “You Are Not So Smart”.

    Gigerenzer may be trying to sway the decision, but in a manner that informs and empowers the decision maker, which I believe is a little different from simply nudging them in the nudger’s preferred direction (even if it does make most nudgees better off). I guess that may make it a different type of nudge, but it’s just semantics. He does, however, try to inform the reader that he isn’t trying to push them one way or the other:
    “While the fact box clearly shows that there is no good reason to push women into screening, my point is exactly not to replace the old paternalistic message with a new one by telling women not to go to the screening. Every woman who wants to make her own decision should get the facts she needs – without being told what to do.”

    I also suspect Gigerenzer is a libertarian, which probably also plays a role in his dislike for nudging. Toward the end of the book, he talks about soft and hard paternalism, liberty and personal responsibility:
    “The term “paternalism” stems from the Latin pater, for father, and means treating adults like children. Paternalism limits the liberty of people, whether they like it or not, allegedly for their own good. Hard paternalism, like antismoking legislation, coerces people into behaving a particular way and can be morally defended as long as it protects people from being hurt by others. Soft paternalism, such as automatically enrolling people into organ donation programs unless they opt out, nudges people into behaving a particular way. The idea is that governments should steer people’s choices without coercing them. As a general policy, coercing and nudging people like a herd of sheep instead of making them competent is not a promising vision for democracy.

    “This book’s message is more optimistic about human liberty. People are not doomed to be at the mercy of governments and experts who know what is best for you and me. As I have shown for health and wealth, those are a rare breed: The average doctor or financial advisor has conflicting interests, practices defensive decision making, or does not understand the evidence. That’s why we have to think for ourselves and take responsibility into our own hands.”

    Again, thank you for taking the time to respond. I enjoy your posts.
