Saturday 7 June 2014

Nudge economics: has push come to shove for a fashionable theory?

A rival psychologist has published a book debunking the behavioural economics of Daniel Kahneman and the men behind Nudge, who, along with the authors of Freakonomics, were once the PM's pet thinkers. So how do you choose between them?


Blogger Ref Link http://www.p2pfoundation.net/Transfinancial_Economics

Illustration by Jack Hudson.
In a TED talk in Monterey, California, in February 2010, just before he came to power and had to make decisions, David Cameron was extremely keen to look like the future. "Politicians will only succeed if they actually try to treat people as they are, rather than as they would like them to be," was his fresh-faced rallying cry, as he attempted to channel the spirit of Bobby Kennedy in an open-necked shirt. "If you combine this very simple, very conservative thought – go with the grain of human nature! – with all the advances in behavioural economics, I think we can achieve a real increase in wellbeing, in happiness, in a stronger society without necessarily having to spend a whole lot more money."
This "revolution in government" would be brought about in particular, Cameron suggested, by his devotion to the theories of a group of thinkers who had come to establish and dominate a new self-help/psychology/economics corner of the bookshop, the decision-makers' decision-makers. "We are working with these people," Cameron said proudly, and flashed up a slide featuring three of them: Cass Sunstein and Richard Thaler, authors of the bestsellingNudge, and Daniel Kahneman, Nobel prize winner and author of the soon-to-be bestsellingThinking, Fast and Slow.
Cameron then gave a few examples of how this revolution would work in practice. The "best way" of making homes energy efficient, he suggested, was not by the "bullying or badgering of government" but through bills that showed citizens how much less their neighbours spent on gas and electricity than they did. The tricks of behavioural psychology (which, you couldn't help thinking, had been used by the advertising industry for about a century) could "nudge" the public into doing the right economic thing. The government would in this way embrace the lucrative counterintuitive lessons of Freakonomics, it would listen to evidence-based data, not political gut instinct, it would, as the authors of that latter bestselling book – Steven Levitt and Stephen Dubner – had it, "think like a freak".
With all this in mind, that spring, Cameron established his behavioural insights team, led by the former Blair adviser David Halpern, which immediately became known as his "nudge unit". In a book published this month, Levitt and Dubner, who made their names with a behavioural-economics agony aunt column in the New York Times, recall how they were called in to address this new team and the prime minister, back in 2010. Cameron could not have been more chuffed to meet them: "Right, where are the clever people?" This ebullience drained quite quickly, however, when Levitt and Dubner turned their particular brand of freak-thinking to the NHS.
"We tried to make our point with a thought-experiment," they recall. "We suggested to Mr Cameron that he consider a similar policy in a different arena. What if, for instance, every Briton were also entitled to a free, unlimited lifetime of transportation? That is, what if everyone were allowed to go down to the car dealership whenever they wanted and pick out any new model, free of charge and drive it home?"
At this, Levitt and Dubner expected the prime minister to "light up" and say: "I see your point about the free healthcare we are doling out!" They expected him to embrace their behaviouralist argument that no one places a value on anything unless they are charged for it. In fact Cameron said nothing at all, but "the smile left his eyes", there was a quick handshake and he "hurried off to find a less ridiculous set of people with whom to meet". The freakonomists can't help feeling this was a missed opportunity, though they concede "fixing a huge problem like runaway healthcare costs is about a thousand times harder than, say, figuring out how to take a penalty kick" (for which their book offers a more workable solution).
This was, in retrospect, perhaps also the moment when the prime minister began to lose a little of his messianic faith in his behavioural revolution. Certainly, in the years since he has come to power and had to make actual messy decisions, he has had a good deal less to say on the decision-making theorists about whom he was previously so evangelical. The nudge unit has had some successes in testing and rethinking how choices are presented to citizens, championing opt-out rather than opt-in pension provisions, for example, or proving the efficacy of labelling rubbish bins with the word "landfill" to encourage recycling. Some of Cameron's ambitions have come to seem cognitive fallacies in themselves, however: telling people how much their neighbours pay for electricity produces only about a 1% change in usage; traditional tough economic decisions – a carbon tax, or an increase in fuel duty – are far more effective in the goal of energy-saving.
Daniel Kahneman, the 'godfather' of behavioural economics, has been challenged by psychologist Gerd Gigerenzer, who claims that Kahneman presents 'an unfairly negative view of the human mind'. Photograph: Richard Saker
Though nudge economics remains seductive, what once seemed like a panacea has come to look a bit more like a series of sticking plasters. Earlier this year the nudge unit was removed from direct government control, partly sold to the Nesta innovation charity run by New Labour guru Geoff Mulgan, a move which seemed to suggest the prime minister no longer viewed it as quite so central to his philosophy. That move has coincided with a backlash against, or at least a critical analysis of, some of the tenets on which its brand of behavioural economics is based.
Cameron would not have seen it in these terms but in his freakonomics moment over the NHS he was also confronting an extremely crude version of one of the most heated academic debates of the last two decades: the question of whether purely rational decision-making is feasible in the real world. That has been part of an ongoing argument between the "godfather" of behavioural economics, Kahneman, and his most serious opponent, a psychologist named Gerd Gigerenzer, director of the Centre for Cognition and Adaptive Behaviour at the Max Planck Institute in Berlin. The substance of this argument concerns the best way for human beings to make decisions.
It is generally possible to judge the depth of an academic disagreement by the way the two principal opponents address each other in the footnotes of their life's work. In Kahneman's hugely influential Thinking, Fast and Slow he notes that "a prominent German psychologist has been our most persistent critic" and goes on to list the references to articles in which those criticisms have been made (a list that cumulatively makes Gigerenzer seem a little obsessive, a behaviouralist stalker). Back at the turn of the millennium, when Kahneman was not yet an academic superstar, he admitted to one interviewer that he was troubled by the feud. "It was embarrassing, the level of hostilities," he said. "Gigerenzer speaks very well. Even when he's completely wrong, it's hard not to be impressed… "
The shorthand of Gigerenzer's criticism then and now was that Kahneman presents "an unfairly negative view of the human mind". Or, as Gigerenzer himself explained it when I spoke to him on the subject in London last week, "in concentrating only on fallacies and biases Danny [Kahneman] pushes the idea that people are dumb." That shorthand – that because of various provable fallibilities in reasoning when making decisions, human beings are incapable of choosing the best outcome for themselves – is the basis of the philosophy behind nudge economics.
Gigerenzer thinks Kahneman is wrong. People are not stupid, he argues, just ill-educated in "risk literacy". Gigerenzer does not, for his part, mention Kahneman's name once in his new 320-page book, Risk Savvy, though arguably the entire volume is dedicated to a dismantling of some of the tenets of Thinking, Fast and Slow.
Gigerenzer talks about his new book on the Guardian's Science Weekly podcast.
In an increasingly complex and specialised world, Gigerenzer preaches a gospel of greater simplicity. He suggests that the outcomes of decisions of any real complexity – anything as involved as organising a successful picnic, or more so – are impossible to predict accurately with any mathematical rational model, and are therefore more usefully approached with a mixture of gut instinct and what he calls heuristics, the learned rules of thumb of any given situation. He believes, and he has some evidence to prove it, that such judgments prove sounder in practice than those based purely on probability.
Gerd Gigerenzer, who believes that, with education – the teaching of critical thinking about statistical probability – people can become more usefully 'risk savvy'.
"This is not," Gigerenzer concedes, "an easy message to convey to economists, who have been trained in optimisation techniques or to managers in corporations who believe that one can maximise every decision." There are, too, plenty of people who have a vested interest in preserving the idea that the future is complex but calculable, even if in most cases they know that the illusion of certainty that they present is untrue.
In Gigerenzer's view this group includes much of the medical profession, most financial analysts and advisers, and, of course, many academics. The desire for ever greater complexity in the process of decision-making, driven by ever greater access to data, in practice produces what he calls risk-averse "defensive decision-making", or covering your backside. You don't do what your instinct and experience tell you is right: you find the data to support an inferior, but less personally risky, choice.
Gigerenzer believes above all in the power of simple rules in the real, unfathomably complex world. "Probability theory is the best thing in a world where you can measure the risks exactly and the parameters are not too complicated. But for most problems it provides another illusion of certainty, and becomes part of the problem," he argues. His favourite example is that of Harry Markowitz, who won a Nobel prize in economics for creating a formula to produce maximum gain from an investment portfolio, in a sophisticated mathematical weighting of gain and risk. When it came to investing for his own retirement, however, Markowitz didn't bother with his formula – which Gigerenzer has shown to be certainly effective only over a 500-year period. He simply divided his investment equally among a given number of assets. He used the heuristic of "not putting all his eggs in one basket" because he knew that would probably be good enough.
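To make the contrast concrete, here is a minimal sketch (not taken from Gigerenzer's book or from Markowitz's papers): it pits a Markowitz-style weighting, with a simple minimum-variance portfolio standing in for the full mean-variance formula, against the 1/N "equal split" heuristic, using invented annual returns for three hypothetical assets.

```python
# A minimal sketch, assuming invented return figures: a Markowitz-style
# minimum-variance weighting versus the 1/N "equal split" heuristic.
import numpy as np

# Hypothetical annual returns for three assets over five years (rows = years).
returns = np.array([
    [0.08, 0.12, 0.02],
    [0.03, -0.05, 0.04],
    [0.10, 0.15, 0.01],
    [-0.02, 0.07, 0.03],
    [0.06, 0.09, 0.02],
])

mean = returns.mean(axis=0)          # estimated expected return per asset
cov = np.cov(returns, rowvar=False)  # estimated covariance between assets

# Markowitz-style minimum-variance weights: inverse covariance times a vector
# of ones, rescaled so the weights sum to one.
ones = np.ones(len(mean))
inv_cov = np.linalg.inv(cov)
markowitz_weights = inv_cov @ ones / (ones @ inv_cov @ ones)

# Gigerenzer's 1/N heuristic: ignore the estimates and split the money equally.
equal_weights = ones / len(mean)

print("Markowitz weights:", np.round(markowitz_weights, 3))
print("1/N weights:      ", np.round(equal_weights, 3))
print("Estimated return, Markowitz:", round(float(markowitz_weights @ mean), 4))
print("Estimated return, 1/N:      ", round(float(equal_weights @ mean), 4))
```

The point of the comparison is Gigerenzer's: the optimised weights depend entirely on covariance estimates made from a handful of noisy observations, while the equal split needs no estimates at all.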
Gigerenzer often talks with investment analysts and bankers who, despite the illusions of sophisticated risk management they sell to their clients, tell him that more often than not they tend to rely on similar rules of thumb when it comes to placing their own bets. Politicians, he observes, also routinely trade on our inability to gauge competing risks, not least in exploiting anxieties over terrorism to erode civil liberties. Though Kahneman himself carefully limits its potential political application, his argument that we are irretrievably in thrall to our fallacies, in Gigerenzer's view, only strengthens the argument for such paternalism from government. Nudge theory becomes the more palatable expression of a deliberate wider manipulation. It makes us weaker and less questioning citizens.
Gigerenzer proposes an alternative solution. He believes that, with education – the teaching of critical thinking about statistical probability – people can become more usefully "risk savvy" (again, you would have to say that Kahneman's work absolutely shares this ambition, even if his route to it is different, and arguably more rigorously scientific).
The day before I spoke to him, Gigerenzer had met the newly privatised behavioural insights team. How did that go? He told them two things: first, "that given the will you can teach even fourth-graders critical reasoning"; and second, that "I am not against a bit of nudging here and there – but it can never be a philosophy for a country." Nudging is, he argues to me, more evidence of political cowardice, of giving up on trying to inform or make clear arguments. A political cop-out. Why not, he asks, "instead do something sustainable? Teach children statistical thinking, not as a branch of mathematics, but with evidence from the real world, in health and finance. And teach them heuristics, give them a tool kit that will help them to understand risk and question choices."
He offers one such heuristic, which, if ingrained, might have saved a lot of grown-up difficulties in recent years: "Never buy financial products you do not understand." And another, which he swears by: "At a restaurant, don't study the menu for clues, say to the waiter, 'if you were eating here tonight what would you have?'"
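The statistical half of that toolkit can be illustrated with a small worked example (the screening-test numbers below are invented for this sketch, not taken from Risk Savvy): the same question, "given a positive test, how likely is the condition?", answered once with Bayes' rule and once by counting through an imaginary group of 1,000 people, the natural-frequency style of reasoning Gigerenzer advocates teaching.

```python
# A hypothetical illustration of "risk literacy": one screening-test question
# answered two ways. All figures are invented for the sketch.

prevalence = 0.01           # assume 1% of people have the condition
sensitivity = 0.90          # the test catches 90% of true cases
false_positive_rate = 0.09  # and wrongly flags 9% of healthy people

# 1) Bayes' rule, stated abstractly.
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive_rate
p_condition_given_positive = prevalence * sensitivity / p_positive

# 2) The same sum as natural frequencies: imagine 1,000 people.
people = 1000
sick = people * prevalence                               # 10 people have the condition
true_positives = sick * sensitivity                      # 9 of them test positive
false_positives = (people - sick) * false_positive_rate  # about 89 healthy people also test positive

print(round(p_condition_given_positive, 3))                           # roughly 0.092
print(round(true_positives / (true_positives + false_positives), 3))  # the same answer, about 0.092
```

Both routes give the same answer, a little over 9%, but the second makes it visible why a positive result is still far more likely to be a false alarm than a true case.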
The one certainty of our lives is that we are never going to be stuck for uncertainty. Choices multiply along with information. So which book should you buy, which philosophy should you follow in the ever-growing literature of the art and science of making decisions? You decide.
