Monday, 23 January 2012

Nudging individuals to improve decisions


Carolyn is the director of food services for a large city school system. As part of her work she gives the directors of school cafeterias specific instructions on how to display the food choices.
Her friend Adam, who designs supermarket floor plans, told her that by re-arranging the cafeteria displays she could increase or decrease the consumption of many food items.
He was right: she ran the experiment and found that purchases of the re-arranged items increased or decreased by about 25 percent.
Her influence can be exercised for better or for worse: she can increase the consumption of healthy foods and decrease the consumption of unhealthy ones.

She is the choice architect in Thaler and Sunstein’s (2008) view. A choice architect has responsibility for organizing the context in which people make decisions. As this simple example suggests, many real people turn out to be choice architects, most without realizing it (e.g. a doctor describing the alternative treatments available to a patient). That is because of the framing effect: the choice depends on how the problem is presented.
Small and apparently insignificant details can have major impacts on people’s behavior: everything matters. In many cases, the power of these small details comes from focusing the attention of users in a particular direction (e.g. at Schiphol Airport in Amsterdam, the authorities have etched the image of a black housefly into each urinal, reducing spillage by 80%).
The insight that everything matters can be both paralyzing and empowering. Good architects realize that although they can’t build the perfect building, they can make some design choices that will have beneficial effects.
Very often decisions would be made differently if the individual paid full attention and possessed complete information, unlimited cognitive abilities, and complete self-control.
As Herbert Simon argued, real humans are not (and cannot be) as fully rational as economists assume them to be (Simon, 1957). We cannot make perfect forecasts, but we should make unbiased ones.
We err.
We plan in the morning to have cake at dinner, and then at dinner we eat ice cream instead, forgetting the cake in the fridge.

Before continuing, we need to dispel a false assumption and a misconception.
The false assumption is that almost all people, almost all of the time, make choices that are in their best interest or at the very least are better than the choices that would be made by someone else.
The misconception is that it is possible to avoid influencing people’s choices. In many situations, some organization or agent must make a choice that will affect the behavior of other people. In such cases there is no way of avoiding nudging in some direction, and, whether intended or not, these nudges will affect what people choose.
Studies have shown that we have two systems of thinking: the automatic and the reflective. The distinction is between thinking that is intuitive and automatic on the one hand, and reflective and rational on the other (Chaiken and Trope, 1999).
The automatic system is rapid and is, or feels, instinctive: we use it when we duck because a ball is thrown at us unexpectedly, or when we get nervous as our airplane hits turbulence. The reflective system is more deliberate and self-conscious. We use it when deciding which route to take for a trip, or which course to attend this semester.
One way to think about all this is that the automatic system is our gut reaction and the reflective system is our conscious thought. Gut feelings can be quite accurate, but we often make mistakes because we rely too much on our automatic system.
Since our automatic system is faster than the reflective one at making decisions, people have developed thousands of rules of thumb. We use rules of thumb to help us make judgments, such as estimating the distance between Cleveland and Philadelphia (or between Milan and Rome: my answer is 5 to 6 hours by car, but I don’t know the actual distance).
Although rules of thumb can be very helpful, their use can also lead to systematic biases (Tversky and Kahneman, 1974). A bias is a human tendency to make systematic errors in certain circumstances, based on cognitive factors rather than evidence. Such biases can result from information-processing shortcuts called heuristics, or rules of thumb.
Anchoring: we estimate the magnitude of something by starting from a known magnitude and adjusting a little from it; if you live in a city of 3 million, you will estimate the population of the next city as a bit bigger or smaller than your own. The problem is that we may be way off, yet we still believe in our estimates.
Anchors serve as nudges. They can influence the figure we will choose in a particular situation by ever-so-subtly suggesting a starting point for our thought process.
Availability: how much you worry about something depends on how imminent a threat it is perceived to be. We worry more about swine flu than about obesity because the threat of obesity is not available to our minds at this time. We are constantly reminded of the threat of swine flu, but obesity is much more dangerous: it is responsible for 112,000 deaths and for more than 100,000 cases of cancer annually in the USA.
People assess the likelihood of risks by asking how readily examples come to mind. Biased assessments of risk can perversely influence how we prepare for and respond to crises, business choices, and the political process. These decisions may be improved if judgments can be nudged back in the direction of true probabilities.
Representativeness: we tend to find meaning in random patterns that suit our theories. People often see patterns because they construct their informal tests only after looking at the evidence.
Optimism and overconfidence: we tend to overestimate our chances of success in marriage, in business, in investing in mortgage securities, etc.
Unrealistic optimism can explain a lot of individual risk taking, especially in the domain of risks to life and health. Asked to envision their future, students typically say that they are far less likely than their classmates to be fired from a job, to have a heart attack, to get cancer, etc.
Loss aversion: we don’t like to lose, and this reinforces inertia. Inertia means that things tend to remain in their resting state or trajectory unless an external force is applied. Most people don’t bother changing the settings of their electronics, retirement plans, or magazine subscriptions, even when these entail a significant monetary loss. The way in which the default option is defined is therefore very important, and the power of inertia can be harnessed.
Kahneman et al. (1991) showed experimentally that people who received coffee mugs for free were not willing to sell them. The results show that those with mugs demand roughly twice as much to give up their mugs as others are willing to pay to get one. Once I have a mug, I don’t want to give it up. Loss aversion operates as a kind of cognitive nudge, pressing us not to make changes, even when changes are very much in our interests.
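A minimal back-of-the-envelope sketch of why this produces a roughly two-to-one gap (my own illustration, not taken from the book), assuming a simple piecewise-linear value function in which losses loom about λ ≈ 2 times larger than gains: let m be the value of owning the mug. A non-owner weighing a purchase compares the gain of the mug against the money paid, so the most they will pay is about

WTP ≈ m

An owner weighing a sale codes giving up the mug as a loss worth about λ·m, so the least they will accept is about

WTA ≈ λ·m ≈ 2m

which gives WTA / WTP ≈ λ ≈ 2, consistent with the ratio observed in the mug experiments.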
Status quo bias: if you have to call to cancel a subscription, you are more likely to stay subscribed because you follow the “yeah, whatever” pattern, have no time, or pay no attention. Default options attract a large market share; people have a more general tendency to stick with their current situation (Samuelson and Zeckhauser, 1988). Setting the best possible default is very important in inducing change.
Framing: choice depends on how the problem is stated. If a doctor says that ninety out of a hundred people do well with a particular procedure, you are bound to be more reassured, and more likely to undergo the procedure, than if the doctor tells you that ten out of a hundred people die from it. Information campaigns framed in terms of losses are much more effective: if you tell someone “you will save X amount if you do Y”, they are less likely to do Y than if you say “you will lose X amount if you don’t do Y”.
Apart from these biases, there are many other factors determining our choice behavior at a given time. Temptation is a big one.
If a bowl of nuts is placed on a table during drinks before dinner, it will likely be consumed in its entirety, even though people would rather eat just a few and wait for dinner; most will have a hard time stopping and would actually prefer it if someone took the nuts away. Most people realize that temptation exists and take steps to overcome it; think of Ulysses, who had himself tied to the mast to resist the Sirens.
Temptation is linked to self-control problems. Thaler and Sunstein (2008) say that each of us is made of two parts: the planner, who thinks about long-term welfare, and the doer, who acts somewhat automatically in search of immediate satisfaction (everyone’s inner Homer Simpson). In addition, we tend to do things on autopilot. Food is a particular problem; we can mindlessly eat whatever is put in front of us until we finish it. Bigger packages mean more eating, because we are simply not paying attention.
To deal with the doer, the planner applies plenty of self-control strategies in everyday life. Christmas clubs, for example, make little sense in economic terms, but psychologically they help people save for Christmas expenses. In Italy we don’t have them, but in many job contracts the salary is divided into 13 or 14 installments instead of 12, with the additional payments made at Christmas and before the summer holidays.

This article was intended to briefly summarize and review the book Nudge, in order to introduce the concept of nudging, my latest research interest.
The book is a well-flowing review of the hidden traps in decision making, a good overview of some important assumptions of behavioral psychology and, finally, a manifesto for politicians to adopt policies that could steer people’s behavior in the right direction, for themselves, for the economy, and for the nation.
The most interesting parts of the book are the introduction and the first part. After these, the authors present a series of chapters dedicated to hot topics in U.S. politics, such as welfare reform, how to steer people’s investment choices towards saving for retirement, or how to nudge people towards better behavior on health issues.
The core element of the book is its richness of real-life examples: nothing is presented from a purely theoretical point of view; the theory is introduced and then explained through several real-life examples.

Cited references

CHAIKEN, S. & TROPE, Y. 1999. Dual Process Theories in Social Psychology, New York, Guilford.
THALER, R. & SUNSTEIN, C. 2008. Nudge: Improving Decisions About Health, Wealth, and Happiness, New Haven & London, Yale University Press.


