Do heuristics help us make good decisions in uncertain times?

Source: Biju Dominic, Mint

Do heuristics, the shortcuts that the brain takes, support efficient decision-making, or do they impede it?

Daniel Kahneman's Nobel Prize in 2002 not only brought the field of behavioural economics into the limelight, it also helped establish that human decisions are riddled with biases. These biases, which are systematic deviations from rationality, result from heuristics applied incorrectly or in unintended contexts. One frequently used strategy to tackle biases is the "nudge", as suggested by University of Chicago economist Richard Thaler and Harvard Law School professor Cass Sunstein in their book Nudge: Improving Decisions About Health, Wealth, and Happiness.

The David Cameron government in the UK and the Barack Obama administration in the US set up "nudge units", with Sunstein serving as an adviser, and Thaler's Nobel Prize in 2017 further consolidated the notion that decision-making has to be "de-biased".

However, there is another side to the story. Humans have neither unlimited resources nor unlimited time to make decisions. So the brain has evolved smart heuristics, shortcuts that enable efficient decisions. In this worldview, heuristics are an integral part of decision-making. This theory was put forward by Gerd Gigerenzer, director emeritus of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development and director of the Harding Center for Risk Literacy, both in Berlin. He believes that behavioural economics is tainted by a "bias bias", the tendency to spot biases even where there are none.

Gigerenzer believes that logic and related deliberate systems have monopolized the Western philosophy of mind for too long. His approach is based on a strong belief in the power of the brain's unconscious processes. His body of work illuminates the intelligence of the unconscious: our ability to know, without thinking, which rule is likely to work in which situation.

Gigerenzer has shown that smart heuristics can be even more efficient than decisions taken after much deliberation. He showed that a simple heuristic, the 1/N rule (allocate your money equally across each of the N funds), tended to provide equal or better returns than portfolios chosen by financial experts.
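The 1/N rule is simple enough to express in a few lines. The fund names and the amount below are purely hypothetical, a minimal sketch of the allocation logic rather than investment advice.

```python
def one_over_n(amount, funds):
    """Split `amount` equally across the given funds (the 1/N rule)."""
    share = amount / len(funds)
    return {fund: share for fund in funds}

# Hypothetical example: 12,000 split across three funds.
allocation = one_over_n(12000, ["fund_a", "fund_b", "fund_c"])
print(allocation)  # every fund receives 4000.0
```

The appeal of the rule is that it needs no estimates of expected returns or correlations, which is exactly the kind of information that is unreliable under uncertainty.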

In his book Gut Feelings, Gigerenzer gives the example of how players catch a ball, whether in baseball or cricket. They do not solve complex differential equations in their heads to predict the trajectory of the ball. Instead, players use a simple heuristic. When a ball comes in high, the player fixates on the ball and starts running. The heuristic is to adjust the running speed so that the angle of gaze (the angle between the eye and the ball) remains constant. The player can ignore all the information needed to compute the trajectory, such as the ball's initial velocity, distance and angle, and focus on one piece of information: the angle of gaze. This is an example of what Gigerenzer calls a smart heuristic, part of the adaptive toolbox that has evolved in humans over millions of years.
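The geometry behind the gaze heuristic can be checked in a few lines. The launch velocities and the gaze angle below are arbitrary assumptions for illustration; the point is that if the gaze angle stays constant, the player's position is pinned to the ball's current height, so when the ball reaches the ground the player is standing exactly where it lands.

```python
import math

# Hypothetical projectile (all values assumed for illustration).
g = 9.8
vx, vy = 8.0, 20.0
t_land = 2 * vy / g       # time of flight
x_land = vx * t_land      # where the ball lands

alpha0 = math.radians(50)  # the fixed gaze angle; any constant angle works

def ball(t):
    """Ball position (horizontal, vertical) at time t."""
    return vx * t, vy * t - 0.5 * g * t * t

def player_position(t):
    """Where the player must stand so the gaze angle to the ball is alpha0.

    tan(alpha0) = height / (player_x - ball_x), so the player's position
    is fully determined by the ball's current height.
    """
    bx, by = ball(t)
    return bx + by / math.tan(alpha0)

# At landing the ball's height is zero, so the gap closes to zero.
print(round(player_position(t_land), 2), round(x_land, 2))
```

No trajectory computation is needed along the way: tracking the single quantity alpha0 is enough to guarantee interception, which is exactly Gigerenzer's point.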

One other key thought put forward by Gigerenzer is that there is a big difference between risk and uncertainty. You are dealing with risk when you know all the alternatives, outcomes and their probabilities. You are dealing with uncertainty when you do not know all the alternatives, outcomes or their probabilities. So, when dealing with risk, complex mathematical models and other optimization models work. When dealing with uncertainty, they do not work well, because we live in a dynamic world. According to Gigerenzer, smart heuristics help humans make good decisions in times of uncertainty.

Gigerenzer’s other body of work has been in risk evaluation and communication. He believes that humans can be taught to better evaluate risk. Consider this situation from the field of healthcare—mammograms to detect breast cancer. The chance that a woman in the age group of 40-50 has cancer is roughly 1%. If a woman has breast cancer, the probability that she will test positive on a mammogram is 80%. The false-positive rate—that is, the probability of testing positive even if she doesn’t have cancer—is approximately 10%. In this scenario, how do you answer the question of a woman who’s just tested positive for cancer?

Gigerenzer has found that most doctors, like most humans, do not understand conditional probabilities. So most doctors in the above case tend to inform the woman that she has close to a 90% probability of having breast cancer. He therefore developed a simple diagram that explained the probabilities involved. Out of every 1,000 women who get tested, 990 (99%) will not have breast cancer. Out of these 990 women, 99 (10%) will test positive due to false positives. Out of the 10 who are likely to have breast cancer, 8 (80%) will test positive. This means that the chance that a woman has breast cancer given that she tests positive is 8/(8+99), or about 7.5%. It has been found that if doctors are trained to translate conditional probabilities into natural frequencies, their ability to communicate risk to their patients goes up dramatically. Just imagine the huge difference this can make to customer satisfaction in the healthcare business.
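The natural-frequency arithmetic above can be laid out step by step. The numbers are exactly those in the text; only the variable names are ours.

```python
# Natural-frequency restatement of the mammogram numbers in the text.
population = 1000
base_rate = 0.01            # ~1% of women aged 40-50 have breast cancer
sensitivity = 0.80          # probability of a positive test given cancer
false_positive_rate = 0.10  # probability of a positive test without cancer

with_cancer = population * base_rate                    # 10 women
true_positives = with_cancer * sensitivity              # 8 test positive
without_cancer = population - with_cancer               # 990 women
false_positives = without_cancer * false_positive_rate  # 99 test positive

# Chance of cancer given a positive test: 8 / (8 + 99)
posterior = true_positives / (true_positives + false_positives)
print(round(posterior * 100, 1))  # about 7.5 (percent)
```

Counting whole women instead of multiplying conditional probabilities is precisely the "natural frequencies" translation that Gigerenzer teaches doctors.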

The bias behavioural economists have towards heuristics and biases has given the discourse about decision-making far too negative a frame. Gigerenzer's work should help us balance the narrative and bring the focus back to the role of those same heuristics in helping us make efficient and effective decisions.

 
