In 1999, an English solicitor named Sally Clark went on trial for the murder of her two infant sons. She claimed both succumbed to sudden infant death syndrome. An expert witness for the prosecution, Sir Roy Meadow, argued that the odds of SIDS claiming two children from such an affluent family were 1 in 73 million, likening it to the odds of backing an 80-1 horse in the Grand National four years in a row and winning every time.
The jury convicted Clark, and she was sentenced to life in prison.
But the Royal Statistical Society issued a statement after the verdict insisting that Meadow had erred in his calculation and that there was “no statistical basis” for his stated figure. Clark’s conviction was overturned on appeal in January 2003, and the case has become a canonical example of the consequences of flawed statistical reasoning.
A new study examined why people struggle so much to solve statistical problems, particularly why we show a marked preference for complicated solutions over simpler, more intuitive ones. Chalk it up to our resistance to change. The study concluded that fixed mindsets are to blame: we tend to stick with the familiar methods we learned in school, blinding us to the existence of simpler solutions.
“As soon as you pick up a newspaper, you’re confronted with so many numbers and statistics that you need to interpret correctly.”
Roughly 96 percent of the general population struggles with solving problems relating to statistics and probability. Yet being a well-informed citizen in the 21st century requires us to be able to engage competently with these kinds of tasks, even if we don’t encounter them in a professional setting. “As soon as you pick up a newspaper, you’re confronted with so many numbers and statistics that you need to interpret correctly,” says co-author Patrick Weber, a graduate student in math education at the University of Regensburg in Germany. Most of us fall far short of the mark.
Part of the problem is the counterintuitive way in which such problems are typically presented. Meadow presented his evidence in the so-called “natural frequency format” (for example, 1 in 10 people), rather than in terms of a percentage (10 percent of the population). That was a smart decision, since 1-in-10 is a more intuitive, jury-friendly approach. Recent studies have shown that performance rates on many statistical tasks increased from four percent to 24 percent when the problems were presented using the natural frequency format.
That makes sense, since calculating a posterior probability is complicated: according to Weber, it requires three multiplications and one addition before dividing one resulting term by the other. In contrast, just one addition and one division are needed with the natural frequency format. “With natural frequencies, you have one reference set that you can vividly imagine,” says Weber. The probability format is more abstract and less intuitive.
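The difference in arithmetic between the two formats can be sketched in a few lines of Python. The numbers below are hypothetical, chosen only to illustrate the two routes to the same answer; they are not from the study.

```python
# Probability format: Bayes' rule applied to rates.
def posterior_from_rates(base_rate, sensitivity, false_alarm_rate):
    true_positives = base_rate * sensitivity                      # multiplication
    false_positives = (1 - base_rate) * false_alarm_rate          # two more multiplications
    return true_positives / (true_positives + false_positives)    # one addition, one division

# Natural frequency format: counts from a single imagined reference population.
def posterior_from_counts(true_positive_count, false_positive_count):
    return true_positive_count / (true_positive_count + false_positive_count)

# Hypothetical screening test: 1% base rate, 90% sensitivity, 5% false alarms.
rates_answer = posterior_from_rates(0.01, 0.90, 0.05)

# The same situation as counts in a population of 10,000 people:
# 100 affected, of whom 90 test positive; 9,900 unaffected, of whom 495 test positive.
counts_answer = posterior_from_counts(90, 495)

print(rates_answer, counts_answer)  # both roughly 0.154
```

Both functions return the same value; the natural frequency version simply hides the multiplications inside the counts, leaving only the addition and division Weber describes.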
A Bayesian task
But what about the remaining 76 percent who can’t solve these kinds of problems? Weber and his colleagues wanted to figure out why. They recruited 180 students from the university and presented them with two sample problems in so-called Bayesian reasoning, framed in either a probability format or a natural frequency format.
This involves giving subjects a base-rate statistic—say, the probability of a 40-year-old woman being diagnosed with breast cancer (1 percent)—along with a sensitivity element (a woman with breast cancer will get a positive result on her mammogram 80 percent of the time) and a false alarm rate (a woman without breast cancer still has a 9.6 percent chance of getting a positive result on her mammogram). So if a 40-year-old woman tests positive for breast cancer, what is the probability she actually has the disease (the “posterior” probability estimate)?
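Plugging the mammogram numbers into Bayes' rule (a quick check of the arithmetic, not a calculation from the study itself) gives an answer most people find surprisingly low:

```python
# Mammogram problem in probability format.
base_rate = 0.01        # P(cancer) for a 40-year-old woman
sensitivity = 0.80      # P(positive test | cancer)
false_alarm = 0.096     # P(positive test | no cancer)

true_positive = base_rate * sensitivity
false_positive = (1 - base_rate) * false_alarm
posterior = true_positive / (true_positive + false_positive)

print(f"{posterior:.1%}")  # prints 7.8%
```

Despite the positive test, the posterior probability is only about 7.8 percent, because false alarms among the 99 percent of healthy women vastly outnumber true positives.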
The mammogram problem is so well known that Weber and his colleagues came up with their own problems. For instance, the probability of a randomly picked person from a given population being addicted to heroin is 0.01 percent (the base rate). If the person selected is a heroin addict, there is a 100 percent probability that person will have fresh needle marks on their arm (the sensitivity element). However, there is also a 0.19 percent chance that the randomly picked person will have fresh needle marks on their arm even if they are not a heroin addict (the false-alarm rate). So what is the probability that a randomly picked person with fresh needle marks is addicted to heroin (the posterior probability)?
Here is the same problem in the natural frequencies format: 10 out of 100,000 people will be addicted to heroin. And 10 out of 10 heroin addicts will have fresh needle marks on their arms. Meanwhile, 190 out of 99,990 people who are not addicted to heroin will nonetheless have fresh needle marks. So what percentage of the people with fresh needle marks is addicted to heroin?
In both cases, the answer is five percent, but the process by which one arrives at that answer is far simpler in the natural frequency format. The set of people with fresh needle marks on their arms is the sum of all the heroin addicts (10) plus the 190 non-addicts. Divide 10 by that 200, and you have the correct answer.
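Both routes to the five-percent answer can be verified in a few lines, using the numbers straight from the problem statements:

```python
# Natural frequency route: one addition, one division.
addicts_with_marks = 10
non_addicts_with_marks = 190
freq_answer = addicts_with_marks / (addicts_with_marks + non_addicts_with_marks)

# Probability route: the same question via Bayes' rule on the stated rates.
base_rate = 0.0001       # 0.01 percent
sensitivity = 1.0        # 100 percent
false_alarm = 0.0019     # 0.19 percent
prob_answer = (base_rate * sensitivity) / (
    base_rate * sensitivity + (1 - base_rate) * false_alarm
)

print(freq_answer, prob_answer)
# Both are about 0.05, i.e. five percent (the probability route differs in the
# fourth decimal place because 190/99,990 was rounded to 0.19 percent).
```

The frequency route needs only the two lines of arithmetic; the probability route reproduces the multiplications, addition, and division that trip most solvers up.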
A fixed mind
The students had to show their work, so it would be easier to follow their thought processes. Weber and his colleagues were surprised to find that even when presented with problems in the natural frequency format, half the participants didn’t use the simpler method to solve them. Rather, they “translated” the problem into the more challenging probability format with all the extra steps, because it was the more familiar approach.
That is the essence of a fixed mindset, also known as the Einstellung effect. “We have previous knowledge that we incorporate into our decisions,” says Weber. That can be a good thing, enabling us to make decisions faster. But it can also blind us to new, simpler solutions to problems. Even expert chess players are prone to this. They ponder an opponent’s move and choose the tried-and-true counter-strategy they know so well, when there might be an easier way to put their opponent in checkmate.
“You can rigorously define these natural frequencies mathematically.”
Weber proposes that one reason this happens is that students are simply overexposed to the probability format in their math classes. This is partly an issue with the standard curriculum, but he speculates another factor might be a prejudice among teachers that natural frequencies are somehow less mathematically rigorous. That is not the case. “You can rigorously define these natural frequencies mathematically,” Weber insists.
Changing this mindset is a tall order, requiring on the one hand a redesign of the math curriculum to incorporate the natural frequency format. But that won’t have much of an impact if the teachers aren’t comfortable using it either, so universities will also need to incorporate it into their teacher training programs. “This would give students a helpful tool to understand the concept of uncertainty, in addition to the standard probabilities,” says Weber.