“
In 2012, psychologists Richard West, Russell Meserve, and Keith Stanovich tested the blind-spot bias—an irrationality where people are better at recognizing biased reasoning in others but are blind to bias in themselves. Overall, their work supported, across a variety of cognitive biases, that, yes, we all have a blind spot about recognizing our biases. The surprise is that blind-spot bias is greater the smarter you are. The researchers tested subjects for seven cognitive biases and found that cognitive ability did not attenuate the blind spot. “Furthermore, people who were aware of their own biases were not better able to overcome them.” In fact, in six of the seven biases tested, “more cognitively sophisticated participants showed larger bias blind spots.” (Emphasis added.) They have since replicated this result. Dan Kahan’s work on motivated reasoning also indicates that smart people are not better equipped to combat bias—and may even be more susceptible. He and several colleagues looked at whether conclusions from objective data were driven by subjective pre-existing beliefs on a topic. When subjects were asked to analyze complex data on an experimental skin treatment (a “neutral” topic), their ability to interpret the data and reach a conclusion depended, as expected, on their numeracy (mathematical aptitude) rather than their opinions on skin cream (since they really had no opinions on the topic). More numerate subjects did a better job at figuring out whether the data showed that the skin treatment increased or decreased the incidence of rashes. (The data were made up, and for half the subjects, the results were reversed, so the correct or incorrect answer depended on using the data, not the actual effectiveness of a particular skin treatment.) 
When the researchers kept the data the same but substituted “concealed-weapons bans” for “skin treatment” and “crime” for “rashes,” now the subjects’ opinions on those topics drove how subjects analyzed the exact same data. Subjects who identified as “Democrat” or “liberal” interpreted the data in a way supporting their political belief (gun control reduces crime). The “Republican” or “conservative” subjects interpreted the same data to support their opposing belief (gun control increases crime). That generally fits what we understand about motivated reasoning. The surprise, though, was Kahan’s finding about subjects with differing math skills and the same political beliefs. He discovered that the more numerate people (whether pro- or anti-gun) made more mistakes interpreting the data on the emotionally charged topic than the less numerate subjects sharing those same beliefs. “This pattern of polarization . . . does not abate among high-Numeracy subjects. Indeed, it increases.” (Emphasis in original.) It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.
”
Annie Duke (Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts)