For a practical persuader, one of the most disturbing features of some biases is their self-sealing nature. When an audience suffers from confirmation bias, for example, you would hope that they could be taught out of it. That is, if they're primed to think that all big corporations are evil, then you might think the prescription would be a few doses of "No, they're not." But that's the thing about confirmation bias: your audience will notice, understand, trust, and remember all the examples that support their pre-existing belief, and they'll be prone to dismiss the rest. There is a spotlight on evidence that supports what we already believe: That's the best evidence, and the evidence that is most mentally accessible. The bias is powerful, and it protects and perpetuates itself. So does that mean any effort to persuade an audience away from the bias is doomed? Not according to psychologist Tom Stafford in an article on BBC.com from earlier this year. The article is entitled, "How to Get People to Overcome Their Bias," and carries the subtitle, "Asking them to be fair, impartial and unbiased is not enough."
Dr. Stafford reports on a classic psychology experiment conducted by researchers at Stanford University (Sawin, 1988). Charles Lord and his colleagues recruited people with strong attitudes either supporting or opposing the death penalty, then presented them with evidence that either confirmed or refuted the effectiveness of the death penalty. According to Stafford, "Confirmatory evidence strengthened people's views, as you'd expect, but so did disconfirmatory evidence." That is called biased assimilation. That part was not new, but then Lord's team reran the study with two different instructions. One instruction was motivation-focused, asking participants to be "as objective and unbiased as possible" in considering the evidence, and to see themselves "as a judge or juror asked to weigh all of the evidence in a fair and impartial manner." The other instruction was cognition-focused: "Ask yourself at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue."
That instruction meant that if, for example, a given study showed the death penalty lowered murder rates, participants were asked to analyze the study's methodology while imagining that the results had come out the other way. That is called the "consider the opposite" instruction. And the results of the two instructions were substantially different. The motivational instruction simply did not work: Participants still showed the same assimilation bias, with the new information strengthening their existing views regardless of whether it reinforced or refuted the effectiveness of the death penalty. Supporters became more supportive, and opponents became more opposed.
But those receiving the "consider the opposite" instruction avoided the biased assimilation entirely: Unlike the others, they did not become more certain of their prior views regardless of the new evidence. Asking them to imagine that the results pointed in the opposite direction apparently encouraged them to think about how they were processing the information, and that worked against the bias. The moral is one that is critical for lawyers and judges to take to heart. According to Dr. Stafford, "Wanting to be fair and objective alone isn't enough. What's needed are practical methods for correcting our limited reasoning."
Implications for Trial
One implication of this research is that we cannot simply place our faith in the admonition to be fair and unbiased. People believe they are being fair and unbiased when they are reviewing and assimilating information. So that central instruction of the law to be fair is likely read by jurors as, "Keep doing what you're doing," and that means perpetuating confirmation bias.
The "consider the opposite" instruction would work differently. Remember, the language used in the study was, "Ask yourself at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue."
It is possible to imagine versions of that frame used in trial. In situations where you would expect the jury's initial leaning and confirmation bias to be running against you, you might invite jurors to consider the opposite:
- In evaluating this expert witness, ask yourself whether you would evaluate her in the same way if her conclusions supported the other side in this lawsuit.
- As you evaluate the evidence showing that there is causation, apply the same tests you would apply if the evidence showed no causation.
- In looking at this eyewitness identification and deciding whether it is credible, try to apply the same tests you would apply if the testimony had been that there was no identification.
- Ultimately, you will be deciding whose reasoning is persuasive and whose is not. When you make that decision, ask yourself, "Would I also find it credible if the same reasoning pointed to the opposite conclusion?"
Of course, that requires some perspective-taking and some higher-level thinking. The exact form of the request will depend on the case and the situation, and it is likely that not all jurors will be able to do it. But it is encouraging to learn that confirmation bias is not entirely self-sealing, and that there is at least one way to encourage people to set it aside.
Other Posts on Confirmation Bias:
- Reject Your Confirmation Bias (Or At Least Try To)
- Know Your Cognitive Biases
- Know (and Use) the Cognitive Biases of Mediation
Sawin, G. (1988). Consider the opposite. Et Cetera, 45, 190.
Image credit: www.cartoonstock.com, used under license