Category Archives: Bias

December 14, 2017

Account for Social Facts

By Dr. Ken Broda-Bahm:

For those defending the reality of human-induced climate change, it is a familiar data point: a consensus of 97 percent of climate scientists supports the conclusion that our species is contributing to global warming and other effects on the climate. Climate change skeptics, of course, have their own consensus: a “Petition Project” with some 31,000 signers who say there is “no convincing evidence.” The latter has been debunked on the grounds that signers of the document don’t have to be climate scientists, or necessarily scientists at all. But one might be understandably cynical about whether either side’s consensus figure will be convincing to the other. After all, attitudes like these tend to have a documented self-sealing nature: information that threatens one’s worldview tends to create a motivation to debunk that information, and that exercise of motivated debunking just makes the original belief even stronger.

Based on recent research, however, there might be an exception to this self-sealing belief system. According to findings from researchers at George Mason and Yale Universities (van der Linden, Leiserowitz, & Maibach, 2017), when presented with information about a consensus, study respondents are more likely to shift their own views in the direction of the perceived norm. Not all of them will do that, of course, but a substantial number, particularly among conservatives, do seem to be influenced by the consensus. This finding, described in a recent ScienceDaily release, points to a rare bright spot on our current ‘Alt-Fact’ horizon, and it carries some implications for the legal persuader who will sometimes need to win over the skeptical judge or juror.

Continue reading

November 6, 2017

Expect Empathy to be Driven by Similarity

By Dr. Ken Broda-Bahm:

On October 1st, a gunman opened fire on a crowd of concert attendees, injuring nearly 500 and killing 58. In response, the President offered condemnation and condolences, but said the event should not be politicized and proposed no policy changes. Thirty days later, a man drove a rented truck through a crowded bike and pedestrian area, injuring a dozen and killing eight. In response, the President used executive power to further increase vetting of foreign immigrants, called for an end to diversity-based immigration, and intensified his emphasis on a Southern border wall. Then yesterday, a gunman killed at least 26 people in a Texas church, and the President returned to the more general message: Americans should “stand strong,” but no policy changes are needed. Why the difference? One explanation is that, in the first and third instances, the perpetrator was native-born and white, but in the second instance, the perpetrator was an immigrant from Uzbekistan.

No one, including the President, is going to consciously decide, “Well, the white shooters are more similar to me than the New York driver, therefore, despite the greater carnage, I will have a less-intense reaction to those cases.” However, there is good social science to support the idea that this is exactly what is going on, at least in part. When it comes to evaluating both those who have done wrong and those who are the victims of that wrongdoing, our reactions will be strongly influenced by our empathy, which is in turn strongly determined by similarity. In other words, we are less punitive when the perpetrator is like us, and we are more punitive when the victim is unlike us. In this post, I’ll share some recent research on this tendency and discuss the implications for legal persuasion.

Continue reading

October 19, 2017

Account for the Contagion of Bias

By Dr. Ken Broda-Bahm:

Every persuader, and legal persuaders in particular, understands that bias is both pervasive and powerful. The idea that potential jurors will carry attitudes and experiences that could influence their decision is the norm, not the exception. As a trial attorney, your goal is to eliminate that bias. In practice, however, it is more likely that you’ll be minimizing it: there aren’t enough strikes in the world. But is it enough if the biased jurors on your panel are numerically outweighed and outvoted by the other, relatively unbiased jurors? Will deliberation take care of the problem when the biased jurors discover that most on the panel have different views and experiences? According to some recent research, the answer is “No.”

Using the example of bias based on pretrial publicity, researchers from the University of South Florida (Ruva & Guenther, 2017) found that such bias is more likely to be contagious than contained. The article, entitled “Keep your bias to yourself: How deliberating with differently biased others affects mock-jurors’ guilt decisions, perceptions of the defendant, memories, and evidence interpretation,” looks at results from 648 mock jurors, half of whom were exposed to various forms of pretrial publicity on a criminal case. The researchers then looked at the influence of those exposed jurors when they were placed in groups with non-exposed jurors. The results suggest that, “during deliberations, pretrial publicity bias can spread to jurors not previously exposed to pretrial publicity.” These results underscore the priority that litigators already place on rooting out bias, and they remind us that we cannot count on bias being easily checked. The findings also suggest devoting further thought to ways of discovering bias without furthering its spread.

Continue reading

October 10, 2017

Know Your Cognitive Biases, Part 2

By Dr. Ken Broda-Bahm:

The law expects legal decision making to work like a smooth and well-oiled machine. But as any experienced legal persuader knows, there is sand in those gears. That sand takes the form of cognitive biases: mental shortcuts or heuristics. They’re not necessarily mistakes, but they are factors that make legal decision making by a judge or jury less linear and logical than the legal model might presume. I wrote last year on the advantages of knowing your cognitive biases, based on a newly published list of such biases. To advance that taxonomy, Jeff Desjardins of the media website Visual Capitalist has more recently developed a handy image: “Every Single Cognitive Bias in One Infographic.” The list is detailed, including fully 188 known and documented biases. The infographic, together with a brief explanation, is available at the link above, and a high-resolution version is also available.

The Visual Capitalist illustration follows the same approach as the “Cognitive Bias Cheat Sheet” I wrote about earlier, but it adds a couple of levels of organization that help to show the relationships between the biases, and, because you aren’t really going to memorize 188 biases, it emphasizes the broader concepts involved. The outer ring lists the main principles: selective memory, too much information, not enough meaning, and the need to act fast. The next ring inward lists 20 general tendencies, like “We edit and reinforce some memories after the fact” and “We simplify probabilities and numbers to make them easier to talk about,” each covering around 5 to 20 cognitive biases. In this post, I’ll walk through those 20 forms of cognitive bias to briefly highlight the roles they play in legal persuasion.

Continue reading

September 18, 2017

Look for an Increased Perception that Racism Is a Major Problem

By Dr. Ken Broda-Bahm:

There’s a quote most often associated with Martin Luther King: “The arc of the moral universe is long, but it bends toward justice.” When applied to historical progress, these words generally connote the comforting message that “Things get better.” Our recent history, however, seems dedicated to showing that if there’s an arc, it isn’t necessarily a smooth one, and sometimes that bend toward justice takes some jagged turns. For example, the completed administration of America’s first African-American president did not soothe the country’s troubled experience with race. Rather, it inflamed it. Perceptions of racism as a major problem grew throughout the Obama Presidency, and then exploded afterward.

According to a Pew Research Center survey of 1,893 adults conducted last month, the percentage of Americans who say racism is a “big problem” has increased by eight points in just the last two years. That growth has come almost entirely among Democrats, widening an already-large gap between the parties. The gap is even greater between the races, with about half of whites and eight in ten blacks agreeing that racism is a big problem. Across the population as a whole, 58 percent agree, up from 50 percent in 2015 and 41 percent in 1995. These social perceptions can drive reactions in litigation, not just in racial discrimination cases, but also in cases involving diverse parties and witnesses. In this post, I will take a quick look at the survey results and discuss a few implications.

Continue reading

August 17, 2017

Sunshine: Support Open Records as One Part of the Answer to Discriminatory Jury Selection

By Dr. Ken Broda-Bahm:

America is not yet post-racial, and the Nazis marching this week in Charlottesville, Virginia, should be a reminder of that. Continuing tensions over race play out in courtrooms as well. The as-yet-unresolved issues of racial bias in jury selection provide one example. Race-based removals affect the criminal sphere more than the civil sphere, and they matter more in some cases than in others. Still, the continued presence of strikes that appear to be based on race has led to some calls to eliminate the peremptory challenge altogether. For example, in a case earlier this year before the Washington State Supreme Court (City of Seattle v. Erickson), two judges joined in calling for “the complete abolishment of peremptory challenges” as the only sure-fire way to eliminate the constitutional problem of jurors struck due to their race.

But perhaps it isn’t necessary to throw the peremptories baby out with the racial-strike bathwater. A recent paper proposes an alternative that has worked in other contexts: open records that, the authors hope, will bring a little more disinfecting sunshine to the exercise of strikes. In the article (Wright, Chavis, & Parks, 2017), Ronald Wright, Kami Chavis, and Gregory Scott Parks of Wake Forest University School of Law write about their newly formed “Jury Sunshine Project,” which started at the state level, assembling records from more than 100 North Carolina courthouses on 1,306 felony trials involving approximately 30,000 removed jurors in 2011. Prosecutors in the state, they found, removed nonwhite jurors about twice as often as white jurors, and defense attorneys excluded white jurors more than twice as often as nonwhite jurors. Removal rates also varied widely by city, with prosecutors in Charlotte, Winston-Salem, and Durham accepting significantly fewer nonwhite jurors than prosecutors in the rest of the state. In this post, I will write a bit on the project and what it potentially offers as a way to retain peremptory strikes while addressing their abuses.

Continue reading

August 10, 2017

Fight Confirmation Bias: Consider the Opposite

By Dr. Ken Broda-Bahm:

For a practical persuader, one of the most disturbing features of some biases is their self-sealing nature. When an audience suffers from confirmation bias, for example, you would hope that they could be taught out of it. That is, if they’re primed to think that all big corporations are evil, then you might think the prescription would be a few doses of “No, they’re not.” But that’s the thing about confirmation bias: your audience will notice, understand, trust, and remember all the examples that support their pre-existing belief, and they’ll be prone to dismiss the rest. There is a spotlight on evidence that supports what we already believe: it’s treated as the best evidence, and it’s the evidence that is most mentally accessible. The bias is powerful, and it protects and perpetuates itself. So does that mean any effort to persuade an audience away from the bias is doomed? Not according to psychologist Tom Stafford, writing in an article on BBC.com earlier this year. The article is entitled “How to Get People to Overcome Their Bias,” and it carries the subtitle, “Asking them to be fair, impartial and unbiased is not enough.”

Dr. Stafford reports on a classic psychology experiment conducted at Stanford University (Lord, Ross, & Lepper, 1979). Charles Lord and his colleagues recruited people with strong attitudes either supporting or opposing the death penalty, then presented them with evidence that either confirmed or refuted the effectiveness of the death penalty. According to Stafford, “Confirmatory evidence strengthened people’s views, as you’d expect, but so did disconfirmatory evidence.” That is called biased assimilation. That part was not new, but Lord’s team then reran the study with two different instructions. One was a motivation-focused instruction asking participants to be “as objective and unbiased as possible” and to consider themselves “as a judge or juror asked to weigh all of the evidence in a fair and impartial manner.” The other was cognition-focused: “Ask yourself at each step whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue.”

Continue reading

August 7, 2017

Prepare for Your Post-Fact Jury: Top Posts

By Dr. Ken Broda-Bahm:

The news cycle these days seems dedicated to keeping one question front and center: “Are we as a society losing our grip on facts?” And if we are, I’d add a complementary question: “What does this say to legal persuaders?” An article appearing in Law360 at the end of last year, written by trial consultant Ross Laguzza, cites data from his own company to support the view that jurors may be on their way to becoming more fact-resistant. For example, 54 percent say “Beliefs guide my life,” as opposed to 46 percent who say “Facts guide my life,” and fully 58 percent agree with the statement, “If you feel really strongly about something you don’t need facts to prove you are right,” with two-thirds of those agreeing saying they “strongly” agree. This suggests the possibility of a declining reliance on external sources of validation. In deciding whether a drug is safe or unsafe, for instance, 60 percent would go with the opinions of patients rather than the opinions of doctors and scientists.

Continue reading

August 3, 2017

I Could Be Wrong: Cultivate Intellectual Humility

by Dr. Ken Broda-Bahm:

Among the readers of this blog, there are a few people who write to me and let me know what they think about various posts. Sometimes it is to applaud a post, or to share an example where they’ve faced something similar. And sometimes, it is to take issue with what I’ve written. I appreciate that. It’s actually one of the benefits of blogging: The chance to interact over something substantive, and the chance to sometimes learn that I’m wrong. And I try to be open to the possibility. I believe what I write, and that’s why I write it, but I like to see it all as part of a dialogue, and that dialogue includes being open to the possibility of being wrong. So, as is sometimes pointed out to me, I could be wrong, I could be off base, I could be showing my biases in a hundred different ways.

The attitude I’m working on is called “intellectual humility,” and being aware that you could be wrong turns out to be an important personality trait. According to a recent study (Leary et al., 2017), those high in intellectual humility are better thinkers: better able to assess evidence and more likely to stick to their principles once those principles are established. The research article, discussed in a recent Psyblog post, involves four studies built around a new survey called the “Intellectual Humility (IH) Scale.” The trait is related to openness, curiosity, tolerance of ambiguity, and low dogmatism. Based on the experimental results, people higher in intellectual humility are more likely to be nonjudgmental, better able to evaluate evidence, and less likely to flip-flop on issues. Those high in intellectual humility are also more attuned to the strength of persuasive arguments, making the personality dimension similar to another factor I’ve written about: rhetorical sensitivity, or the awareness that there are multiple ways to fulfill a particular communication goal. Humility also helps to facilitate better interaction and communication. “Not being afraid of being wrong – that’s a value,” says the study’s lead author, Mark Leary, “and I think it is a value we could promote. I think if everyone was a bit more intellectually humble we’d all get along better, we’d be less frustrated with each other.” Reading about this research got me thinking about the roles intellectual humility might play in different contexts, so this post will cover a few.

Continue reading

July 24, 2017

Expect Bias Statements to be Unreliable and Often Overcorrected

By Dr. Ken Broda-Bahm:

Jurors and judges sit in court and evaluate credibility. They continuously assess who is telling the truth and who isn’t. But what bias enters into those determinations? Lie detection itself is a notoriously uncertain ability: confidence is often high, but actual accuracy tends to hover around the coin-flip level. Independent of accuracy, however, our beliefs about lie detection can tell us something about bias. Based on some recent research, they tell us something about racial bias and, more specifically, about the bias we bring to the task of deciding whether witnesses of a different race are telling the truth.

The research (Lloyd et al., 2017) appears in the journal Psychological Science and is covered in a ScienceDaily release. The article, entitled “Black and White Lies: Race-Based Biases in Deception Judgments,” reports on a series of experiments involving 605 research participants. The participants watched videos of Black and White individuals, some telling the truth and some lying. As they watched, two boxes appeared on the screen, “Truth” and “Lie,” and participants simply made a judgment and clicked the appropriate box. In some versions of the study, the monitors were equipped with eye-tracking technology so that the researchers could tell which box participants focused on first before choosing one to click. After watching the videos, participants completed a survey on their attitudes toward fairness and prejudice, rating their level of agreement or disagreement with statements like, “It is important to my self-concept to be nonprejudiced toward Black people.” The results carry two important implications for lawyers trying to identify or adapt to biases in the courtroom: bias statements aren’t necessarily reliable, and they can be prone to overcorrection.

Continue reading
