
Changing hearts and minds: On overcoming our own biases


Published on Psychology Today


Politically, the United States is more divided than ever. And in an election year, discussions are getting heated, potentially leading to some awkward moments on the playground, at the dinner table, or on social media. Many of us now struggle with how to talk to people whose views differ from our own, how to persuade other people to change their minds, and how to argue our side effectively in casual conversation.

 

Unfortunately, we’ve already become quite polarized, and both our brains and our social media habits ensure that this polarization only deepens over time. First, humans have a well-documented tendency called the “confirmation bias”: we seek out information that confirms our existing views and avoid information that disconfirms them (Wason, 1960). Countless studies have documented this bias. In one, adults were asked about their stance on capital punishment, and they rated information that confirmed their view on the topic as more convincing than information that challenged it (Lord et al., 1979). In other words, the way we naturally think about the world already primes us to look for more and more information that confirms our point of view.

 

With the advent of social media, our confirmation bias has only grown, fed by recommendation algorithms that show us information tailored to our tastes. When you are on Facebook or Twitter or Instagram and you spend time looking at something you like, the app records that engagement and shows you more of the same. So if you’re clicking on pages with content that aligns with a Democratic or a Republican perspective, you will just keep getting information that matches that viewpoint. On top of that, we tend to spend time with people who are like us and believe the same things we do. What you end up with is a flood of information, and a circle of people, giving you positive feedback on everything you already believe. This creates the illusion that everyone in the world thinks just like you do, and that people who feel differently are in the minority and, frankly, wrong. The reality is that our brains are biased to confirm what we already think about the world, and our online experiences are tailored to fit that worldview.
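To see how quickly that feedback loop compounds, here is a minimal toy simulation in Python. It is a sketch under stated assumptions, not any platform’s actual algorithm: the numbers (a 60/40 click preference, a 0.05 “learning” nudge) are invented for illustration, and the only mechanism is that the feed shows more of whatever gets clicked.

```python
import random

random.seed(42)  # reproducible toy example

def simulate_feed(rounds=200, lean=0.6, feed_bias=0.5):
    """Toy model of an engagement-driven feed (illustrative only).

    lean:      chance the user clicks a view-confirming item
               (they click challenging items with chance 1 - lean).
    feed_bias: chance the feed shows a view-confirming item; the feed
               "learns" by nudging this toward whatever gets clicked.
    """
    for _ in range(rounds):
        confirming = random.random() < feed_bias           # feed picks an item
        click_prob = lean if confirming else (1.0 - lean)  # mild user-side bias
        if random.random() < click_prob:                   # user engages
            feed_bias += 0.05 if confirming else -0.05     # feed adapts to clicks
            feed_bias = min(max(feed_bias, 0.0), 1.0)
    return feed_bias

# A feed that starts perfectly balanced (50/50) drifts toward one-sided,
# driven only by a modest 60/40 preference in what gets clicked.
print(f"Share of view-confirming items after 200 rounds: {simulate_feed():.2f}")
```

The point of the toy model is the dynamic, not the numbers: a small asymmetry in what we click, compounded by a feed that optimizes for clicks, steadily tilts the stream toward content we already agree with.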

 

These effects only grow when we feel passionately about our beliefs. Indeed, research shows that confirmation bias is stronger for topics that are especially emotional or heated, like many of our political arguments today (Kunda, 1990). In fact, one study showed that when people receive information that goes against their beliefs, their brains are less active than when they receive information that confirms those beliefs. In other words, when we see something we don’t like, our brains effectively tune it out (Rudorf et al., 2016; Grant, 2023).

 

So what do we do? How do we approach these political conversations when we all feel so strongly about our beliefs? As a scientist, my approach has always been to provide as much data as I can to make my point. My assumption was that people are logical and want to know the truth, so if I just bombard them with data, that should be enough to make my point and maybe even change their minds, right? Wrong. It turns out that throwing data at people isn’t effective at changing minds, let alone hearts. Mostly, people just find it annoying (sorry, everyone).

 

Luckily, research offers some guidance on best practices, and not surprisingly, the findings echo what our parents have been telling us for decades: you catch more flies with honey than with vinegar (or data). It turns out that instead of reciting why you are right and someone else is wrong in a heated conversation, one of the most effective strategies for changing someone’s mind is to find areas where you agree.

 

For example, in one study, researchers compared the strategies of average and highly successful negotiators and found that the skilled negotiators spent less time actually arguing and instead devoted about one-third of their time to acknowledging common ground with their counterpart. This technique disarms the other side and creates a friendlier atmosphere (see Rackham, 1999). On top of that, the skilled negotiators made fewer points overall, sticking to only their strongest ones. So (contrary to my own instincts) throwing facts at people does little to convince them of your view. Starting instead with what you have in common, or what goals you share, sets the stage for a friendlier interaction in which both parties may be more willing to give each other the benefit of the doubt, and perhaps even change their minds.

 

Another thing we can do is be open to other perspectives. We can start by acknowledging our own confirmation bias and making a conscious effort to override it (Lilienfeld et al., 2009). We can also try to take other people’s perspectives, and better yet, ask for them: ask people why they feel the way they do and what would change their minds. Be open to admitting that you could be wrong, and make it easy for others to admit their own mistakes. Finally, make new friends, ones who are different from you. Turn your bias into a DISconfirmation bias; in other words, when you have an opinion on something, don’t just read information that confirms it. Instead, make a concerted effort to find information that could prove you wrong. If you were right all along, the exercise will only make your arguments and your convictions stronger. If you were wrong, your mind will have been opened to new ideas. Either way, this kind of open-mindedness will give you a fresh perspective, and you might even make some unlikely new friends in the process.


Photo by Public Domain/pxhere

 

References

 

Grant, A. (2023). Think again: The power of knowing what you don't know. Penguin.

 

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.

 

Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare? Perspectives on Psychological Science, 4(4), 390–398.

 

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.

 

Rackham, N. (1999). The behavior of successful negotiators. In Negotiation: Readings, Exercises, and Cases. Burr Ridge, IL: Irwin.

 

Rudorf, S., Weber, B., & Kuhnen, C. M. (2016). Stock ownership and learning from financial information. Working paper.

 

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140.
