Yesterday Twitter announced a new feature, called Birdwatch, that is designed to combat misinformation. Birdwatch allows the Twitter community to identify dangerous, harmful, or misleading information and even offer factual content to counter it. Social media has created great value in many respects, but one of its most harmful side effects has been the ease of spreading misinformation. Unfortunately, behavioral psychology has a great deal to tell us about why controlling misinformation will be a near-impossible task.
Prior to social media, if someone wanted a voice online or to build an audience, they started a blog. They relied on Google or other destination aggregators for people to find them. Social media turned that concept on its head and gave birth to platforms where anyone can share anything and very quickly go viral. This means important ideas and people who never had a voice now do, and that is a huge benefit to society. But the same is true for bad ideas and bad information, which is why most social media platforms are working hard to control this problem. The challenge is humans and their relatively predictable nature. Two behavioral psychology concepts are important to understand when we think about how information spreads, and how humans process it and allow it to form their beliefs and opinions.
The first is a concept most may be familiar with: confirmation bias. Confirmation bias is generally defined as the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one’s prior beliefs or values. This concept is why information, of any kind, spreads in the first place. The Internet is the master of fueling confirmation bias, and social media adds gas to that fire. Most social media platforms understand this dynamic and, in some cases, capitalize on it for engagement and economic benefit. The problem is that once the snowball starts rolling downhill, there is no stopping it.
The lesser-known psychological dynamic at play is the backfire effect, also often referred to as belief perseverance. It is defined as maintaining a belief despite new information that firmly contradicts it. Such beliefs may even be strengthened when others attempt to present evidence debunking them.
David McRaney dedicates a chapter to the backfire effect in one of my favorite behavioral psychology books, You Are Now Less Dumb. Describing studies in which researchers showed subjects misinformation and then showed them the corrections, he found the backfire effect in full swing. His commentary:
Once something is added to your collection of beliefs, you protect it from harm. You do this instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you and tries to dilute your misconceptions, it backfires and strengthens those misconceptions.
– McRaney, David. You Are Now Less Dumb (p. 145). Penguin Publishing Group.
Anyone who has tried to argue with a friend or family member who has fallen into the trap of believing misinformation fully understands how hard it is to convince them of the truth using reasoned arguments and facts. Essentially, humans do not want to believe they are wrong, and once a belief takes hold, it is exceptionally difficult to get them to change their minds. As both of these psychology concepts show, people actively seek out information that confirms their beliefs and double down on those beliefs despite mountains of evidence to the contrary.
This is the ultimate challenge social media is up against. Twitter’s efforts, while necessary, run headlong into these dynamics: once an idea gets out there and fits people’s confirmation bias, no amount of corrections, expert notes, or facts will help, and in many cases they will make it worse.
This opens up the larger conversation about how to keep such information from getting out in the first place, which leads down roads of censorship or limiting speech. I think we must always err on the side of protecting those rights.
There is honestly no great solution to the online misinformation problem. However, I would argue this moment in time should serve as a lesson that we need to build more critical thinking into our education system. Namely, the learning process should include more of the philosophy of the scientific method, which means aggressively challenging any hypothesis or conclusion and seeking out as much information to the contrary as possible. I came across a quote, which I unfortunately do not know whom to attribute to, that speaks to the scientific process and how changing one’s mind is a critical part of science: “The ability to change one’s mind is not a sign of the weakness of their conviction but of the strength of their process.”
It has to start with education, and early in the education process. Once people get set in their ways and lose the ability to think critically, it is very hard to learn later in life. While the efforts Twitter, Facebook, and others will take to combat the viral spread of misinformation are needed, they will also largely fail. Battling this begins with educating the next generation of adults, or we will be stuck in this vicious cycle.