Lie + Lie + Lie = Truth
With confirmation bias and the familiarity effect, truth isn't what it used to be
The Familiarity Effect and Social Media
In our digital age, social media reigns supreme as the central pillar of how we communicate, learn the “news”, consume information, and form opinions. Yet the very platforms that connect us also exploit us. The familiarity effect - that repetitive wearing down of our defenses and our very sense of skepticism - is potentially driving society toward a polarized and partisan "new dark age."
But it isn’t like the old days, when something became familiar almost by accident: you heard it from a neighbor, then a coworker, and then someone at church. No, these days we don’t just bump into familiar news from people we know. It is fed to us. It is driven and delivered to us and our group of familiars, leading to polarization and fragmentation.
(Photo by Greg Bulla on Unsplash)
Understanding the Familiarity Effect
The familiarity effect, also known as the mere-exposure effect, is a psychological phenomenon in which repeated exposure to information increases its perceived truth. On social media, this effect plays a significant role in how information is shared and accepted. The algorithms designed to keep users engaged prioritize content from friends and family, amplifying the familiarity effect and inadvertently aiding the spread of misinformation (PBS, 2023).
The Role of Algorithms
Social media platforms use algorithms to curate content that aligns with users' interests. That makes sense. If I load up on fishing content - a lot of it - then I keep seeing fishing content.
But it also has a weird effect. Because the algorithm thinks I like fishing, it guesses I might like hunting. If I like hunting, then I might like guns. If I like guns, then I might like Second Amendment advocacy groups. And on it goes until I am getting content from militia movements and Christian nationalists.
This personalization often results in echo chambers, where users are primarily exposed to information that reinforces preexisting beliefs and are steered ever deeper into like-minded communities.
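The fishing-to-militia drift can be sketched as a toy engagement ranker. This is an illustrative model only, not any platform's actual algorithm: the post catalog, topic tags, and scoring rule below are all invented for the example. Each recommended post shares a topic with the last one clicked, and each click feeds back into the interest profile, so the feed walks from fishing toward adjacent topics one overlap at a time.

```python
from collections import Counter

# Hypothetical mini-catalog; the posts and topic tags are illustrative only.
POSTS = [
    {"id": 1, "topics": {"fishing"}},
    {"id": 2, "topics": {"fishing", "hunting"}},
    {"id": 3, "topics": {"hunting", "guns"}},
    {"id": 4, "topics": {"guns", "advocacy"}},
    {"id": 5, "topics": {"cooking"}},
]

def recommend(history, posts, seen):
    """Pick the unseen post whose topics best match past engagement."""
    candidates = [p for p in posts if p["id"] not in seen]
    return max(candidates, key=lambda p: sum(history[t] for t in p["topics"]))

def simulate(start_topic, rounds=4):
    """The user clicks whatever is recommended; each click feeds the ranker."""
    history = Counter({start_topic: 1})
    seen = []
    for _ in range(rounds):
        post = recommend(history, POSTS, seen)
        seen.append(post["id"])
        history.update(post["topics"])  # engagement compounds into the profile
    return seen, history

seen, history = simulate("fishing")
print(seen)     # [1, 2, 3, 4]: each hop shares a topic with the last click
print(history)  # the profile drifts from fishing toward adjacent topics
```

Real recommender systems are vastly more complex, but the feedback loop (score by past engagement, engage, rescore) is the mechanism at work: the user who started with fishing never gets the cooking post, only the chain of adjacent interests.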
Maria Ressa, a Nobel Prize-winning journalist, highlighted this concern by pointing to a study from the Massachusetts Institute of Technology (MIT). The study found that false news stories are 70% more likely to be retweeted than true ones and reach 1,500 people six times faster (MIT Sloan, 2018). What the platforms are reinforcing is not truth or facts, but connections with a community that will ensure a longer time interacting on the platform. And since lies are more striking, more salient, they bubble to the top.
The speed and reach of misinformation are further fueled by human tendencies. According to MIT researchers, false news often embodies novelty and emotional appeal, making it more likely to be shared than mundane truths (MIT News, 2018). This interplay between human psychology and algorithmic design accelerates the spread of misinformation.
Think about that finding for a moment. The inverse way of saying the same thing is that truth is not intriguing. Truth, or the linking together of truths into a meaningful concept, is not as gripping or engaging. This psychological phenomenon says something crucial about us as contemporary human beings.
A New Dark Age: Partisan Polarization becomes Artisan Polarization
This manipulation of the familiarity effect deeply damages social cohesion. Artisan polarization - when the partisan divide is actually an artisan design - could be defined as the deliberate crafting of divisiveness and extreme content. Like cattle chutes driving livestock into different pens. It raises the question of who is being served when divisions in society are exacerbated by selective exposure to information that aligns with personal biases. Social media platforms, designed to maximize engagement, prioritize divisive and polarizing content because it generates higher user interaction. Whatever social cohesion remains - the cohering part - is itself a design, not the natural interaction and attraction of people to people that you find in a neighborhood coffee shop or at the pub on trivia night.
Partisan polarization has far-reaching implications. It undermines democratic institutions, erodes trust in credible information sources, and fosters an environment where misinformation thrives unchecked. The MIT study highlights that misinformation is particularly damaging in the political sphere, where it deepens partisan divides and diminishes the possibility of constructive dialogue (MIT Sloan, 2018).
The Psychological Mechanisms at Play
Human psychology amplifies the impact of the familiarity effect on social media. People are more likely to accept and share information that aligns with their preexisting beliefs, a phenomenon known as confirmation bias. Additionally, the emotional nature of false news—often laced with anger, fear, or surprise—makes it more engaging and thus more likely to be shared (PBS, 2023).
As Maria Ressa observes, "When a lie spreads faster than a fact, when you say a lie a million times, and it becomes a fact, people cannot tell fact from fiction" (PBS, 2023). This erosion of the ability to distinguish truth from falsehood creates fertile ground for partisan polarization.
Mitigating the Impact of Misinformation
Addressing the manipulation of the familiarity effect requires a multifaceted approach. Social media platforms are not likely to take greater responsibility for the content they promote. Stricter policies to identify and reduce the spread of misinformation are unlikely to be implemented. Transparency in algorithmic processes, which could help users understand how content is curated, is also highly unlikely.
So, education becomes the most critical component. Media literacy programs can empower individuals to critically evaluate the information they encounter online, recognizing the influence of algorithms and their own cognitive biases.
But seriously, if the majority of the population in industrialized societies gets its information from social media, that critical education requires a different platform for dissemination.
(Photo by Priscilla Du Preez 🇨🇦 on Unsplash)
Conversations still work. Face to face, embodied, not virtual. Actually doing something as radical as sitting down and talking about everything and nothing in particular builds a basic level of trust. Trust: that simple thing which has been eroded by the social media silos we have slid into.
References
MIT Sloan. (2018, March 7). Study: False news spreads faster than the truth. MIT Sloan School of Management. Retrieved from https://mitsloan.mit.edu
MIT News. (2018, March 7). Study: On Twitter, false news travels faster than true stories. MIT News. Retrieved from https://news.mit.edu
PBS. (2023, October 6). Nobel laureate Maria Ressa on defending truth and the danger of A.I. in the wrong hands. PBS. Retrieved from https://pbs.org