The Virus You're Not Thinking About: Misinformation
Updated: Jul 30
If you had asked me at the end of 2019 whether I expected to experience a global pandemic just a few months later, I would have thought you were crazy! Nowadays, though, it seems as if COVID-19 has taken over every aspect of my life: I’ve been home from my university since March, interning with U4I virtually, and rarely leave the house for fear of getting myself or others sick. To me, the scariest aspect of the virus is its ability to stay undercover, giving some people painful symptoms while leaving others asymptomatic. This makes it harder to address, because a silent carrier can still pass the virus to someone for whom it could be deadly.
The COVID-19 pandemic bears eerie similarities to what has been termed the misinformation pandemic: the growing volume of false information on the Internet, spreading as rapidly as the virus itself. Now that a majority of people have been stuck at home for at least a few months, the power of information has only grown. With daily routines disrupted and many people out of work, many are finding the time to read the news, follow social media pages, and stay informed about what’s going on around them. But how easy is it to know whether the statistics, videos, and other content you see on social media are true? To me, the speed at which misinformation spreads online is just as daunting as the virus itself.
How Does Misinformation Spread?
False information has little power and resilience if it doesn’t reach people who will believe it. For example, if I posted, “My cat developed a free vaccine for COVID-19,” perhaps a couple friends would see it, but they would recognize that I am not credible and have little evidence for my statement. While statements like these are easy to identify as lies, others are much more difficult to distinguish, especially when coming from those with power.
A study by Pew Research Center in 2017 estimated that about two-thirds of Americans get at least some of their news from social media. While it is beneficial for people to have multiple news outlets that bring diversity to the headlines, social media is far more relaxed about verifying the truth. That means far more information gets shared than is actually true, and it is harder to identify and monitor than the coverage from major news outlets.
Researchers at multiple universities describe this as an online misinformation ecosystem. The spread of information has been shown to follow the power law of social networks: a small number of accounts have disproportionately large followings, so a story seeded with just a few of them can reach an enormous audience. Psychology Today explains how sharing and promoting amplify misinformation, especially among people who hold the same views and are therefore likely to believe it and keep sharing. In this sense, misinformation spreads just as easily as COVID-19, complete with its own asymptomatic carriers: each like, share, and comment is not necessarily fact-checked, yet it can algorithmically promote the content onto other people’s feeds. The result is a constant cycle of shared information, whether or not that information is true.
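To make the power-law idea concrete, here is a toy simulation I sketched for illustration (the network, the follower distribution, and all the numbers are made up, not taken from the cited research). Follower counts are drawn from a heavy-tailed distribution, so a few "hub" accounts have huge audiences, and a rumor seeded with hubs exposes far more people than one seeded with ordinary accounts:

```python
import random

random.seed(42)

# Toy network: follower counts drawn from a power-law-like (Pareto)
# distribution, so a few hub accounts have very large followings.
N = 10_000
followers = [min(int(random.paretovariate(1.5)), N) for _ in range(N)]

def simulate_spread(seed_users, share_prob=0.02, rounds=5):
    """Each round, every new sharer exposes their followers; each
    exposed user reshares with probability share_prob."""
    shared = set(seed_users)
    frontier = list(seed_users)
    exposed_total = 0
    for _ in range(rounds):
        new_frontier = []
        for user in frontier:
            exposed = random.sample(range(N), followers[user])
            exposed_total += len(exposed)
            for f in exposed:
                if f not in shared and random.random() < share_prob:
                    shared.add(f)
                    new_frontier.append(f)
        frontier = new_frontier
    return len(shared), exposed_total

# Seed the same rumor with five ordinary accounts vs. five hubs:
ordinary = [u for u in range(N) if followers[u] < 10][:5]
hubs = sorted(range(N), key=lambda u: -followers[u])[:5]
print("ordinary seeds (sharers, exposures):", simulate_spread(ordinary))
print("hub seeds      (sharers, exposures):", simulate_spread(hubs))
```

The point of the sketch is the asymmetry: the rumor itself is identical in both runs, and only the position of the first sharers in the network changes how far it travels.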
Further, the algorithms used on platforms like Twitter and Facebook, where many people find their news, automatically personalize each feed to fit what the technology thinks you want to see. This method, known as the filter bubble (a term coined by Eli Pariser), can be helpful in tailoring information to each individual, but it can also isolate people from a holistic perspective by feeding them only the side they would prefer to read. Check out this TED Talk to learn more about filter bubbles!
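As a rough illustration of how a filter bubble forms, consider a deliberately naive recommender, entirely hypothetical and far simpler than the actual Twitter or Facebook algorithms: it simply ranks stories by how often the user has clicked that topic before, so a couple of early clicks narrow the whole feed to one topic.

```python
from collections import Counter

# Toy feed: each story has a single topic label (all made-up data).
stories = [{"id": i, "topic": t}
           for i, t in enumerate(["politics", "science", "sports"] * 20)]

def recommend(click_history, k=5):
    """Rank stories by how often the user clicked that topic before --
    the naive 'show them more of what they engaged with' heuristic."""
    topic_clicks = Counter(s["topic"] for s in click_history)
    ranked = sorted(stories, key=lambda s: -topic_clicks[s["topic"]])
    return ranked[:k]

# A user who clicked just two politics stories now gets an
# all-politics feed, even though two thirds of the pool is other topics:
history = [stories[0], stories[3]]  # both "politics"
feed = recommend(history)
print([s["topic"] for s in feed])
# → ['politics', 'politics', 'politics', 'politics', 'politics']
```

Real ranking systems are vastly more sophisticated, but the feedback loop is the same shape: engagement drives ranking, and ranking drives the next engagement.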
When it comes to information about the coronavirus, many of the ‘facts’ we see come from public opinion surveys and preprint studies that have not yet been peer reviewed. Between a limited understanding of false information, the energy it takes to fact-check every source, and the tendency to believe statements that align with one’s own beliefs, what people believe about COVID-19 is all over the place, and often partisan rather than consistent.
Who is Susceptible?
Traditionally, those most susceptible to believing misinformation on the Internet are the elderly, the young, and those who are less educated. Further, those at political extremes are more likely to believe false information if it supports their own agenda. This is called confirmation bias, where people tend to believe information, opinions, and stories that justify their own perspectives.
Researchers at Stanford University developed a model to understand diseases that can infect people more than once, like COVID-19. The same model can be applied to the misinformation pandemic: those who see and believe fake news once are likely to see and believe it again and again, making themselves ever more susceptible without recognizing the consequences. Check out this PBS video on why our brains love fake news to understand how easy it is to become susceptible!
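The reinfection idea can be sketched with a simple SIS ("susceptible-infected-susceptible") model, the textbook way to describe diseases you can catch more than once: unlike the classic SIR model, "recovered" people go straight back into the susceptible pool. The rates below are made-up numbers for illustration, not values from the Stanford study:

```python
def sis_step(S, I, beta, gamma, N):
    """One day of an SIS model: susceptible users start believing/sharing
    the rumor at rate beta; believers stop at rate gamma, but return to
    the susceptible pool instead of becoming immune."""
    new_infections = beta * S * I / N
    recoveries = gamma * I
    return S - new_infections + recoveries, I + new_infections - recoveries

N = 10_000                 # toy population
S, I = N - 10, 10          # start with 10 believers
beta, gamma = 0.4, 0.1     # assumed spread / recovery rates

for day in range(200):
    S, I = sis_step(S, I, beta, gamma, N)

# Because nobody becomes immune, the rumor never dies out when
# beta > gamma: it settles at an endemic level of roughly
# N * (1 - gamma / beta) active believers (7,500 here).
print(round(I))
```

The contrast with a one-time infection is the key point: with reinfection allowed, the rumor becomes endemic rather than burning itself out.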
Why is this Important?
I’m not an expert on viruses, but I do think there is insight to be drawn from the similarities between an actual pandemic and a pandemic of misinformation. First off, both require cooperation. People must recognize that a virus has killed thousands of people in the US alone, just as they must recognize that not everything they read on the Internet is fact-checked and true. Denying either is as naive as saying, “Well, I don’t know anyone who has died of COVID-19, so I don’t think it’s an issue.”
Once the issue is recognized, it can be addressed. Officials can shut down locations where the virus is likely to spread and require facial coverings to limit transmission. Similarly, once misinformation is acknowledged as a real phenomenon, we can make efforts to find and combat it. Once we learn to fear false information the way we fear the spread of a deadly virus, we can work to minimize the consequences.
It is crucial to remember that not everything you see or read on the Internet is true. Unfortunately, as technology progresses, making all false information disappear is impossible. On the bright side, educating users about the consequences and having large media companies actively monitor misinformation are steps in the right direction. These measures do pose the additional challenge of respecting the right to free speech, so the debate over how best to minimize misinformation on the Internet must continue. Most importantly, the conversation has begun, and it will not stop until change is made. We at U4I are passionate about identifying and overpowering misinformation on social networks and media to make accurate news and information accessible to everyone, and we hope you are too!
Allcott, Hunt, et al. “Trends in the Diffusion of Misinformation in Social Media.” Stanford Institute for Economic Policy Research, 2018, web.stanford.edu/~gentzkow/research/fake-news-trends.pdf.
Andrews, Edmund. “How Fake News Spreads like a Real Virus.” Stanford School of Engineering, 22 Oct. 2019, engineering.stanford.edu/magazine/article/how-fake-news-spreads-real-virus.
Shearer, Elisa, and Jeffrey Gottfried. “News Use Across Social Media Platforms 2017.” Pew Research Center’s Journalism Project, 7 Sept. 2017, www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/.
Miami Dade College. “Fake News (and How to Fight It): Filter Bubbles & Confirmation Bias.” LibGuides, 2020, libraryguides.mdc.edu/FakeNews/FilterBubbles.
Rogers, Kaleigh. “How Bad Is The COVID-19 Misinformation Epidemic?” FiveThirtyEight, 21 May 2020, fivethirtyeight.com/features/how-bad-is-the-covid-19-misinformation-epidemic/.
Starbird, Kate, et al. “Ecosystem or Echo-System? Exploring Content Sharing Across Alternative Media Domains.” University of Washington, 2018, faculty.washington.edu.
Wood, Mike. “How Does Misinformation Spread Online?” Psychology Today, Sussex Publishers, 6 Dec. 2018.
Written by: Annie Pollack