Despite ongoing efforts to combat it, misinformation about the COVID-19 pandemic remains widespread online, and a new study from the Reuters Institute for the Study of Journalism at the University of Oxford offers key insights into the nature, focus and sources of these falsehoods.
The study, published April 7, analyzed 225 pieces of COVID-19 misinformation (as determined by international fact-checking groups) that were posted between January and the end of March. Among its most notable findings is that most (59%) of the misinformation about COVID-19 consisted of content that was manipulated or selectively “reconfigured” in some way, such as photos presented out of context or medical “advice” that blended inaccurate and accurate information. This type of false content drove 87% of online interactions, the researchers found.
Falsehoods that were entirely fabricated made up 38% of the examples studied but generated only 12% of the social media interactions. The researchers also pointed out that, for all the recent attention on the emergence of deepfake technologies, none of the examples in the study used those methods and tools. Rather, they relied on “cheapfake” approaches: simple manipulations and tricks of context.
And while only 20% of the examples studied came from public figures, such as politicians and celebrities, the misinformation they posted or spread accounted for 69% of the overall engagement on social media.
Finally, the study found that fact-checkers are marshaling a strong response to coronavirus misinformation but that social media platforms’ enforcement of their stated content policies has been less robust. Even though English-language fact checks increased by 900% during the study period, a significant portion of the false claims studied remained live on social media despite having been debunked. For example, 59% of the false claims on Twitter were still online when the report was published, the researchers wrote, as were 27% of the YouTube examples and 24% of the Facebook examples.
Note: An April 6 piece published by BBC Future, “Why smart people believe coronavirus myths,” offers additional ways to understand the spread of COVID-19 misinformation, including the finding that people who rely on intuition (what “feels” right) to judge what to share online are more likely to share misinformation. Because images can make a claim “feel” more credible, this is one reason why false claims accompanied by images are shared at higher rates.
Resource: The examples studied by the Reuters Institute were drawn from a searchable archive of COVID-19 fact checks created by First Draft, a nonprofit journalism training and fact-checking organization.
Discuss: Do you consider the spread of medical misinformation online a serious problem? If so, how serious? Have you, or has anyone you know, ever been affected by medical misinformation? Do people have a right to share inaccurate and even dangerous “opinions” about health-related matters? How should social media platforms handle medical misinformation?
Idea: In groups, ask students to choose one of the major social media platforms and research and evaluate how it handles COVID-19 misinformation. For example, you might ask student groups to document public statements from the company about its approach to such misinformation, then map out its content reporting menus to see whether the stated policy is reflected in the options it gives users for flagging problematic content.
Viral rumor rundown
NO: The image in this tweet does not prove that “the media” is wrong about the severity of the COVID-19 pandemic. YES: This is footage of a training session at the University of Nebraska Medical Center’s Training, Simulation and Quarantine Center. YES: This footage was used in a “Fox & Friends” segment (go to 0:15 for the clip) on April 6. YES: This image continues to circulate, as do similar photos and video clips of medical manikins, as “evidence” of a conspiracy to stage or exaggerate the COVID-19 pandemic.
Note: Medical manikins are frequently used in simulations and for training, including in preparations for emergency response efforts.
Also note: For privacy and safety reasons, news outlets are typically unable to publish images of actual patients being treated.
Discuss: Are conspiracy theories like this dangerous? What can news organizations that use footage of training simulations with manikins do to prevent the clips from being misunderstood or used out of context to promote conspiracy theories?
NO: This photo of a tank in downtown Toronto was not taken this month, or even this year. YES: Canadian Armed Forces Operations posted it on Twitter on Oct. 1, 2016, when members of the Canadian military arrived at Yonge-Dundas Square for Nuit Blanche, an all-night arts event. YES: On April 5, Toronto-area residents were advised “to expect a larger number of [military] personnel and vehicles” on the roads in the next few days as units relocated to a base about 60 miles from the city to set up a COVID-19 response task force.
Idea: Share an archived version of this post with students and give them a sequence of digital verification challenges, including:
Determining whether the claim (that the photo was taken in 2020) is true. (It’s not.)
Finding the original tweet from the Canadian military (linked above).
If you want to make the challenge even harder, ask students to find different ways to prove that the image is not from this year. For example, they might use critical observation skills to note that one of the billboards in the background, though partially cut off, announces the opening of a Uniqlo store on Sept. 30. A quick lateral search for “Uniqlo store opening Toronto Sept. 30” should produce results confirming that the chain announced Sept. 30, 2016, as the opening day for a store in CF Toronto Eaton Centre. Or students might use the billboards in another way: If they can find another dated photo of Yonge-Dundas Square displaying the same billboards, they can demonstrate that this photo wasn’t taken in 2020.
Comparing a user-contributed photo on Google Maps that is time-stamped October 2016 (top) with the Facebook post falsely claiming that the tank photo was taken in 2020 (bottom) offers further proof that the photo is not a current one.
NO: Former San Francisco 49ers quarterback Colin Kaepernick did not sign a contract with the New York Jets. YES: Kaepernick opted out of his contract and became a free agent after the 2016 NFL season, when his decision to kneel during the national anthem to protest police brutality and social injustice caused a furor. YES: Kaepernick held a workout on a high school field in Riverdale, Georgia, in November to show NFL teams that his skills had not declined; representatives of the Jets attended. YES: A Twitter account with a history of fabricating sports “news” to troll influential people and news outlets made this false claim about Kaepernick and the Jets. YES: Influential people and news outlets believed it. YES: The hoax tweet attributed the “news” to Adam Schefter, a senior NFL insider at ESPN, using Schefter’s Twitter handle; neither Schefter nor the real SportsCenter account (@SportsCenter) reported this false claim.
Discuss: Is this kind of online hoax harmless fun, or is it more dangerous? Should Twitter suspend this parody account? Why or why not? What steps should the news outlets that fell for the hoax have taken to avoid being fooled? And now that the tweet has been proven false, what steps should those outlets take?
NO: Microwaving a fabric protective face mask in a plastic bag for several minutes will not sanitize it. YES: Doing this can melt plastic and burn the fabric — and can cause a fire if the mask has a metal wire. (Sewing pipe cleaners or twist ties, both of which contain metal, into the top and bottom edges of a homemade mask has been suggested as a way of helping to ensure a tighter seal around the nose and mouth.) YES: Many people said that they tried this — and posted pictures of the aftermath on Facebook:
NO: A pandemic does not occur exactly once every hundred years. NO: The outbreaks in this meme did not all begin in a year ending in “20”; the 1918 influenza pandemic, for example, began in 1918, not 1920. NO: The plague outbreak in France that began in 1720 wasn’t a pandemic.
Note: This false claim has appeared in several different graphics since the World Health Organization declared COVID-19 a pandemic on March 11.
Idea: Use this example to help students learn how selectively using facts (also known as “cherry-picking”) can, at first glance, seem compelling.
Another idea: Challenge students to (politely and respectfully) correct this false claim where they find it online.
Three to teach
Online conspiracy theories pushing false connections between 5G technology and the COVID-19 pandemic continue to gain momentum rapidly. Social media accounts and groups dedicated to advancing these theories have accumulated hundreds of thousands of new followers, including a number of celebrities, in recent weeks. In the United Kingdom, where these theories have taken particularly strong hold, there have been more than 30 acts of arson and vandalism against cell towers and roughly 80 incidents in which technicians have been harassed while working.
The theories range across a spectrum of false beliefs, including provably untrue claims that 5G radio waves cause COVID-19 or suppress the immune system, and the even more outlandish notion that the entire pandemic is a hoax perpetrated by governments and media outlets around the world to facilitate the installation of 5G networks. One factor behind the recent rise of such theories was an interview published on Jan. 22 by the Belgian newspaper Het Laatste Nieuws, in which a doctor made a number of highly speculative and unfounded claims about 5G technology. (The article was taken down “within a few hours,” the publication’s editor said, but not before the doctor’s claims had been disseminated widely.) There is also evidence that the recent surge in 5G conspiracies is being intensified by coordinated disinformation campaigns online — some of which appear to be state-sponsored, researchers say.
Note: A New York Times report last May highlighted the role of RT America, the U.S.-focused arm of a Russian state-run television outlet, in kindling unwarranted public fears about 5G in other countries, possibly as part of a Kremlin strategy.
Also note: Conspiracy theories often appeal to people seeking to make sense of complex, alarming events. Such “explanations,” however unfounded, offer comfort by supplying a scapegoat, and they also (falsely) suggest solutions, which adds to their appeal.
Discuss: Why do conspiracy theories appeal to people? What other technological advancements have spawned conspiracy theories? Should social media companies delete posts that encourage violence against 5G towers? Should they ban all speech that attempts to connect 5G with the pandemic? Why or why not?
Idea: Use this example as an opportunity to teach students about how coordinated disinformation campaigns work, and specifically how state-run media — such as RT America — can use speculation and story selection to strategically advance state interests.
A network of dozens of Facebook pages created to exploit interests and sentiments considered typically American, such as “classic cars” and “support for law enforcement,” was actually operated by residents of or émigrés from Kosovo, according to an investigation by the fact-checking organization Snopes. Altogether, the pages had more than 2 million followers, most of them in the United States. Facebook removed some of the pages as Snopes was investigating, and others were taken down by their administrators; the platform’s automated systems had detected and removed some even before Snopes contacted the company, and Facebook removed still others for violating its rules after Snopes turned over what it had discovered.
The network also capitalized on COVID-19 fears, the investigation found, by promoting fraudulent giveaways for groceries, mobile homes and “tiny houses.” Followers could “enter the contest” by replying “Me” to the posts and then “liking” and sharing them. Instead of awarding the prize, though, administrators would post that it could not be collected for some reason, allowing them to reuse it in a new “contest” promoted the same way.
“It’s unclear what the long-term aims are of this coordinated inauthentic behavior,” Dan MacGuill and Jordan Liles wrote in their report, published on April 10. “But Snopes has found evidence of financial motives in some cases, and in others, identified patterns of behavior that bear worrying similarities to previously revealed misinformation campaigns,” such as the “notorious Russian efforts during the 2016 presidential election.”
Note: This method of explicitly asking for page “likes,” shares and other forms of engagement on social media is called “like-farming.”
Discuss: What are some signs that giveaways promoted on social media (like the ones detailed in the Snopes investigation) are inauthentic? What are some steps that you can take to confirm a claim’s authenticity before “liking,” sharing or further engaging with it? Why do you think the administrators of these groups (who live outside the United States) targeted Americans? Why do you think they chose the page themes they did?
Idea: Use this report as an example of the importance of Facebook’s “page transparency” features, which allow users to explore a page’s name history, the locations of the people who manage it and other details.
As news organizations have continued to lay off, furlough or cut the pay of their employees during the COVID-19 pandemic (28,000 employees and counting, The New York Times reported last week), fundraisers and other supportive efforts are emerging to assist affected journalists and newsrooms. For example, Microloans for Journalists, a program created by journalists, connects journalists in the United States who need financial assistance with other U.S. journalists who are willing to lend them $500, interest-free. The Philadelphia COVID-19 Community Information Fund is providing more than $2.5 million in grants to several local news organizations to support their coverage of the pandemic.
On April 7, the News Media Alliance, a trade association for news publishers, noted that the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency had refined its language about “essential” businesses to include “publishing news.” The next day, 19 Senate Democrats sent a letter (PDF) to the chamber’s Republican and Democratic leaders and the chairman and the vice chairman of the Senate Appropriations Committee, requesting that any additional COVID-19 stimulus package include financial support for local news outlets, which have been hit particularly hard in this crisis, even as their work becomes more important.
Discuss: What role do news outlets play during a public health crisis? What are some of the ways that you can support local journalism in your community? Are you seeking out more local news coverage now than you were before the pandemic? What are some differences between national news coverage of the pandemic and the news provided by your local outlets?
Idea: Ask students to track local news coverage of the pandemic and select what they believe is the most valuable piece of coronavirus-related local news. Then have students share their selections in small groups and agree on one story to share with the entire class. Finally, have the class vote on the best local story about the pandemic, then email or message the journalist who reported it to see if they would be willing to join the class for a short videoconference about their reporting.