GSAN: Beware ‘pink slime’ | NyQuil chicken overblown? | ‘Killer Underwear Invasion’

 

Learn news literacy this week

 

Top picks

Partisan local news outlets are filling the gap left by the steep decline of local media in the United States, including the disappearance of more than 2,000 local newspapers since 2004. “Pink slime” publications churn out algorithmically generated stories alongside local news, including some fraudulent stories, and often lack transparency about who funds them. Meanwhile, openly partisan newsrooms may include some reporting from on-the-ground writers, but their work is nonetheless guided by political goals.
There aren’t enough fact-checkers to keep up with social media content in English, and there are even fewer resources to vet Spanish-language posts. Waves of political and health misinformation reach Latinos across the United States on social media and through messaging apps like Telegram and WhatsApp. Researchers found that the spread of misinformation among Latinos increased in 2020, much of it focused on the COVID-19 pandemic and the contentious presidential election. Misinformation targeting Latinos also appears in political ads invoking the specter of “socialism” in connection with liberal policies.
What makes some rumors go viral online? The Election Integrity Partnership compiled 10 factors that shape the potential for a falsehood to spread, including emotional appeal, novelty and repetition. Social networks, online algorithms and automated accounts also play a role in amplifying misinformation.
 

Viral rumor rundown

CGI video of car evading police under truck goes viral … again

A tweet reads “The way to hide your luxurious car temporarily” and features a video supposedly showing a car driving under a truck to evade police. The News Literacy Project has added a label that says, “CGI VIDEO.”

NO: This is not a genuine video of a car escaping police by driving underneath a semitrailer. YES: This video features computer-generated imagery (CGI).

NewsLit takeaway: This video of a sports car evading police by sliding under a semitrailer appears to show a real-life version of a fantasy that has played out in blockbuster movies for decades, but the viral clip is even less realistic than those movie stunts. The YouTube user who created it explained in 2019 that computer animation software was used to add a sports car, a truck and a police vehicle to genuine footage of a highway.

While this video was debunked in 2019, it continues to circulate online and generate millions of views for social media accounts that share it without credit or disclaimers. This offers two lessons for better social media habits. First, take a moment to critically analyze content before you share it: Was it posted by a credible source? Are there any credible news stories about it? Second, if something seems too good to be true, it probably is. You can debunk a video like this on your own by running a reverse image search on its key frames.

 

Audio altered to include ‘Let’s go, Brandon’ chant in Biden video

A tweet reads, “As Joe rides through the streets of London, crowds chant ‘LET’S GO BRANDON’” and shows a video of President Joe Biden in a car. The News Literacy Project has added a label that says, “ALTERED AUDIO.”

NO: The crowd in this video was not chanting “Let’s go, Brandon” at Biden during his September 2022 trip to London. YES: The audio in this video was altered to add the chant.

NewsLit takeaway: Doctored photos often contain visual clues that they have been altered. Manipulated audio, however, can be far more difficult to detect. This video seems to capture a London crowd’s reaction to Biden’s presidential motorcade. How can you determine whether it’s real? First, check the source: Which accounts shared the video? In this case, it was shared by meme and partisan accounts that often amplify falsehoods aligned with their interests. Then, find the original footage and compare it to the viral version. The video was first captured by a writer for the British newspaper The Telegraph and shows Biden’s motorcade passing as a few people in the crowd cheer; none are chanting “Let’s go, Brandon,” as in the doctored version.

Altering audio is one tactic that purveyors of misinformation use to mislead their audiences. While such content may seem genuine at first glance, social media users can check its authenticity by tracking down the original source.

 

Miscaptioned footage goes viral during Typhoon Hinnamnor

A screenshot of a video of a shed being blown away in high winds carries a Korean caption that translates to: “The person who lost a house because of the typhoon.” The News Literacy Project has added a label that says, “OLD FOOTAGE.”

NO: This video was not taken during Typhoon Hinnamnor, which hit South Korea and Japan in early September 2022. YES: This is a genuine video of a rooftop shed being destroyed during Typhoon Jebi in September 2018.

NewsLit takeaway: Miscaptioning old but dramatic footage is a proven way to generate engagement on social media, and these mislabeled videos are not harmless. They have two notable negative impacts on online discourse. First, they give people a false impression of reality: Typhoon Hinnamnor did cause devastation in Japan and South Korea, but this video does not accurately depict what happened. Second, when these videos go viral, they amplify untrustworthy accounts and build larger audiences for them. Social media algorithms pick up on the increased engagement and raise the accounts’ profiles even further; with larger followings, the accounts can be repurposed to sell products or to spread more misinformation.

While it is tempting to jump into the fray and share viral content during fast-moving news events, it is important to do some quick verification work first. A reverse image search revealed that this video was several years old. Additionally, a quick check showed that the account primarily shared videos of cheerleaders before scoring this viral hit. In other words, it likely was not the most trustworthy source for breaking news.

 
Kickers
Ahead of the 2022 midterm elections, TikTok is requiring American politicians and government agencies to verify their accounts with a blue check mark. The new policy comes as QAnon conspiracy theories continue to get millions of views on the platform.
The revolting NyQuil chicken recipe began as a 4chan joke, but after the FDA declared it unsafe, TikTok searches for the recipe increased.
How does a disinformation expert navigate the information environment? Former CIA analyst Cindy Otis shares her tips in this recent podcast interview.
Two Indian engineers started a nonpartisan fact-checking website that has drawn about 80 million page views since 2017, but the site’s efforts to call out misinformation and hate speech have “put it on a collision course with the government.”
Comics can be people’s first encounter with local newspapers, but in light of recent changes, some readers and cartoonists worry about the future of daily comics pages.
A new graphic novel for kids breaks down misinformation and news literacy with quirky illustrations and humor. The title? “Killer Underwear Invasion.”
 

Thanks for reading!

Your weekly issue of Get Smart About News is created by Susan Minichiello (@susanmini), Dan Evon (@danieljevon), Peter Adams (@PeterD_Adams), Hannah Covington (@HannahCov) and Pamela Brunskill (@PamelaBrunskill). It is edited by Mary Kane (@marykkane) and Lourdes Venard (@lourdesvenard).

Sign up to receive NLP Connections (news about our work), or switch your subscription to The Sift®, the educator version of Get Smart About News, here.

 

Check out NLP's Checkology virtual classroom, where you can learn to better navigate today’s information landscape by developing news literacy skills.