Former conspiracy theorists say that conspiracy theories provided meaning when they felt empty. Illustration credit: Liftwood/Shutterstock.
Millions of people follow conspiracy theories online, and not everyone finds a way out. Ramona, a 23-year-old Tennessee woman, did: she quit following them and is now a fifth-grade teacher. She fell down the rabbit hole during the coronavirus pandemic, when she moved in with her boyfriend, Don, who held strong conspiratorial beliefs. In an interview with The Associated Press (link warning: domestic violence), Ramona said some of Don’s unfounded beliefs — that the pandemic was orchestrated, vaccines were unsafe and Armageddon was coming — sounded plausible amid the fear and confusion of the time.
Eventually, “doomscrolling” through conspiratorial content online about topics like satanic sacrifices increased Ramona’s anxiety. After taking a break from social media, she stepped away from conspiratorial circles for good, broke up with Don, got vaccinated against COVID-19 and went back to school to become a teacher.
Note: This story is part of an Associated Press series on the role of conspiracy theories in American society.
Discuss: What attracted Ramona to conspiracy theories? How did she find community in QAnon circles? Why did it also make her feel isolated? How did Ramona dig herself out of the rabbit hole? Why do some people fall for conspiracy theories?
Idea: Watch this five-minute AP video featuring experts on conspiracy theories, a former conspiracy theorist and a self-proclaimed conspiracy “questioner.” Discuss the video with students. What is the “Ikea effect” explained by academic Francesca Tripodi? How is this technique used to spread conspiratorial thinking?
Advertisers and online influencers have a new and growing revenue stream: AI-generated product placements in TikTok and YouTube videos. Realistic-looking products like soda cans and shampoo bottles are seamlessly inserted into social media videos that aren’t necessarily formatted like standard ads — reaching younger viewers and offering a glimpse of how AI could shape the future of advertising. Product placement in the U.S. is estimated to be a nearly $23 billion industry.
Discuss: How can you tell if there’s product placement in a TV show, movie or social media post? How does product placement compare with other forms of advertising? How should sponsored content and ads — including AI-generated product placement — be labeled or disclosed?
Amid the stress of extreme weather events, misinformation runs rampant. It happened in California as the state was hit with heavy rain and flooding: a viral tweet from an emergency preparedness enthusiast — not a meteorologist or weather expert — falsely warned of a catastrophic megaflood. The post reached millions of X users and prompted responses from the National Weather Service and climate scientists, who debunked the rumor and directed people to more credible sources of weather information. Meteorologists and emergency officials say they are increasingly grappling with how to respond to viral falsehoods and disinformation.
Discuss: Why does misinformation spread during extreme weather events? How do bad actors capitalize on natural disasters to gain cheap likes and shares? How can you tell if a source of weather information is credible? How would you evaluate a science-based claim on social media?
NO: A cache of more than 50,000 write-in ballots for President Joe Biden was not suspiciously “found” in New Hampshire.
YES: Biden received 79,455 write-in votes in New Hampshire in a grassroots effort after he skipped the contest due to the state’s refusal to reschedule its Democratic primary.
YES: Former Democratic presidential candidate U.S. Rep. Dean Phillips called the social media claim “shameful and absolutely untrue” and said Biden won.
NewsLit takeaway: The goal of disinformation is often to create doubt and make people skeptical of evidence and verified facts. By spreading the baseless assertion that suspicious ballots were “found” to secure Biden’s win, bad actors can raise questions about the victory’s legitimacy even though they have no proof.
NO: This video does not show singer Jessica Simpson endorsing Donald Trump for president in 2024.
YES: The video is from January 2017 and shows Simpson saying “be president” when asked if she had any advice for Trump as he began his first term.
NewsLit takeaway: Manufacturing fake political support, endorsements or statements from celebrities is a common tactic, used to generate additional support for a candidate or simply to raise one’s own profile online. These misleading “endorsements” can be created by manipulating photos or by sharing genuine media out of context.
Clips presented in false contexts can be quite convincing at first glance because the images and video are authentic. It’s always important to take a few moments to carefully consider a claim before accepting it as true — or liking or sharing it online. In this case, the source video can be put back into its actual context by opening a new tab and searching for “Jessica Simpson” and keywords such as “Donald Trump” and “endorsement,” or by taking a still from the video and plugging it into a reverse image search engine.
Could local newspapers pair up with universities to stay afloat? The University of Iowa’s student newspaper purchased two community papers, and the partnership will provide opportunities for students to gain reporting experience while bolstering local news coverage.
Controversy over journalism ethics arose after The Boston Globe ran a heartfelt story about a terminally ill woman who died by assisted suicide, with an editor’s note disclosing that the reporter had signed a document that allowed her to die.
Real journalists won’t ask you for money in exchange for writing a story. That’s a news literacy tip from CoinDesk, a news site that covers digital currency, which found that scammers were impersonating its journalists.
Black celebrities like rapper Sean “Diddy” Combs and actor Denzel Washington are being targeted with AI-generated fake content on YouTube.
Meta CEO Mark Zuckerberg apologized to families of online child abuse victims during a tense Jan. 31 congressional hearing with the heads of major tech companies.
After an AI-generated robocall mimicking President Joe Biden told New Hampshire residents last month not to vote, a federal agency is proposing to make AI-generated robocalls illegal.
Texas did not declare war on the U.S., but this false rumor was trending on Chinese social media network Sina Weibo.
What happens when people are unable to tell the difference between satire and real news? Telling them apart is harder on social media, where both kinds of content appear side by side in news feeds, but experts say consumers need to take responsibility and verify before sharing.
How do you like this newsletter?
Love The Sift? Please take a moment to forward it to your colleagues, or they can subscribe here.
You’ll find teachable moments from our previous issues in the archives. Send your suggestions and success stories to [email protected].
Sign up to receive NLP Connections (news about our work) or switch your subscription to the non-educator version of The Sift called Get Smart About News here.
Check out NLP's Checkology virtual classroom, where students learn how to navigate today’s information landscape by developing news literacy skills.