As the situation surrounding the Russian invasion of Ukraine continues to unfold, it’s important for everyone to practice news-literate information habits. Understanding where to find credible information about the events in Ukraine and how to avoid being duped into sharing false and misleading information is vital. Here are three tips to help you keep your footing and avoid unintentionally spreading confusion and harm:
Be cautious and discerning about what you believe and share.
Keep in mind: Russian state-run propaganda outlets have an established presence on major social media platforms and can be hard to identify. A quick web search for the name of an unfamiliar news source, or for an unsubstantiated claim, is your best first step toward avoiding misinformation.
Put yourself in the way of credible information.
For all the mis- and disinformation that circulates on social media, these platforms can also be powerful tools for accessing and curating timely, credible information. Be intentional about following professional journalists on the ground, reputable news organizations and fact-checkers debunking falsehoods in real time.
Keep in mind: Breaking news events create chaotic, fast-moving and challenging information environments. Resist the urge to demonstrate your concern by reflexively sharing sensational but unconfirmed posts and updates. Also remember that it's important to consider fairness and framing, as well as accuracy, as you follow news on Ukraine in the days ahead. Some have criticized recent news coverage, for instance, for including racist and Eurocentric sentiments that reveal a troubling double standard in coverage of this conflict compared with conflicts outside Europe.
Dig deeper: Use this think sheet to help students further explore Russian disinformation tactics.
Russia has a history of using disinformation to “justify” actions condemned by the international community. However, its recent efforts to fabricate a rationale for invading Ukraine were quickly debunked by open-source investigators. They used geolocation to disprove a claim about supposed Ukrainian saboteurs sneaking into Russia; heat-sensing satellite data to disprove Russian claims of an attack; and video analysis tools to show that footage of a purported act of sabotage included audio of explosions taken from a different video.
Discuss: Why might Russia want people around the world to believe that it was provoked into attacking Ukraine? How do these kinds of narratives seek to provoke or manipulate people’s emotions? How might this kind of disinformation be useful as Russia tries to build support among its citizens for invading Ukraine? What distinguishes propaganda from other kinds of misinformation?
Note: The ongoing conflict in Ukraine has resulted in an upswell of viral rumors, which we can’t comprehensively address in our Viral Rumor Rundown. For real-time misinformation updates, follow the work of professional fact-checking organizations devoting significant attention to Ukraine.
NO: The Centers for Disease Control and Prevention (CDC) did not say that human DNA was being collected through COVID-19 tests. YES: In a tweet on Feb. 16, the CDC said that “there’s a 10% chance” that swabs used in COVID-19 PCR tests end up “in a lab for genomic sequencing analysis” — a process used to analyze the genetic makeup of viruses and track the emergence of variants. YES: In a subsequent tweet the CDC clarified that the sequencing and analysis work is on the genome of the virus that causes COVID-19, not on human DNA.
NewsLit takeaway: Conspiratorial rumors about the government and private companies using COVID-19 tests to build collections of human DNA have circulated since at least late 2020. These previous viral falsehoods — along with people’s tendency to interpret information in ways that maintain rather than challenge their existing ideas and beliefs — likely led some vaccine deniers and others to see “proof” of these baseless suspicions in the wording of the CDC’s Feb. 16 tweet.
NO: This photo does not show police shoving and kicking so-called "Freedom Convoy" protesters in Ottawa, Ontario, in 2022. YES: It is a photo of police attempting to clear a group of protesters during the G-20 summit in Toronto in June 2010. YES: Police in riot gear did use force to clear some "Freedom Convoy" protesters in Ottawa on Feb. 19.
NewsLit takeaway: When major developments in a protest movement occur, supporters often seek out photos and video to share online to promote their cause. But some go so far as to lift more dramatic visuals from other contexts to help their message go viral on social media. Purveyors of disinformation, for example, often repurpose old photos and video of large crowds, whether from other protests or from nonpolitical events like music festivals and sports team rallies, to inflate the apparent grassroots support for a cause. This particular rumor takes a different approach: exaggerating the degree or intensity of opposition to garner sympathy. Because these visuals are powerful and typically align with actual events, other supporters online often accept them readily. This is another reminder to double-check the authenticity of photos and videos of controversial events, especially those from sources you don't recognize.
NO: This is not an authentic photo of the sign for Little Pigs barbecue restaurant in Asheville, North Carolina. YES: The message has been artificially created using an online fake sign generator.
NewsLit takeaway: Provocative signs are optimized to go viral on social media: they're pithy, visual and easy to process at a glance. But they're also often fake. Text in photos is easy to manipulate with basic editing software, and the altered photo in this example was especially simple to make using a fake sign generator website. The same photo, with the exact same background and details, can be found online bearing a variety of different messages, including a derogatory anti-Muslim statement and a message mocking then-President Donald Trump.
You can find this week's rumor examples to use with students in these slides.
Don't miss this 60 Minutes report on the vital role local newsrooms play in communities and the threats they face from hedge funds and other financial firms.
Can you guess which faces in this Instagram post are real, and which are fake? Do you trust some faces more than others? It turns out that faces created using artificial intelligence are now nearly indistinguishable from real faces, and even seem more trustworthy to people, according to a new study. (Compare your guesses with the correct answers at the bottom of this story.)
A new report examining narratives related to COVID-19 vaccines found that influencers played a major role in spreading vaccine mis- and disinformation during the pandemic.
A new analysis from the Center for Countering Digital Hate shows Facebook is failing to flag about half of the posts with articles from the world’s top publishers of climate change denial — falling short of the platform’s pledges to curb climate misinformation.
A new survey from the Public Religion Research Institute found that 16% of Americans surveyed last year believed in the core tenets of the baseless QAnon conspiracy theory — and that people who trust far-right media outlets like One America News Network and Newsmax are about five times more likely to be believers.