Curriculum Connection: Examining the impact of rising government disinformation

NLP Updates

Peter Adams

Political parties or government agencies in 70 countries are using “cyber troops” to engage in organized disinformation efforts online, according to a new report from the Oxford Internet Institute at the University of Oxford. This is a 150% increase in state- and party-sponsored social media manipulation campaigns since 2017, when researchers found such activity in only 28 countries.

Among the key findings:

  • Facebook (56 countries) and Twitter (47 countries) are by far the most popular platforms for these efforts.
  • A strong majority of countries use human-operated accounts (61 countries) and automated “bot” accounts (56 countries).
  • Attacking political opponents (63 countries) is a significantly more common use of computational propaganda than spreading messages that support a government or party (51 countries). (Computational propaganda is defined [PDF] as “the use of algorithms, automation and human curation to purposefully distribute misleading information over social media networks.”)

The report also found that most countries use a combination of tactics in these campaigns. Fifty-five countries create and circulate misinformation, such as memes and “fake news” websites. Fifty-two countries amplify content, including legitimate news, that aligns with government or party interests. And 47 countries employ targeted trolling of journalists and people with opposing opinions.

Often, government and political figures employ “cyber troops” who work with private industry, social media influencers and online communities. And some enlist students and private citizens to post specific messages to social media accounts.

Disinformation training

In addition, the study found several cases in which countries with more established disinformation programs — such as Russia, India and China — provided training and other assistance to countries with upstart disinformation programs.

For educators

Discuss: The report points out that a “strong democracy requires access to high-quality information and an ability for citizens to come together to debate, discuss, deliberate, empathize, and make concessions.” Why are misinformation and disinformation considered threats to democracy?

For further consideration: The final lines of the report pose two key questions that provide an excellent way to spark student engagement.

  • “Are social media platforms really creating a space for public deliberation and democracy?”
  • “Or are they amplifying content that keeps citizens addicted, disinformed, and angry?”

Activity: Have students reimagine a historic propaganda campaign — such as those created during World War II or the Cold War — with access to today’s information environment. How would it be different? How might history have been different?

Related: “Disinfo bingo: The 4 Ds of disinformation in the Moscow protests” (Lukas Andriukaitis, Digital Forensic Research Lab, Atlantic Council).

More Updates

Curriculum Connection: Facebook, satire and fact-checking

The Wall Street Journal reported last week that Facebook plans to exempt satire and opinion content from its fact-checking program. This would mean that posts that contain demonstrably false claims, but which the platform deems to be either satire or opinion, would not be referred to its network of third-party fact-checkers. Thus, Facebook would not…

NLP Updates

Catherine Griffin

Bringing news literacy to a school, one freshman class at a time

Like many teens asked to research a topic, Catherine Griffin’s students typically would open a search engine, type a word or phrase, and simply use the source at the top of their results. But once Griffin guides them through the Checkology® virtual classroom, they start digging deeper, citing scholarly articles and database results in their research.…