Curriculum Connection: Examining the impact of rising government disinformation


Peter Adams

I am NLP's senior vice president of education. My team runs all of our professional learning opportunities, including NewsLit Camps, and our resource and content development, including Checkology assets. Many years ago, I taught middle school ELA and social studies, as well as in an after-school program for high schoolers and at the college level.

Political parties or government agencies in 70 countries are using “cyber troops” to engage in organized disinformation efforts online, according to a new report from the Oxford Internet Institute at the University of Oxford. This is a 150% increase in state- and party-sponsored social media manipulation campaigns since 2017, when researchers found such activity in only 28 countries.

Among the key findings:

  • Facebook (56 countries) and Twitter (47 countries) are by far the most popular platforms for these efforts.
  • A strong majority of countries use human-operated (61 countries) and automated (bot) accounts (56 countries).
  • Attacking political opponents (63 countries) is a significantly more common use of computational propaganda than spreading messages that support a government or party (51 countries). (Computational propaganda is defined [PDF] as “the use of algorithms, automation and human curation to purposefully distribute misleading information over social media networks.”)

The report also found that most countries use a combination of tactics in these campaigns. Fifty-five countries create and circulate misinformation, such as memes and “fake news” websites. Fifty-two countries amplify content, including legitimate news, that aligns with government or party interests. And 47 countries employ targeted trolling of journalists and people with opposing opinions.

Often, government and political figures employ “cyber troops” who work with private industry, social media influencers and online communities. And some enlist students and private citizens to post specific messages to social media accounts.

Disinformation training

In addition, the study found several cases in which countries with more established disinformation programs — such as Russia, India and China — provided training and other assistance to countries with upstart disinformation programs.

For educators

Discuss: The report points out that a “strong democracy requires access to high-quality information and an ability for citizens to come together to debate, discuss, deliberate, empathize, and make concessions.” Why are misinformation and disinformation considered threats to democracy?

For further consideration: The final lines of the report pose two key questions that provide an excellent way to spark student engagement.

  • “Are social media platforms really creating a space for public deliberation and democracy?”
  • “Or are they amplifying content that keeps citizens addicted, disinformed, and angry?”

Activity: Have students reimagine a historic propaganda campaign, such as one from World War II or the Cold War, with access to today’s information environment. How would the campaign be different? How might history have been different?

Related: “Disinfo bingo: The 4 Ds of disinformation in the Moscow protests” (Lukas Andriukaitis, Digital Forensic Research Lab, Atlantic Council)
