Did You Know?

Troll farms: Not the stuff of fairy tales

When you think of the term troll farm, what comes to mind? Maybe a scene in a sci-fi book or a game to download onto your phone. But don’t be fooled: the dangerous consequences of troll farms are real and serious.

Those behind troll farms spread misinformation through fake profiles and accounts that appear to belong to real people. Some farms are designed to amplify bogus support for political ideas; others sow confusion and distrust of democratic institutions. Still others operate for personal gain or notoriety.

While most Americans became aware of troll farms upon learning that Russia used them to try to mislead voters during the 2016 U.S. presidential election, trolling goes back further than that. And the phenomenon is in no way exclusive to Russia: troll farms can be run by individuals, political parties or governments.

Social media platforms have tried to curb the proliferation of these fake accounts, with mixed results. In March, Twitter, Facebook and Facebook-owned Instagram disabled dozens of accounts tied to a Russian-linked troll network based mostly in Ghana, The Verge reported. The accounts, which fueled racial divisions, attracted more than 300,000 followers. Yet this is just the tip of the iceberg.

The Institute for the Future researched state-sponsored trolling in a 2018 report. Researchers found that while the impact of fake accounts and sites could be classified as human rights abuses, almost no infrastructure exists for punishing their creators. They suggested using international human rights laws to “require social media platforms to detect and, in some cases, remove hate speech, harassment, and disinformation; and implement such requirements in a transparent and accountable manner that respects due process and reinforces human rights.”

A recent New York Times article examined the content Russian troll accounts produced in 2016 versus 2020. A side-by-side comparison of posts revealed that trolls have gotten smarter at hiding their true identities. Accounts from 2016 tended to be riddled with spelling errors or to have an unrealistically high number of followers. Now, trolls copy and paste chunks of text to avoid such obvious errors.

Without these tell-tale signs, internet users must be even more cautious. A November 2019 segment from NPR affiliate WBUR offers advice from Clemson University researchers Darren Linvill and Patrick Warren: Question why you are seeing certain information from a particular account. Be wary of a flurry of inspiring, uplifting content: trolls don’t want to antagonize a community; they want to be part of it. And use caution when encountering content from accounts you don’t know.

Keep that advice in mind, particularly regarding information related to COVID-19, racial justice protests and the 2020 U.S. presidential election. The stakes are high, as misinformation pushed by troll farms can endanger our public health, our civic life and the functioning of our democracy.