Photo: A mobile phone with the YouTube icon on its screen. (Christian Wiediger / Unsplash)
YouTube’s efforts to restrain conspiracy theories have mixed results

A new study (PDF) shows that YouTube’s efforts to limit the reach of harmful conspiracy theory videos via its algorithmic recommendations have produced positive but inconsistent results. From October 2018 to February 2020, researchers at the University of California, Berkeley recorded more than 8 million “Up next” video recommendations that the YouTube algorithm made on videos from more than a thousand “of the most popular news and informational channels in the U.S.”
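The study’s own data pipeline isn’t reproduced here, but a minimal Python sketch suggests what recording “Up next” recommendations for a logged-out visitor might involve. The watch-page URL pattern is real; the parsing regex, the seed list and the output file are illustrative assumptions, and YouTube’s page markup changes often, so this is a sketch rather than a working collector.

```python
# Sketch: record recommended videos for a list of seed videos (logged out).
# Assumptions (not from the study): recommendations are pulled from the
# watch-page HTML with a regex; classifying "conspiracy" content is a
# separate, later step.
import csv
import datetime
import re

import requests

SEED_VIDEO_IDS = ["dQw4w9WgXcQ"]  # hypothetical seed list of video IDs

def fetch_recommendations(video_id: str) -> list[str]:
    """Fetch a watch page and pull candidate recommended video IDs."""
    resp = requests.get(
        f"https://www.youtube.com/watch?v={video_id}",
        headers={"Accept-Language": "en-US"},
        timeout=10,
    )
    resp.raise_for_status()
    # YouTube video IDs are 11 characters; dedupe and drop the seed itself.
    ids = re.findall(r'"videoId":"([\w-]{11})"', resp.text)
    return [v for v in dict.fromkeys(ids) if v != video_id]

with open("recommendations.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for seed in SEED_VIDEO_IDS:
        for rec in fetch_recommendations(seed):
            writer.writerow([datetime.date.today().isoformat(), seed, rec])
```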

The data showed a significant and steady decrease in recommendations of conspiracy theory videos from January 2019 — when YouTube announced that it was taking steps to reduce recommendations of “content that could misinform users in harmful ways” — to June 2019, when the platform touted a 50% reduction in such recommendations. After that, though, the rate of such recommendations rose again — possibly, the study noted, because YouTube relaxed its efforts or because content creators figured out how to avoid being flagged.
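To trace a trend like the one the researchers describe, the logged recommendations can be rolled up into a daily share of conspiracy-labeled videos. A brief sketch, assuming the recommendations.csv from the collection step above and a hypothetical labels.csv that flags videos as conspiratorial (the study used a trained classifier; a simple join stands in for it here):

```python
# Sketch: daily share of recommendations labeled as conspiratorial.
# Assumes recommendations.csv (date, seed_id, rec_id) from the collection
# step and a hypothetical labels.csv (video_id, is_conspiracy as 0/1).
import pandas as pd

recs = pd.read_csv("recommendations.csv", names=["date", "seed_id", "rec_id"])
labels = pd.read_csv("labels.csv")  # columns: video_id, is_conspiracy

merged = recs.merge(labels, left_on="rec_id", right_on="video_id", how="left")
merged["is_conspiracy"] = merged["is_conspiracy"].fillna(0)

daily = merged.groupby("date")["is_conspiracy"].mean()
print(daily.rolling(window=7, min_periods=1).mean())  # 7-day smoothed share
```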

The study also found evidence that YouTube’s reduction efforts yielded significantly stronger results on some subjects than on others, suggesting that the platform can minimize the spread of particular kinds of harmful misinformation when it chooses to.
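Breaking the same daily share out by subject would show whether the reductions were uniform. Continuing the sketch above, and assuming the hypothetical labels.csv also carries a topic column for each flagged video:

```python
# Sketch: conspiracy share of each day's recommendations, split by topic.
# Assumes labels.csv also has a hypothetical "topic" column
# (e.g., "flat earth", "9/11") for videos flagged as conspiratorial.
totals = merged.groupby("date")["rec_id"].count()  # all recommendations per day
flagged = merged[merged["is_conspiracy"] == 1]
topic_counts = flagged.groupby(["date", "topic"])["rec_id"].count()
share_by_topic = topic_counts.div(totals, level="date").unstack("topic").fillna(0)
print(share_by_topic.tail())
```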

Note

Because researchers have been unable to track personalized recommendations at scale, this study and others like it have relied on analyses of algorithmic recommendations made to “logged-out” accounts, meaning that researchers could not access and analyze data from accounts whose individual “watch histories” factor into algorithmic recommendations. The study’s authors pointed out that users “with a history of watching conspiratorial content will see higher proportions of recommended conspiracies.” A YouTube spokesperson, Farshad Shadloo, told The New York Times that the study’s focus on nonpersonalized (“logged-out”) recommendations means that its results don’t represent actual users’ experience of the platform.

Discuss

Is YouTube the primary source of information for young people? Has YouTube replaced television for them? What are the advantages of watching videos on YouTube as opposed to programs on television? What are some of the disadvantages? Why don’t major television networks struggle with the proliferation of conspiracy theories on their channels? Does YouTube’s recommendation algorithm — which makes suggestions for the platform’s 2 billion monthly users — have too much power? Should YouTube be regulated by an outside agency, or not? Why?

Consider this idea

Have students select 10 popular YouTube channels that they consider to be credible sources of news and other information, then document the “Up next” recommendations that the algorithm displays alongside those channels’ videos over a set period of time. Compile the data and share the findings, including with the study’s authors.
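If students log what they see in a shared spreadsheet, a short script can compile the class’s findings. A sketch, assuming a hypothetical observations.csv with student, channel, recommended_title and looks_conspiratorial columns:

```python
# Sketch: compile a class's logged "Up next" observations.
# Assumes a hypothetical observations.csv with columns:
# student, channel, recommended_title, looks_conspiratorial (yes/no)
import csv
from collections import Counter

flagged = Counter()
total = Counter()
with open("observations.csv", newline="") as f:
    for row in csv.DictReader(f):
        total[row["channel"]] += 1
        if row["looks_conspiratorial"].strip().lower() == "yes":
            flagged[row["channel"]] += 1

for channel in sorted(total):
    share = flagged[channel] / total[channel]
    print(f"{channel}: {flagged[channel]}/{total[channel]} flagged ({share:.0%})")
```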

 
