Note: The News Literacy Project staff is off next Monday for Presidents Day. The Sift will return on Monday, Feb. 24.
Beware: Bogus local news sites
A large network of phony local and financial news websites — replete with recycled press releases and plagiarized news stories — designed to make money in a number of ways was exposed last week in a BuzzFeed News investigation. Matt McGorty*, who has experience in the financial information industry, established some of the sites as far back as 2015, BuzzFeed reported.
Many of the fraudulent sites, some of which are no longer live, had names such as the Livingston Ledger or the Denton Daily to give people the impression they were small but legitimate local outlets. McGorty* used plagiarized news stories to get the sites included in Google News results and to optimize rankings in general search results, and made money through ad revenue, commissions for financial email sign-ups and referral fees for questionable investment opportunities.
The network is the latest example of an attempt to capitalize on two aspects of the current information environment: the public’s trust in local news organizations and the likelihood that people will not detect bogus but legitimate-sounding sources of local news. During the 2016 U.S. presidential campaign, Russian disinformation operatives created Twitter accounts such as @ElPasoTopNews and @MilwaukeeVoice to amplify divisive — but real — local news stories from legitimate outlets. More recently, homegrown political operatives — including campaigns, political action committees and partisan activists on both the left and right — have used the same tactic to advance their agendas.
* UPDATE (April 6 and April 22): This item initially said that Matt McGorty and his brother established the network of phony news sites. BuzzFeed News updated its report on April 2 after being told by Matt McGorty that his brother “had nothing to do with any of this.”
Discuss: Why do you think so many different kinds of people and organizations (foreign disinformation agents, con artists, clickbait farmers, political activists, etc.) are using bogus local news outlets as vehicles for their content?
Idea: As a class, create a list of all the legitimate (standards-based) local news outlets in your community, then create Wikipedia pages for any that do not currently have one. (Big h/t to Mike Caulfield, director of blended and networked learning at Washington State University Vancouver, for inspiring this idea.)
Another idea: Have students brainstorm ideas for protecting people in their communities from being taken in by phony local news sources, then vote for the best one. Make that idea a class project, or let students choose one of several ideas to work on in teams.
Viral rumor rundown
NO: Kansas City Chiefs quarterback Patrick Mahomes did not wear a shirt with the words “The Great State of Kansas” printed on an outline of the state of Missouri. YES: Mahomes wore a shirt that said “SHOWTIME” with his jersey number (15) in the original photo, which was tweeted by his agent. YES: President Donald Trump made an error in a tweet after the Super Bowl on Feb. 2, saying that the champion Kansas City Chiefs “represented the Great State of Kansas…so very well.” YES: Trump deleted the tweet and corrected the error in a second tweet less than 12 minutes later.
Note: The replies to this tweet, which is still live, contain a debate about whether the post is misleading and should be deleted or whether it’s obvious satire.
Discuss: Is this tweet obviously a joke? If a person creates a piece of misinformation as a joke, can it still cause confusion? If other people are fooled by this tweet but you aren’t, could it still affect you? What steps should people who create fakes like this take to make sure their creations aren’t mistaken for fact?
NO: Barack Obama did not award Harvey Weinstein the Presidential Medal of Freedom. YES: This is a crudely doctored photo of Obama awarding the Presidential Medal of Freedom to then-Vice President Joe Biden in January 2017. YES: This fake photo went viral in 2017 as part of a meme containing other manipulated images of Obama giving this award.
NO: The clips in the two screenshots above did not occur during Donald Trump’s presidency. YES: Brief shots of each were included in a political ad for Democratic presidential candidate* Michael Bloomberg that was posted on social media on Feb. 2.* YES: They are from a June 2014 report from The Associated Press about a detention center for lone minors who have crossed the border illegally.
NO: Eight counties in Iowa do not have more adults registered to vote than adults living there. YES: The conservative activist group Judicial Watch published a press release on Feb. 3 making this false claim after comparing 2018 voter registration data with 2018 census population estimates. YES: Current voter registration data show that in one Iowa county the number of total registered voters — both active and inactive — exceeds the 2018 census population estimate by 60 people (8,490 to 8,430), but the number of active voters alone does not. NO: Comparing census population estimates with voter registrations is not a sound methodology for identifying voter fraud.
*CORRECTION (Feb. 13 and April 22, 2020): This sentence has been corrected. As initially published, it described Michael Bloomberg as the Democratic presidential nominee. He is not the party’s nominee; he is a candidate for the Democratic presidential nomination. It also initially stated that the ad aired during the Super Bowl. It was posted on social media on the day of the game, but was not aired on television during the game.
Five to teach
Facebook’s policies on medical misinformation are under scrutiny after a 4-year-old Colorado boy — whose mother appears to have ignored doctors’ guidance and instead sought “natural” remedies from an anti-vaccination group on the platform — died from the flu last week. In response to posts in which the woman described serious flu symptoms in her children, members of the “Stop Mandatory Vaccination” Facebook group recommended elderberry, zinc and vitamins instead of the Tamiflu a doctor had prescribed for the entire family to keep the virus from spreading. After the woman shared in an earlier post that her 10-month-old baby was running a 105-degree fever and had had a seizure, one member told her, “Boil thyme on the stove.” The anti-vaccination group’s founder, Larry Cook, blamed the 4-year-old boy’s death on the hospital where the child was eventually treated.
Note: Facebook took some steps to reduce the spread of misinformation about vaccines on its platform in March 2019, including banning anti-vaccination groups from buying ads or raising money on the platform, and reducing the groups’ visibility in search and content recommendations.
Discuss: Why do you think “alternative” or “natural” medical remedies appeal to some people? If your neighbors and classmates get the flu and decide to “treat” it with ineffective home remedies, could it affect you? Do you think Facebook should ban anti-vaccination groups? Do you think it should ban all anti-vaccination content?
Idea: Assign small groups of students one social media platform and have them research and evaluate the steps it has taken to combat medical misinformation. Next, have student groups compare their findings and rank the platforms from most effective to least. Finally, as a class, ask students to compile an ideal approach for social media companies to take against medical misinformation.
Facebook and Twitter refused to remove a video posted Feb. 6 on President Donald Trump’s accounts that interspersed highlights from his State of the Union address — such as his recognition of Charles McGee, one of the last surviving Tuskegee Airmen, and a reference to historically low unemployment rates — with a clip of House Speaker Nancy Pelosi tearing up her copy of Trump’s speech. (Pelosi applauded when McGee was acknowledged, and tore up her copy of the speech only after Trump’s Feb. 4 address to Congress had ended.)
Democrats criticized the video as misleading — claiming it was edited to make it appear as though Pelosi had torn up the speech during several moments when the airman and military families were being honored — and called on Twitter to take it down. Drew Hammill, Pelosi’s deputy chief of staff, condemned the decisions by Twitter and Facebook, suggesting they were lying to millions of people by leaving up the video. But Facebook spokesperson Andy Stone disagreed, replying, “Sorry, are you suggesting the President didn’t make those remarks and the Speaker didn’t rip the speech?” Stone pointed to Facebook’s policy, which “does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words.” (Trump’s campaign called the video a parody; the White House has not commented.) Twitter said new rules on manipulated media were not yet in effect. On March 5, Twitter plans to start labeling videos that have been significantly edited as “manipulated media.”
Note: In May 2019, Facebook chose not to remove a viral video of Pelosi that had been altered to make her speech seem slurred.
Discuss: What do you think constitutes a manipulated video? Do you think the video posted by Trump is misinformation, or an effective political message edited for emphasis? Do you think the intent behind an edited video matters? Why or why not? Is it possible that the video will create the false impression that Pelosi repeatedly tore up the speech or tore it up at specific moments?
Idea: Review the video with students and summarize the debate it prompted. Then divide the class into four groups — one representing Pelosi’s team, another Trump’s team, another Facebook and the other Twitter. Have each group research the viewpoints and positions expressed by their assigned entity. Then facilitate a discussion about the issues surrounding the video in which the students play the part of their assigned party.
Nearly half (45%) of U.S. adults have stopped talking to someone about politics because of something the person said online or face-to-face, according to a Pew Research Center report, illustrating the country’s polarized political climate. Liberal Democrats are the most likely to have stopped discussing politics with someone — 60% said they have — while 45% of conservative Republicans acknowledged doing so. Moderate factions of both parties, according to the report, are substantially less likely to have stopped talking politics with someone. The survey also showed that the more closely people follow political news, the more likely they are to say they have chosen to stop talking about politics with specific people. Habits can have an effect: Respondents who primarily turn to local TV news — as opposed to news websites or apps, print, radio, cable TV, social media and national TV — for their political news are substantially less likely to have stopped talking politics with someone.
Discuss: Has anyone you know stopped talking about political news with another person because of differing viewpoints? Have you stopped talking to anyone because of this? Is disagreement a natural part of democracy? Are disagreement and debate good for democracy? What can happen if a democratic society becomes so polarized that its members stop talking to those with whom they don’t agree?
Idea: Have students poll adult family members and report findings back to the class to see how their results compare to the Pew data.
The QAnon conspiracy theory has moved from the far-right fringes of extremist message boards into the political mainstream. Two reports last week — one by Mike McIntire and Kevin Roose, in The New York Times, and another by Michael Kunzelman for The Associated Press — outline the ways that symbols, references and core beliefs of the QAnon conspiracy community, which started on the 4chan message board in October 2017, have made their way into mainstream political discourse. These include an increased presence in online political discussions; on shirts, pins, hats and signs in the audiences at Trump rallies; and, according to a running tally by the liberal media watchdog Media Matters for America, the open support of 24 current or former Congressional candidates.
Discuss: Have you heard of the QAnon conspiracy theory? Do you know anyone who believes in it? Why do you think it persists, even though several major “drops” — predictive statements from the anonymous user whose posts started the conspiracy theory — have been inaccurate? Why do you think the theory’s adherents see so many “signs” indicating that their beliefs are accurate when there is no evidence to support the theory?
A small group of journalists and fact-checkers is testing Assembler, a new platform that helps detect doctored photos and “deepfake” videos online.
Jigsaw, a subsidiary of Google’s parent company, Alphabet, introduced the platform in a Feb. 4 blog post. Work on Assembler began in 2016 as a collaboration among several universities and research groups to help journalists and fact-checkers judge if and how an image has been manipulated. At present, Jigsaw does not plan to offer the tool to the public.
Assembler uses seven “detectors,” each designed to spot a different type of manipulation. These include the examination of pixels to detect composite images that have elements originally captured by different cameras, analysis of color contrast and copy-paste detection.
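Jigsaw has not published the inner workings of its detectors, but the copy-paste idea can be illustrated with a minimal sketch: hash fixed-size patches of an image and flag any patch that appears in two places. (The function name, block size and toy image below are illustrative assumptions, not Assembler’s actual method; real detectors use far more robust features, since compression alters individual pixel values.)

```python
# Minimal sketch of copy-paste (copy-move) detection: flag identical
# fixed-size pixel patches that appear at two different positions.
# This is an illustration of the general idea, not Assembler's method.

def find_duplicate_blocks(image, block=2):
    """Return pairs of top-left coordinates whose block-by-block pixel
    patches are identical -- a crude signal of copy-paste manipulation."""
    h, w = len(image), len(image[0])
    seen = {}      # patch contents -> first position where seen
    matches = []
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            patch = tuple(
                image[y + dy][x + dx]
                for dy in range(block) for dx in range(block)
            )
            if patch in seen:
                matches.append((seen[patch], (y, x)))
            else:
                seen[patch] = (y, x)
    return matches

# Tiny grayscale "image" where the 2x2 patch at (0, 0) was pasted at (2, 2).
img = [
    [10, 20, 99, 98],
    [30, 40, 97, 96],
    [95, 94, 10, 20],
    [93, 92, 30, 40],
]
print(find_duplicate_blocks(img))  # -> [((0, 0), (2, 2))]
```

A production detector would compare robust patch descriptors rather than raw pixels, so that resaving or lightly editing the image does not hide the duplication.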
Jigsaw also has launched an online publication called The Current, which addresses various forms of disinformation and efforts to fight it.
Discuss: What are some of the ways photos can be altered and distorted? Why is it important for news organizations to verify the authenticity of images and videos? What online tools can you use to verify the authenticity and original context of images?
Resource: The Check Center (https://checkology.org/check-center), part of NLP’s Checkology® virtual classroom, includes tutorials and fact-checking missions for students. (Registration is required for teacher access, and individual student access requires license purchase.)