IS THAT A FACT?
How much did misinformation impact the election?
Season 1 Episode 10
About The Episode
This episode, which was recorded live on Zoom on Wednesday, November 18 with a panel of experts, is our season finale. Our host moderated a conversation with Enrique Acevedo of CBS’ “60 in 6,” Dr. Joan Donovan of the Shorenstein Center and Jane Lytvynenko of BuzzFeed News about how misinformation impacted the 2020 elections and what we can anticipate on the horizon.
Acevedo is a correspondent on the new CBS production “60 in 6.” Before that, he was the Emmy Award-winning anchor of Noticiero Univision’s late-night edition. Acevedo is also a member of the News Literacy Project’s (NLP) board of directors and the host of NLP’s Checkology® lesson on Practicing Quality Journalism. Dr. Donovan is the research director of the Shorenstein Center on Media, Politics and Public Policy. She is a leading expert in the field, examining online extremism, media manipulation and disinformation campaigns. Lytvynenko is a journalist with BuzzFeed News, based in Toronto, where she covers online misinformation. Her work investigates the spread of fake news, digital deception and the rise of hyper-partisanship online.
Darragh Worland: Welcome back to Is that a fact? This episode is the culmination of our 10-part journey to discover how democracy can best survive the onslaught of misinformation. There’s no denying we are in the midst of a concerning moment in our country’s history, triggered by the rise of social media and growing distrust of institutions, including the news media. Misinformation, disinformation, even conspiracy theories, are having an increasingly damaging impact on our public discourse.
Each of our guests this season sounded the alarm – some rather loudly – but they also shared concrete actions we can take as individuals and collectively to work toward a future founded on facts. One thing is clear: There’s no sitting on the sidelines. Misinformation is not a red or blue issue, nor does it stop at our country’s borders. It truly is a global threat. We must all be invested in stopping the spread of misinformation and bolstering facts, whenever we can. All of us at NLP believe news literacy is an essential part of the solution to that problem. For our season finale, which we recorded live on Zoom, we wanted to hear from experts who could help us understand how misinformation impacted the 2020 elections and what we can anticipate on the horizon.
We’re thrilled with our panel, each of whom brings a unique perspective to the subject. Enrique Acevedo is a correspondent on the new CBS production, 60 in 6. Before that, he was the Emmy Award-winning anchor of Noticiero Univision’s late-night edition. At the News Literacy Project, we’re proud to call Enrique a member of our board of directors. He’s also the host of our Checkology lesson on practicing quality journalism. Dr. Joan Donovan is the research director of the Shorenstein Center on Media, Politics and Public Policy. She is a leading expert in the field, examining online extremism, media manipulation and disinformation campaigns. Jane Lytvynenko is a journalist with BuzzFeed News, based in Toronto, where she covers online misinformation. Her work investigates the spread of fake news, digital deception and the rise of hyper-partisanship online.
It seems clear that social media platforms took more aggressive steps to combat the spread of misinformation about the election this year than they have previously. But researchers are still divided on just how successful those steps actually were. What do you think platforms got right this year that they maybe hadn’t before? And what do you think they might have fallen short on?
Joan Donovan: You’re never going to catch me saying they did something right. I’m just kidding. I’m just kidding. What’s hard here is transparency, in the sense that as researchers – and Jane knows this well because we’ve been on the phone a few times this week about it – even when they get it right, they still manage to bungle the story in terms of letting people know when and why and how they did something.
And so it’s a really hard thing as a researcher to make any kind of determination good or bad, when you’re kind of feeling your way in the dark. And you’re trying to say, “OK, this was a good move.” But by and large, I think one of the things that we have to think about is pre-election and post-election, which is to say that pre-election, there was a pretty good momentum around not allowing the October surprise to become the dominant story.
And that’s important in the sense that, you know, we had watched over the summer and a few times over the fall, media manipulators really trying to make Hunter Biden the story. And we knew it was going to come up in debates. We knew of course no one was going to ask a question about Hunter Biden, but nevertheless, it was going to be wedged in there as an issue.
And so different attempts not just by social media companies, but by journalists, to make sure that people had access to timely, local, relevant information about voting – that was sort of the key metric by which I assessed this election. And we did really well: we had more people voting. The U.S., anyway, is very apathetic about voting, and we had so many people come out to vote at the polls or by mail. So I feel like that was a really big success, and that was a whole-of-society success.
Post-election, though, it has been absolute chaos. And on Election Day, it was really hard to do good misinformation campaigns, because people’s attention was just fractured in so many different directions. But now, you know, a few weeks after the election, we’re reckoning with the fact that some of these states are going to spend many, many hundreds of thousands of dollars on needless recounts, because misinformation about voter fraud, it’s just everywhere. It’s ubiquitous at this point. And so I think that platform companies could do a lot more to help people access timely, local, relevant, accurate information, in order to combat what seems to be a waterfall of misinformation that’s been happening in the wake of the election.
Worland: Do you think some of the efforts that social media platforms like Twitter made to flag inaccurate information should be continued beyond the election? I know Twitter has said they’re going to dial back on some of that now that we’re not in the midst of an election.
Donovan: They have a few key categories of things that they are going to keep flagging. And I think Twitter’s labeling strategy has been a bit haphazard, in the sense that what labeling might actually trigger in people’s minds is: when I see something unlabeled, it must be true, because everything these influencers say that’s false gets labeled. So there are things that are not labeled and that therefore must seem true. And so we don’t really know if the labeling is helpful in any sincere, academic sense of the word. And in terms of keeping it up – for them it’s a reputational risk to keep doing content moderation at this scale, because frankly, it’s drawing a lot of media attention, and they don’t really have any measures to say whether it’s successful or not.
Worland: Is it possible to tell whether mis- and disinformation actually swayed how people voted? I don’t know if there’s any way to measure that.
Donovan: It’s the sociological Holy Grail: trying to disentangle whether people didn’t go vote because they saw a good deal on TV and decided to go buy a couch that day, or because the weather was a little rainy and they didn’t have an umbrella. And so we can look at correlations and make assessments about whether groups of people who were exposed to certain kinds of misinformation were suppressed or not. That’s going to take some time to figure out, but I do know that researchers are really looking into voter suppression efforts that had misinformation embedded in them and that were targeted at different Latinx communities.
And so we can start to make assessments if we look in very specific places. But when it comes to any overall assessment, I’d say that there are going to be pockets of voter suppression where misinformation was definitely part of the equation. But because the U.S. did enjoy such a huge voter turnout, it would be impossible, actually, to assess whether that had any impact on the election. And I say that knowing that there are conspiracy theories out there right now about the quote unquote “suppression polls” – dun, dun, dah! – which is the great mathematical conspiracy of our time, where nerds conspired to do math and make it so that people didn’t go to the polls. And you know, who actually cared about those polls and read them and then acted on them? I don’t think we’re going to know.
Worland: It’s a whole new twist on Revenge of the Nerds. Right?
Donovan: Yeah. It’s a good one though. I’m like, “Oh, yeah. If I heard it, maybe it changed my behavior.” I guess yeah, but also, you know, most people vote based on party. There’s not a lot of shift when it comes to how you sway voters. A lot of people are locked in just like they love their uh, I’ll say it, they love their Red Sox, and you can take those Toronto Blue Jays wherever.
Worland: I’m also Torontonian, so I’m going to take that personally.
Donovan: Oops, I’ll never get another question.
Worland: I’ve given you enough time Joan, you can mute yourself.
Donovan: It’s because we’re in the same division. And you know, it’s personal.
Worland: I like it. So Enrique, my next question’s for you. Can you talk a little bit about the tension between playing the watchdog role in covering the election, which obviously involves calling out falsehoods, among other things? And then balancing that with the need to build trust among news consumers who might even see that coverage as partisan because it’s criticizing their candidate. How do you think journalists should walk that line?
Enrique Acevedo: Even though some Trump supporters might feel that we’re in an adversarial position because we are trying to fact-check what the president says, I don’t think we should stop reporting the facts, right? The fact is that the president lies continuously and he has been lying about widespread fraud during this election. Trump has led his supporters on a very dangerous path in terms of disinformation and misinformation, but like my friend David Fahrenthold of The Washington Post says, “Trump can’t change the nature of facts.”
So having said that, you know, this is one of the things that keeps me up at night – this and a 1-year-old that you’ll probably hear while I’m answering the question – but misinformation, disinformation, manufactured lies amplified by social media are an existential threat to our democratic order – an order built on factual and empirical reasoning. And it’s just hard for me to figure out how we are going to address the 21st-century challenges, the global challenges like climate change, like pandemics, autocracy, the (inaudible) of autocratic regimes, if we’re collectively incapable of distinguishing truth from lies.
This is really a five-alarm fire, and I think we need to come up with a solution that’s both tactical and strategic. I’m part of the News Literacy Project, and I think NLP understands the urgency. We all need to jump on board. But it’s not just about educating the next generation – that’s the most important component, but we also need to act today, right? There’s a lot at stake today. I just mentioned some of the challenges, and we need to bring everyone on board, not just people who agree that we’re moving into dangerous territory in this post-factual, or whatever you want to call it, environment.
Worland: Yeah, I actually think that journalists right now are playing a really important role in upholding democracy by holding power to account, which is, you know, what you do on a daily basis on a good day with any candidate – and it’s even more important when we are living in a world of misinformation and disinformation. My next question is for you, Jane. Lots of people try to keep score when it comes to misinformation, claiming, for example, that misleading memes and posts are more common among people on this side of the aisle or that side. How do you think misinformation affects people of different political ideologies? And how do differences in values – for example, freedom versus equality, or patriotism versus justice – affect the appeal of misinformation to different audiences?
Jane Lytvynenko: Well, disinformation is really group agnostic. Anytime there’s any kind of breaking news event, there is disinformation that goes along with it. We see it in hurricanes, we see it in fires, we see it in attacks, and we see it in elections. But the difference is, when we talk about online misinformation, it comes from the bottom up. Sometimes people misunderstand something, and it’s genuinely an accident that it spreads.
But in the case of the U.S. election, there is a deliberate spread of false information and a deliberate targeting of audiences who were previously exposed to false information. The vast majority of the false information we see right now falsely claims, of course, that fake ballots are being found or that polls are being rigged in some way. That information is not necessarily meant to convince people who voted for Mr. Biden that Mr. Trump actually won. That false information is targeted at Mr. Trump’s supporters, in order to convince them that further investigation is needed or that Mr. Trump has won the election.
In this moment, the vast majority of the false information that we see is still within that ecosystem. But that doesn’t mean that that’s the only ecosystem that’s vulnerable to false information, nor does it mean it’s the only system that is hospitable to false information. We have seen hoaxes that target Mr. Biden’s supporters as well. And like I said, the nature of false information means that it will always be there in a breaking news event, trying to fill the information vacuum that the election so clearly provided.
Worland: Would you say that overwhelmingly most disinformation is designed to create more confusion, or is it designed for a specific outcome?
Lytvynenko: Well, I wish I could say what most misinformation is up to. That would make my job a whole lot easier, but it depends, right? Some disinformation carries very hateful messaging and is there to target individuals or groups who are meant to be silenced. Some of it is genuinely there to create confusion. But what we do know is that false information very frequently comes with extremely charged emotions – anxiety, anger, fear – and people tend to act on those emotions.
Now how they act on those emotions, that’s going to be different from individual to individual. During the early days of the coronavirus, disinformers mostly tried to exploit that fear and anxiety as a way of selling vitamins or products that just don’t work against the virus. During the Black Lives Matter protests, we saw disinformation unfairly demonize protesters, who were subsequently targeted for violence and harassment.
In this case, during the election we’re seeing a different purpose, which is to support the president’s claims. So we can’t really paint all of the falsehoods that we see with one broad brush because they all have different goals in mind. And sometimes that goal is just the financial profit. Disinformation is a financially profitable business, thanks to online advertising and thanks to the ease with which disinformers can collect clout online. But what it does have in common is that extremely negative cocktail of emotion that it exploits in people who are already inundated with a lot of anxiety and anger.
Worland: So in response to concerted efforts to crack down on mis- and disinformation on more mainstream social media platforms, more permissive alternative platforms like Parler and Gab have unfortunately gained momentum – and our audience may not necessarily know what those are. It’s hard to even imagine our information environment becoming more niche-ified. But how likely is it that these sites and others like them might gain a strong share of the social media market? What concerns does this raise? And if you could just sort of weave in what Parler and Gab are –
Donovan: Yeah, so in our world, think about it this way: there’s an ecosystem of apps. When you turn on your phone and look at the 25, 50, 100 apps you have, every one of them is kind of designed with a user in mind. There are many, many different social media apps, but we tend to focus on big tech – Twitter, Facebook – because they’re pretty stable communication infrastructures. That is, they tend to do what we expect them to do every time.
Now, within the ecosystem of what we would call minor apps or apps that don’t necessarily have the stability or the users that are, you know, numbering in the millions, they capitalize on moments like this. And every time we see a group of people or social movement experience a large-scale defeat, the first thing they do is they blame the messenger. And in this case, they’re blaming the infrastructure. They’re saying, “It must have been Twitter, it must have been Facebook. We need a place where we can communicate completely unfettered.”
And we saw this with Gab in particular; my team did some research, which I’ll drop in the chat, looking at how Gab, as a minor app, really tried to soak up white supremacists who were being de-platformed after the Unite the Right rally. And what they really did was tailor their messaging, saying, “This is the place for you. It’s completely unmoderated.” What ended up happening over time with an app like that, though, is that people realized, “I don’t actually want to be surrounded by a bunch of Neo-Nazis.” Right?
So the people who went were more of the free speech types. And I say that knowing that any time you say free speech near Harvard, 1,000 lawyers descend upon you. But you know, apps need moderation. And so in this moment, you have a set of different apps that are trying to gain traction. And what’s kind of interesting in this moment, if you don’t follow right-wing news very closely, is that there’s a big fissure in the right-wing media elite, especially around Fox News. If you were watching the election coverage closely, Fox called Arizona before the other major networks, and at one point, Trump called the people at Fox and asked them to walk it back, and they didn’t.
And since then, a lot of conservatives who really enjoy hyper-partisan news have basically called Fox News traitors, which opens up an entirely new kind of marketing scheme to draw in conservative news viewers – of which there are a lot – and they are voracious spreaders of news; they love to share. So right now, there’s a bunch of apps that have this opportunity, where people are feeling like Fox News let the party down because they reported the news. Remember, what’s at stake here is that Fox didn’t buy into some of the more odious disinformation campaigns around voter fraud and election tampering.
But as a result, you have a set of different apps that are trying to win those folks over. And what’s at stake in the long run is that these apps, if they do take in people, and they are relatively stable, and people do use them, then they just become an echo chamber. This is where people share that kind of news. Most of the things that may excite you about Twitter – which include getting reactions from people who are trolling, or trolling other people – you don’t get in the same way.
The other problem an app like Parler has right now is that, because they’ve advertised themselves as this kind of content-moderation-free zone, it’s just a safe haven for pornography and fetishism and really, just really gross stuff. At the end of the day, this is going to turn people off, because if you make a comment and someone doesn’t like it, and then they literally post a huge steaming pile of poop as a reaction image, you’re probably going to be like, “I didn’t like that. Maybe I’m not going to go back.” I’m not even joking. That’s the caliber that we’re dealing with here.
Worland: So it sounds like you’re not especially worried about what these more fringe social media sites are going to do to our civic discourse.
Donovan: The only thing that is different here with Parler is that there’s a millionaire behind it: Rebekah Mercer is a partner, is putting money into it and is a well-known purveyor of right-wing news. And so it probably will stick around for a while, because it doesn’t have to rely on donations and advertising revenue like other, more fringe apps. But the problem with interaction and engagement online is that if you are stuck in an echo chamber, even for these influencers, the attention isn’t going to be as much as the attention they would get on Facebook or Twitter. And so unless they’re being paid to post, there’s not a lot of upside in being popular on Parler alone.
Worland: But I would imagine that journalists being aware of these platforms might end up reporting on Parler, for example. Could that end up inadvertently amplifying the platform?
Donovan: I think there’s the platform, and then there’s the content on the platform. And over the years – and Jane’s seen her share of this – journalists have learned how to deal with anonymous websites and the posts on those anonymous websites. I mean, right now, nobody is taking a lot of the posts on Parler seriously, because it’s lousy. It’s just infested right now. And so there are a lot of fake screenshots. Here’s a tactic: You go on, and you can become Rush Limbaugh. You can make your whole profile look like Rush Limbaugh, and nobody’s going to know you’re not Rush Limbaugh; a lot of people are going to think you are. You could be the ghost of JFK. You could be anybody you want.
But it’s going to take a while for people to generate affect and community and really learn to talk to one another and work with one another. So for me, if anything, it’s complementary to the major apps that we already have, but that one major difference around funding means that it’s probably going to be persistent.
Worland: Enrique, many news outlets reported that political misinformation was targeted specifically at the Latinx community, especially in swing states during the election. Do you have a sense of what kind of misinformation or even disinformation was aimed at Latinx voters and why?
Acevedo: Of course. I think Latinx people are especially vulnerable to disinformation and misinformation because, first of all, they consume most of their information online, where we see this content proliferate, right? And mainstream Spanish-language media is not doing a great job representing different points of view. Two-thirds of Latino voters are now U.S.-born, and that means politically, they’re going to start behaving and acting more like the rest of the electorate. So there’s something to be said about the lack of representation for conservative Latinos in Hispanic media. I think that contributes to a false narrative that there’s some type of conspiracy happening, and that the media are all aligned against their interests.
And then you have the propaganda and irresponsible politicians comparing Democrats to socialist regimes in Latin America, conflating socialism and communism. I’ve covered Latin America for a fair amount of my career. I’ve been to Venezuela, Nicaragua, Ecuador. I visited Argentina. I’ve interviewed most of the leftist leaders there, and I see a difference between what’s happening there and their ideologies, compared to what’s happening here in the U.S. and some of the politicians that are labeled as socialists.
So there’s a lot of that happening. And that was at the heart of the 2020 campaign, especially in places like South Florida, where you had Cubans saying, “Well, we don’t want to go back to what we fled from in Cuba.” And you have to be respectful of their experience and the history they’re coming from, but you also have to ask yourself, “How can they instead choose a man who, right now for example, is not willing to accept the results of the election?” – which is at the heart of our democratic system. So I think at the heart of this is also news literacy. You mentioned at the beginning credible sources, verification, the difference between opinion and fact – all of that is lacking in our community.
Worland: So do you think that any kind of misinformation or disinformation actually had an impact on voting trends?
Acevedo: It’s hard to say without some sort of empirical evidence, as Joan was saying at the beginning. But we saw a higher turnout for President Trump in places like Miami-Dade than we saw in 2016, and it could be the result of this. You don’t want to reduce the Latinx vote to what happened in South Florida with mostly Cuban Americans, of course. Latinx voters decided elections in Arizona and Nevada; it was a close election in Texas and in all the other states where you don’t really think about Latinx voters, like Pennsylvania and Georgia and North Carolina. But you do see a trend in places like South Florida, where this narrative of socialism – the conflating, again, of communism and dictatorships with what some of the Democrats describe as democratic socialism – had a role in that discussion, at least in what candidate people think they should vote for and what issues are important to them. And I think that was very evident there.
Worland: Right. So what you’re saying is many Latinx voters, Cubans maybe especially, have this history of having lived in a communist society. And so they’re bringing that experience to an idea of what some sort of socialism might look like in the U.S. What is the role of journalists in actually talking about what is democratic socialism? What is actually being proposed here?
Acevedo: When you mention that, I think there’s also sort of a memory of what “the media” – and I hate that term, and we can talk about that and challenge that. We’re all grouped together under the news media term. I mean –
Worland: It’s not monolithic. Yeah.
Acevedo: Right, no. If we can label food junk food or food with high nutritional value, we should be able to do the same with the information we consume, right? It’s not all the same. But there’s a little bit of that memory from places like Venezuela, like Cuba, like Nicaragua right now, where you have the media ecosystem being part of that structure of power and autocracy. So there’s a lot of mistrust in the media once they come to a place like the U.S., right? And it’s difficult to calibrate the differences between what they experienced their whole life, or their parents did, in their country of origin, and what they’re seeing here, especially when both sides try to blame the messenger and say, “Well, they are spreading lies and falsehoods.”
In terms of that specific issue of explaining democratic socialism: I interviewed Bernie Sanders a couple of times during the primaries, for example, and I wanted to present him with that question because I knew it was on the minds of voters. But just by asking it without providing the right context, you would make it an issue, right? By bringing it up and saying, “Well, for some voters here in South Florida and other parts of the country, what’s the difference between what you’re proposing in terms of democratic socialism and what they experienced with Maduro in Venezuela or with Correa in Ecuador or with Castro in Cuba?”
If you don’t fact-check yourself on the question and say, “And I understand there is a difference, but could you provide that context for the voters?”, then you’re already spreading – not a falsehood, I think, but a bias – in the question.
Worland: That’s a question I want to cover a little bit later: in trying to correct misinformation or disinformation or even misunderstandings, do you often run the risk of amplifying it? I want to come back to that and address it in more detail with all three of you, actually. But I have another question. I want to focus a bit on COVID, because that’s obviously the other major story we’ve all been dealing with this year, and it was a big issue in the election. Joan, do you think that attempts to stem the spread of misinformation on social media about COVID taught us anything about how to stem the flow of political misinformation?
Donovan: It’s interesting – as a sociologist, I come from the field of the sociology of health, and I’ve always had a passion for looking at communication systems. And so I was looking at health-based discourse where white supremacists were basically contradicting genetic findings, because it sat uncomfortably with them that they would be genetically non-European. So I have a lot to say about the way in which health information circulates online.
This moment really made me go back and start looking at what doctors thought the internet held as a possibility for circulating health knowledge, and I keep going back to this quote, where a doctor in one of the premier medical journals basically says, “The internet is going to be a nightmare for doctors, because anybody can post anything. They can make it look scientific, they can jam some graphs next to it that are completely unrelated, and people will believe it.” And unfortunately, the way in which health information was dealt with online prior to COVID was abominable. I mean, we are in a situation where the only stuff you have access to as an information consumer is the stuff that couldn’t be monetized otherwise.
So stuff that’s not behind a paywall tends to become popular. And that means that people are putting yogurt in their woo-woo. They’re doing all kinds of things they shouldn’t be doing because they’re reading about it online, and they’re just like, “Well, maybe this will help, right? If I put ice on it, and I put this kind of salve that I made from the bark of this tree, right?” And the other thing is, in the U.S. we don’t have good access to health care, and so people start to make stuff up. And that’s not even the worst of it, because the other part is the fake cures and the fake supplements and things that people are using because they have a serious, serious problem.
And so with the COVID moment, what we saw initially was a huge rush to buy up domains that used the term COVID-19 or coronavirus in them – over 100,000 new domains. And they were all really suspicious, in the sense that it wasn’t just “coronavirus,” it was “coronavirus unemployment,” right? So the ways in which the scams found people weren’t just through medical misinformation, but through information that people were seeking out of desperation – out of financial loss, about losing their jobs, losing their housing. COVID-19 daycare – all of this stuff came up.
And it really showed the serious curation problem that we’re in. Because at first, all the companies could really do was say, “Well, we’re just going to put banners on everything.” So everywhere you signed in, there was a banner for everything. And then they were attaching labels to very specific keywords. And so if you were in a community of people who were writing “coronavirus” in English, or “COVID-19,” you would get these labels. But if you were calling it the ’rona, you were not getting that information.
And so it really showed the unevenness of, and the lack of preparation we’ve had for, a moment like this, where we do need our platforms to have public interest obligations. They have to have a curation strategy for getting people timely, local, relevant and accurate information, and right now we don’t. We have a kind of hodgepodge of experiments. Of course the platform companies are taking it seriously, but at the expense of whom? Our public health professionals. They’re now calling me and they’re like, “Listen, people are coming into my office, and they’re saying they don’t want the vaccine because it’s got a microchip in it.” And I’m just like, “OK, I get it.” You’re not going to fix it by changing the technology, because it’s now pretty much a part of our vernacular as a society. But we do need to work it out so that people get accurate information when they’re searching for it.
Acevedo: And just to add …
Worland: Yeah, go ahead.
Acevedo: Really quickly, to add to what Joan was saying: We recently at 60 Minutes interviewed the general in charge of the rollout process of the vaccine, General (Gustave F.) Perna. He said that his main worry is not the gigantic logistical challenge of distributing hundreds of millions of doses, but actually getting people to take the vaccine, because of this erosion of trust in science, in medicine – they’re very concerned about that. Early on during the pandemic I was in touch with the communications team at the World Health Organization, and they came up with the term “infodemic,” because they were not being effective in providing information that could really save lives, because of all the disinformation and misinformation that was being spread about the virus, about the role of institutions – institutions like the World Health Organization – and about the role of scientists in this matter. And I actually think that the anti-vaxxer movement is a precursor to what we’re seeing with widespread misinformation and disinformation. That’s how it really got started and became massive: because of the erosion of something as fundamental as empirical evidence.
Worland: You know, it starts to feel like this is a very U.S.-centric problem. And of course, we know that this is global. You mentioned that earlier, Enrique. Jane, being someone who’s based in Canada, I want to do a little bit of comparison – comparing Canada and the U.S. is one of my favorite things to do, particularly as a dual citizen living in New York now, but having grown up in Toronto. You’ve definitely reported on ways that misinformation has affected Canada’s elections, most recently in 2019. How does the problem compare to ours, and are Canadians as concerned about the erosion of democracy as we are? Asking for a friend.
Lytvynenko: Well, I can tell you that comparing Canada to the U.S. is a favorite sport of Canadians. But there’s this sort of really interesting thought that Americans have, which is that every time something happens, it happens to them only. They’re the first ones that it has ever happened to, and they’re the first ones to grapple with a problem. And that’s just not true. With the disinformation problem, it’s been percolating for a decade. And with problems with social media companies specifically, they’ve been warned by activists worldwide about the dangers of disinformation before it hit U.S. shores.
So when we look at this global picture, really the U.S. is pretty late to the party. But because the U.S. is so giant, now that the party’s here, they’re exporting it back out. What this has done is create this extremely insidious loop, where domestic disinformation goes viral in the U.S. And when that disinformation goes viral in the U.S., it doesn’t stay within the U.S. border, because we don’t have borders on our social media platforms. The only borders that we have on our social media platforms are the echo chambers that we’ve built for ourselves, in languages that maybe we don’t understand – but even then we have the translate function.
And so when something goes hugely viral or popular in the U.S., it will show up in other languages; this happens almost uniformly. One great example was the video “Plandemic,” which featured anti-vaccination activist Judy Mikovits being interviewed. This was a 30-minute video that looked extremely professionally produced, and it spread so quickly because it was allowed to hop online ecosystems. It started out on YouTube, and then it went to Facebook groups. It was then shared among semi-private spaces like WhatsApp and Telegram, but it wasn’t just shared in English. What we’ve also seen is that it was very quickly subtitled or translated into other languages. Those other versions were not taken down or moderated as quickly by the social media companies – if at all – as the English version was in the U.S.
And so what happens is the U.S. has this outsized influence on the information environment that has ripple effects all around the world, even though this problem is already known to be global. It’s funny, I hope my grandma’s not listening to this. She lives in Canada with me, but we’re Ukrainian immigrants. We moved here about 18 years ago, and she doesn’t speak very much English, but when she picked up the phone to call me after the election, she said, “Oh, how are you doing? How’s the Trump win going?” And I was like, “Whoa. OK, that was fast.” That was like day two and it already got into the Russian language ecosphere to the point where my grandma believed it.
And so it’s really difficult to do an apples-to-apples comparison of what’s going on in the U.S. to what’s going on elsewhere. But even in Canada, we have very similar issues. And I always remind people that Canada has a 10th of the population of the U.S., which is maybe why those issues are not as loud. And also our media production values are much lower. If you ever watch election debates in Canada and the U.S. side by side, one of them looks like a high school drama production. But that doesn’t mean that this information doesn’t have an impact. We also have people protesting lockdowns, masks and other restrictions; we have a lot of the same anger that people in the U.S. and all the rest of the world are experiencing. So, this is an issue that is deeply global, and it’s not isolated by region or language or even political preference.
Worland: That’s comforting in some ways and disconcerting in others. It’s nice not to be alone. And yet, maybe this is one thing it would be good to be alone in.
Donovan: That’s why me and Jane text and commiserate. Well, it’s so hard talking about this to audiences who aren’t like in it every day, because Jane describes this stuff and I am just not horrified because that’s not the worst thing I saw today. And that’s kind of what it’s like to be in this zone. I’m sure Enrique has very similar experiences where, you know, we also have to choose our examples really carefully in moments like this as we’re trying to not introduce people to the rabbit holes that we’re in. We’re trying to make sure that they’re getting relevant understanding of the problem. But it’s vast and it is deep and the time has come to do something about it.
Worland: I alluded to this earlier when I was talking to Enrique, but there’s this issue of amplification, and I know that all of you wrestle with it. It’s something we also think about at NLP. When is drawing more attention to disinformation, and to specific examples of it, a bad thing – because you’re giving it oxygen and exposure – and when is it needed? Because I think we all know researchers have found that misinformation, once it’s out there, is really hard to correct, and that most people who see the misinformation are not going to see the correction. So who do I direct this to? I feel like any of you could address this. Does anyone want to grab it? Enrique, I see you unmuting.
Acevedo: Well, I wish I could be more optimistic about this question, Darragh. In my personal experience, it came to a point where I’m dedicating most of my energy and my time to the type of journalism that allows me to provide that kind of context, and to stay away from a system that’s not equipped to deal with these issues of … well, if I mention this, am I amplifying it? There’s no more time for me to provide the context that I need. Or even think about local media – reporters having to shoot their own stories, edit them, set up their own camera, and do that throughout the day.
How can they have time to actually report the news? This 24-hour news cycle is just making all of our heads explode. So I made a personal, professional decision to stay away from that, and now I’m trying to not do daily news, but do feature stories, and I became part of a platform that has the time and the resources to do that in the best possible way. But again, I don’t think it’s the answer people want to hear, right? It’s like, “Oh, well, we still have to consume news on a daily basis.” I was just overwhelmed. My mental health, my physical health – it was bigger than me. There was a point where I was spending more time providing context and fact-checking than I was informing people.
So something clearly has to change. As an insider, as part of that system, I just said this is going to explode in our faces until we collectively come up with an alternative, with a solution to this. For me, the best decision was just to focus on what I can do and tell the stories in a more comprehensive and complete and timely manner. I don’t know if it makes sense at all, but that was just my decision.
Worland: I was going to say there’s a really important role for long-form journalism and all this and then I remembered you’re at 60 in 6, not 60 Minutes. Long form is now six minutes.
Acevedo: Yeah, absolutely. Though most of our stories are over six minutes, to be honest. But six minutes – I used to do an entire newscast that, with commercials, was let’s say 15 minutes of airtime.
Worland: Yes. And that is, that’s a long-form story. So Joan, what did you want to say on the topic?
Donovan: I think one of the things we need is a curation strategy for the internet, so that when people look things up, their search returns are somewhat stable over time. If you search today for a certain thing, the results are not guaranteed to be the same tomorrow. We’re dealing with social media, which was never designed to deliver news. News is an afterthought; medical information is an afterthought.
You’re dealing with a system that’s designed to show you cat pictures. That’s what social media was really there for. And so when I think about how we solve this problem of amplification, we actually have to shore up the vulnerabilities in our system that manipulators exploit. It’s like hitting a lever sometimes for some of these folks. Their audiences know exactly what to do when they put up the name of a journalist and their phone number. They put up a piece of a blog post and they say, “Make it go.” And that’s what happens – their audiences know what to do. Or they’ve invested in some other kind of technology that will help them amplify things, like botnets and whatnot, which we’ve heard a lot about over the years.
And so when it comes to amplification though, you also have to know your audience, which is really important. If your audience is pretty much a generalist audience, then yeah, you don’t want to start telling them, “You know, there’s a lot of problems with voter fraud and the misinfo people are using these four key words to talk about it.” Nah, don’t do that, right? Don’t use those keywords, don’t lead people down those rabbit holes. But if you’re a platform company, you have to look for this stuff, you have to have teams of people that are constantly understanding how your systems are being manipulated, how misinformation is propagating, what keywords are really acting as those levers driving people in.
And I’ll leave you with one thing that we discovered, which I think is a big key to this, is the role civil society needs to play, which is to say that there’s a lot of civil society organizations out there right now, and I can think of a few that I really like working with, including United We Dream and Media Justice, who are doing now, you know, more basic community defense work, basically telling their communities, “Hey, we know this disinformation is out there.” If it shows up in any of their web communities, they have a way of marking it, of getting people to understand that this isn’t really what they’re all about.
And I think we’re going to have to pass through this moment where people realize that the true vulnerability of the internet is related to how we use it, and how we let others use it. Especially – and this is the tricky part – political elites, people with a little bit of money. These are the people who are most abusing our open information environments. And so we actually have to hold them more accountable and say, “If you have 100,000, 200,000 followers, you’ve got to follow a different set of rules.”
Worland: That’s going to be tough to implement.
Lytvynenko: I’ll just add one thing. I know I joked about not liking this question and that’s because I have to deal with it every day and I’m tired, much like Enrique is tired. This work does drain you. But what I want to add is that, right now reporters’ only option to counter false information is to explain how it’s false, to debunk it, to fact-check it. And that’s by design. It’s by design because Facebook decided that it will outsource fact-checking to news organizations that are specialized in fact-checking. The moderation that goes into making sure that the content on Facebook is accurate, is not algorithmic, it is reporters trying to figure out what is happening.
We talked a little bit about the warning labels that social media companies have put on content following Election Day. But those warning labels mostly don’t lead to actual information about what the heck that person is talking about, why it’s correct or incorrect, what the evidence for it being correct or incorrect is, and where readers can find more information. Those labels don’t do that. What they do is give general information that the social media companies cross their fingers and cross their toes and hope will inoculate people against this disinformation.
So unfortunately, newsrooms and reporters are just in this difficult position where we have to moderate these platforms, because these platforms will not moderate themselves. And unfortunately, we have to provide the service of explaining to our readers where a certain rumor came from, and what the facts of it are. We have to show our work as well, because trust is just plunging and plunging and plunging, but there’s nobody else doing this work. That’s it. This is the last line of defense.
Acevedo: And quickly, to add to what Jane was saying: Just think about it. We have a system designed to certify that the financial transactions we make online can be trusted – that they’re not being corrupted in any way. And that’s because it’s in the best interest of everyone online to have that, right? But we’ve never had a system that verifies that the sources of information we’re consulting are trustworthy in any way. The same with the food analogy I was making at the beginning, right? We can differentiate between junk food and food with high nutritional value. But we don’t do the same with the information we’re consuming, and it’s having an effect on our intellectual and mental health.
Worland: Yeah. A lot of people are consuming the equivalent of junk food as far as information is concerned, and may not even be fully aware of it. I want to move to the questions we have from our audience and start with one from an educator. Because we’re running just a little bit behind in terms of how much time we have for those questions, I want to do it kind of as a lightning round: quick questions and answers. We have a question from an educator named Krista Dayton-Ventresca, and she wants to know: What’s your advice for educators as we guide young adults into educated, responsible citizenship? Does anyone in particular want to take that question?
Donovan: I mean, it sounds really hard. The question for me is really not about necessarily, how do we educate a mass populace, but how do we really turn the tide, so that people know what they’re seeing and know why they’re seeing it, right? So that’s part of it for me, and that has to do with what the platform companies are willing to do in terms of letting people understand how these algorithms work, right? If you think about media literacy, and then you kind of turn your brain to think about algorithmic literacy, where’s the difference there?
And when we think about what people need to know about information curation online through algorithms, it makes a difference if you’re an older person and you think, “Well, you know, people took the time to write this down, they must be pretty serious.” If it looks like news, if it’s been served to them by someone they trust – we have to start to do some more work around getting people to understand what those signals of legitimacy are. But then also get people to understand that the role of an algorithm on a social media platform is to keep you on the platform. And so it’s going to continuously send you things it thinks you like.
And that’s not always a bad thing. If you’re on YouTube and you love Yankee Candle review videos, it’s just going to keep sending you Yankee Candle review videos, and that’s fine. It’s a problem when it’s white supremacist content. It’s a problem when it’s self-harm videos. And so we need much more literacy, society-wide, about the way these platforms are designed and how they’re designed to keep you in their mix. One of the things that’s been interesting from my research is that when white supremacists are de-platformed from YouTube, their audience doesn’t generally go with them to other platforms. Their audience is vastly, vastly reduced.
And what that points me to is that people are just going to YouTube, that’s what they’re doing. And that’s where they’re going to watch what’s on. And so I think when we think about the next stage of this world of media ecosystems online, that we really need to focus on the algorithms.
Acevedo: And I think I would be making a mistake here if I didn’t talk about the work of the News Literacy Project. I got involved very early on, Darragh, through some of the work we did together. And I saw this not just as an existential threat to journalism, but as a moral responsibility to get involved. We can’t hand the responsibility of media and news literacy to the platforms themselves; we need to work with educators and with schools, and not just here in the U.S., but hopefully globally. We have the resources to do that.
And I think the News Literacy Project is working hard to do that, working in at least two different languages now. And we really need to focus on making sure that the next generation is prepared. A kid right now with a smartphone like this has access to more knowledge and information than the person in the Oval Office in the United States had 15, 20 years ago. So this should come with a responsibility, and with a set of tools to navigate it and to be more responsible consumers of information – not just in the way we consume information, but in the way we share that information.
Lytvynenko: I’ll just add one thing, which is that a lot of this sounds like a drag, because it is. But it also doesn’t have to be. I think the most fun part of my job is trying to figure out how to figure out the truth. And I think that a lot of kids especially think that social media is the internet, right? They think that’s it – it’s just social media and nothing else. But it’s not. Beyond social media there are news sources, Wikipedia pages, archival footage, learning how to do a reverse image search, learning how to hack a reverse image search so you can do a reverse video search. And all of that is actually pretty cool. The internet is actually a pretty cool place when you take a step back and don’t just look at social media companies.
And I think that there’s really something to the approach of lighting the fire under our kids’ asses and being like, “Actually, like you can investigate this stuff yourself. You can play internet detective because it’s a cool and interesting way to spend your time. It’s a cool and interesting skill to have. You don’t have to, like, sit there and submit to the content that is given to you. You have power over it, and you have power over how you use the internet overall. It doesn’t just have to be Instagram stories or TikToks.”
Worland: I couldn’t agree more. So I have another question from one of our participants. So, Noah from La Habra, California, who I think might be a journalist, is asking what advice you have for reporters tracking misinformation in races that are out of the national eye or take place at the local level. Enrique, do you want to take that one?
Acevedo: Well, I don’t think I would make any distinction between the way we approach disinformation and the verification of information at a local station versus working for a network nationally. I think it’s even more important locally, and unfortunately, we’ve seen how resources are being taken away from local media, and that has a terrible effect on the spread of disinformation, misinformation and propaganda at a local level. My advice would be … well, I was covering Nevada for CBS News just a couple of weeks ago, and the kind of access you get to officials and the entire process locally is just much more conducive to the kind of legwork and the kind of reporting you need to do, regardless of the outlet you’re working for.
So we were talking to registrars and officials from both parties, and observers, and people who were protesting for one or the other candidate. I think that’s what you should do: Make sure that you’re doing the legwork, that you’re verifying information through local sources, and even being creative about how to use technology and data as tools for your reporting. That information is available not just online, but also directly through the primary sources. If people in New York on our decision desk wanted to know the number of outstanding votes in Nevada, they were asking us to check that directly with the people counting the votes in Clark County, in North Las Vegas.
So that’s at the heart of reporting. You don’t do that behind a desk, or behind a computer, you actually do it in person and there’s no better way of doing it than through local reporting, local media.
Worland: I want to bring it back to the burning question at the heart of this season of our podcast. Justin Hendrix asks, “I’d be interested in the panelists’ views on the question whether they are optimistic or pessimistic about democracy in the U.S. over the remainder of this decade.” So I guess he’s being very specific to the near future. And he adds, “It seems to me we’re still in a very precarious position despite the outcome of the election with none of the underlying issues resolved.” Joan, what do you think? Optimistic or pessimistic?
Donovan: I sent Jane probably the meanest tweet I’ve ever sent. I’m usually not the pessimist type – I’m kind of an eternal optimist. I think part of it has to do with growing up a punk in Boston and just always seeing the sunny side of things, even when the world is aggressive. But I sent her this tweet that was really mean, because we were just going back and forth last night about the (Christopher) Krebs firing – Trump fired one of the top election security officials last night. You know, she was like, “I don’t manage to get shocked very often. But this one – we all knew it was coming, but it still kind of burns.” And I just was like, “Well, you know, wait ’til you see the look on your face on January 21, 2021, when Trump’s still in office.”
Lytvynenko: You said January 2121, actually, which I thought was much meaner than 2021.
Donovan: And at that point, I realized I had probably taken the black pill and was full-on nihilist, because there is something really dangerous going on. As a researcher you can grasp these things, and I think journalists have this intuition as well when you study something: it helps you see a bit of the trajectory of where we’re going. You can’t predict the future. But you realize how much of a knock-down, drag-out fight it’s going to be to get Trump to leave. And what that’s going to do is kill people. I mean, people are going to die in the heaviness of the pandemic in this moment – the fact that there’s no more outdoor hangs. It’s going to be a really sad and depressing time for a lot of people through the holidays. We need something in the U.S. to become more stable than it is.
And I felt like a jerk, Jane. I’m really sorry I sent that to you. Because I was just like, I was goofing, but then I realized that was probably a hard one to hear. And then I had to kind of walk it back. I had to say to myself, “You know what? There’s probably not much I can do personally. I can keep doing what I do well, and that is explaining what’s happening online and explaining what’s happening with misinformation at scale, and really trying to change things.” But there are also things that are going to be out of my control, and I’m going to have to accept them as they happen.
But I’m cautiously optimistic though, that we will get through this transition period and that the incoming administration will be open to dealing with some of these issues. Maybe we don’t get at the question of the internet in a big way, but we deal with online extremism, violence against women. We deal with the many different kinds of racialized disinformation that we’ve been dealing with over the past few years. And so I’m trying to hold out hope, but you know, you’ve got to find love in a hopeless place, as the song goes, but I just felt like a jerk saying it.
So now that it’s off my conscience and I’ve confessed to everybody that I did last night – I had a moment. But I’m coming through the other side, and I’m really trying to work on what is in my capacity, what is in the capacity of my communities, what is in the capacity of all of us working together to make sure that the votes get counted and that the transition happens. That’s where I’m going to focus a lot of my energy and my angst for the next few-
Worland: I think to work in this field, though, you have to have some element of optimism, right, because we’re all working in the field of fighting misinformation. So we all have to believe at some level that there’s a light at the end of the tunnel. That’s all the time we have, but I would like to leave it open to Enrique or Jane, who might want to add to what Joan said.
Lytvynenko: Look, I think that a lot of the time, reporting on this field can get really scary, because that trajectory that Joan mentioned – being able to sense it from far away – can feel like you’re bracing for the worst possible thing. And very frequently, the worst possible thing does happen. So you have to recalibrate yourself each time and say, what’s the situation now? I’m not going to make any predictions about what the rocky period between now and inauguration will look like. But what I am optimistic about is that the U.S. has finally seen the power that its own companies have. This very, very scary, big power that, as we talked about, touches the whole world.
My hope is that now that this power has been unleashed on the country responsible for it, because that’s where these companies are coming from, that maybe some fixes can be implemented that benefit not just the U.S., but the entire ecosystem that we’re grappling with.
Worland: Yeah, and I think awareness is the first step. Education is a way to take action. And I think all of us are doing what we can to inform the next generation and the general public – everyone really – on what these issues are and how to be aware of them. Enrique, I don’t know if you want to say any final thoughts if you had any.
Acevedo: Right. No, I just hope that we’ll learn some lessons. Over the last four or five years, there’s been a pure disconnect between what we as a society demand from a candidate and what we need from a president, and that plays out in the way we cover campaigns. I wish I had a more attractive answer, but I think that’s the point: Political coverage doesn’t have to be about record ratings as much as it has to be about substance. The way we cover a presidential campaign should focus on the issues and the character of the candidates. Plain and simple. Maybe it’s not attractive, but I think it’s in the best interest of our democracy.
Worland: On that note with the very last word being democracy as we started, we end. Thank you everyone. I want to thank our guests very much for joining all of us tonight. Thank you so much.
And thanks for joining us for the first season of Is that a fact? If you haven’t already, we invite you to listen to the other episodes in the series. And when you have, feel free to drop us a line at [email protected]. We welcome your feedback and ideas for upcoming seasons. We’re going to take a break, but stay tuned for more information on season 2, which we’ll release in 2021.
Is that a fact? is a production of the News Literacy Project, a nonpartisan education nonprofit helping educators, students and the general public become news literate so they can be active consumers of news and information, and equal and engaged participants in a democracy. Alan Miller is our founder and CEO. I’m your host, Darragh Worland. Our executive producer is Mike Webb. Our editor is Timothy Kramer and our theme music is by Erin Busch. To learn more about the News Literacy Project, go to newslit.org.