IS THAT A FACT?

Kara Swisher on why Facebook is a threat to democracy

Season 1 Episode 4



About The Episode

Our guest is Kara Swisher, one of the premier tech columnists in the country. Our host spoke to Swisher about how social media platforms have affected our ability to talk to one another. Have platforms like Facebook contributed to the rise of misinformation? Do social media platforms have a responsibility to police certain content? Swisher has been a long-time critic of the power of tech companies, their use of personal data and the industry’s lack of accountability. In our interview she said, “We are cheap dates to the data companies.”

Swisher has been chronicling the rise of Silicon Valley since the early ’90s. In 2003, Swisher launched the D: All Things Digital conference with Walt Mossberg, and later co-founded the website Recode with him, which was acquired by Vox in 2015. Swisher has written for The Washington Post and The Wall Street Journal, and she is currently an editor-at-large for New York Media and a contributor to the New York Times, where she recently launched a new podcast called “Sway: A Podcast about Power.” She is the author of two books, AOL.com (1999) and There Must Be a Pony in Here Somewhere: The AOL Time Warner Debacle and the Quest for a Digital Future (2003).


Episode Transcript

Darragh Worland: Welcome back to Is that a fact?, the all new podcast brought to you by the nonprofit News Literacy Project. I’m your host, Darragh Worland. Today, for the fourth episode of our series on the impact of misinformation on democracy, we talk to Kara Swisher, one of the premier tech columnists in the country. Swisher has been chronicling the rise of Silicon Valley since the emergence of the internet in the early ’90s. In 2003, Swisher launched the All Things Digital conference with Walt Mossberg, who is a member of NLP’s board of directors. Swisher and Mossberg also co-founded the website Recode, which was acquired by Vox in 2015.

Swisher has written for The Washington Post and The Wall Street Journal, and she’s currently an editor-at-large for New York Media and a contributor to the New York Times, where she recently launched a new podcast called Sway, a podcast about power. We spoke to Swisher at the end of August. We thought she would be the perfect person to provide insight into the damage social media is doing to our democracy. Her take? Without accountability it’s bad, real bad.

You followed the development of social media from when the first platforms emerged on the tech scene through today, where social media obviously holds tremendous sway in all of our lives. How would you say these platforms have impacted our civic discourse, particularly in the U.S.?

Kara Swisher: Oh, I think it’s been terrible. I mean my very first column for The New York Times two years ago – and it’s something I’ve been railing about for years – is how badly these social media companies have hurt our civic discourse completely. They’ve weaponized it, they’ve amplified bad feelings, they’ve been built for enragement over engagement.

Engagement causes enragement, and it’s linked together. So I don’t think anything good at all has come from it, except for maybe you can share, you know, there’s good parts of it, productive parts, where you have groups and you organize and stuff like that. But in general, the idea that this is all Arab Spring is really, that was a long time ago in a galaxy far away.

Worland: So would you say that the bad outweighs the good on social media?

Swisher: I think the bad, the way they run it. It doesn’t mean it couldn’t be good, it means they run it badly. And I think that’s my issue with the whole thing, is that it’s really the way they have architected it and the way they have designed it, it has to do a lot with their business model, which is advertising. And they’ve done very little maintenance of the platform, and so they’ve created a platform that people run amok on, that’s really pretty much what’s happened.

Worland: And are you talking about Facebook specifically?

Swisher: We’re always talking about Facebook, aren’t we? It’s the biggest. There’s others, Twitter of course is just sort of somewhat of a cesspool, but it’s a minor number of people who are on Twitter. It’s usually just the right and left wing elite and politicians and celebrities talking at and past each other. It’s not very big, you know what I mean? The same thing, Snapchat stays by itself. Kids use it. It’s not that bad. It’s a really good platform. Whatever you think of the Chinese, TikTok is a great platform. It’s fun, it’s interesting, it’s engaging. There’s not a lot of hate running around that platform because the algorithms prevent that from happening. It doesn’t mean it couldn’t, it couldn’t be taken advantage of, but it isn’t. And so you just pretty much have Facebook. Facebook is the dominant social media site. No matter what they say, they control everything. They’re the [inaudible 00:03:23] that everybody else has to be on and unfortunately is on.

Worland: So I’m glad we’re focusing now on Facebook.

Swisher: You have to, you just can’t. I mean, we can talk about the others, but it doesn’t … Reddit, parts of it are awful, they’ve tried to fix it. It really is Facebook and Twitter, and Twitter is really more around the media and political elite than anything else.

Worland: Well, if you had to pick just a few issues with Facebook, specifically, that everyone needs to know about, which would they be?

Swisher: The latest apology of Mark Zuckerberg. Now he’s like, “We should have taken down these sort of groups that were organizing militias into cities that led to the shooting of the protester.” It’s just, you can go back to New Zealand and the person who was broadcasting on Facebook and other websites, including Google and YouTube. You know what? Actually, include YouTube and Facebook together, because I think that you really do. Although I think the people at YouTube are trying to take their job seriously and fix it, as difficult as it is. So you just look at either of them and it’s one crisis after the next, whether it’s Alex Jones or the killings in New Zealand or the current QAnon controversy spreading everywhere, all that nonsense. And it’s dangerous nonsense, I don’t mean to diminish the dangerousness of it. But whatever it is, whatever – anti-vax, you just name it – it’s gone across those platforms. And those platforms have helped these fringe lunatic people have a voice.

Worland: So are you saying Facebook’s unwillingness or being late to the party to shut these pages down is the main issue?

Swisher: It’s not their unwillingness, they’ve talked about it. You know, it’s their business plan not to. They haven’t been unwilling. They’ve said they’ve tried. Should we go to Russia? Shall we go to the abuses by foreign … If the platform is designed to create crisis, no matter what the topic, it doesn’t really matter what it is. Of course, it does, whether it’s anti-vax, whether it’s coronavirus misinformation, they try to get it off there, but it’s built to do this. And that’s the problem that they have, is that they’re complicit in what’s happening. Mark called what happened with the kid who was shooting that protester in Kenosha an operational problem. It’s not an operational problem, it’s a system-wide problem. They continue to look at it as if it’s fixable. Problem is the architecture, it just stinks.

Worland: Well, can we go to the Cambridge Analytica data breach, which-

Swisher: Sure. That seems so small and quaint at this point.

Worland: That’s a very sad statement.

Swisher: It probably wasn’t as big. It wasn’t as big as what’s happening now, for sure.

Worland: Well, that’s where I think so many people started to question Facebook’s use of personal data or maybe the risk of their personal data on Facebook. But I think a lot of people don’t really understand what happened with the breach. Can you just sort of summarize it quickly and then tie that to what you think everyone should be paying attention to now, when it comes to how these platforms are using their personal data. And if you want to keep it specific to Facebook, that’s fine.

Swisher: Yeah. Researchers often use Facebook for information to test different theories and things like that. And this was a researcher at a university, was supposedly doing some things. In fact, all the information was being ferried over. Cambridge Analytica was using data crunching in order to target people, allegedly. They just were misusing the data that Facebook had. Facebook of course said, “The researcher was misusing and didn’t follow our terms.” Fact of the matter is they didn’t police it very well. And so, Cambridge Analytica is quaint in comparison to what’s happening right now, the manipulation on the platform.

Worland: Right. And is part of the reason you think it’s quaint because it’s not really a life-and-death issue?

Swisher: Facebook, it makes the argument, “Not many people were impacted, that they fixed the leak, this and that. It was a leak in our basement, and we cleaned it up, now the mold is gone.” The mold isn’t gone, the mold just moved. And so I think it began to teach people how to use these platforms for media manipulation. And then it moved to political advertising, lying and stuff like that. Now a lot of people then focus on political advertising. Well, it’s not that big a deal, actually, political advertising. It’s political content that’s the bigger deal. So Facebook was over here waving its hands, not doing a very good job compared to the other platforms. When actually all the action is in political content, which is the use of the platform to disseminate misinformation via content, which is another way to do it. And you don’t have to pay for it. And so that’s another issue. So it just is one thing to the next. This is like a whack-a-mole, unfortunately. And it’s been used really effectively by the right to get their messaging out. They use it and they know they have this opportunistic way to get out their messages to millions of people before the police show up. And the police often don’t show up, to use a terrible metaphor right now.

Worland: Would you go so far as to say Facebook, because of this role, in spreading misinformation and propaganda, would you go so far as to say it’s a threat to democracy?

Swisher: Yes, everyone knows that. It’s carelessness that’s the threat to democracy. It’s the lack of understanding of the power of the platform and a lack of policing it. They try to wrap themselves in the idea that it’s the First Amendment, but the Constitution does not say, “Facebook shall make no law,” right? They always do that. And I’m like, “You can police it. You can police things.” And so I think that’s the issue, is that they act as if they’re Congress or something. They’re not the government and they can do things on their platform to make it safer for people using it.

Worland: Well, if you could go back in time and be Mark Zuckerberg, inventing Facebook all over again, how would you make it? What would you do differently?

Swisher: Well, it was invented with a lie, you know what I mean? He was stealing information from people’s things without their permission at Harvard, he was stealing facebooks that were the property of those people. So it started with a theft, a theft of people’s identity. So I’m not sure. I wouldn’t have started it. I wouldn’t have done it. But so I don’t know what to say about that. It started with a theft of people’s identity.

Worland: What do you think these social media companies, Facebook included, could be doing to mitigate this damage that they’re doing to democracy?

Swisher: Look, there are some really great people at Facebook trying to make this work, or trying to figure it out, and trying to make it safer. Everywhere they go, they have content moderators. Well, it turns out they don’t pay them enough. It turns out they put them in terrible situations. “Okay, we’re going to fix that. Oh, one slipped through. Oh, we’re going to do this. And we’re going to introduce this product without having adequate tools for people to do something about it. Oh, did that break?” They wait for something to break and then they fix it, without thinking really clearly about what they’re making. And it’s not a problem that Facebook has by itself, it’s a lot of them. That’s one of the issues, is that they aren’t thinking about safety. And it’s because the people who make these products have never been unsafe a day in their lives, or very few of them have. And they’re not thinking about safety. They’re typically overprivileged, they’re typically white, they’re typically men, they’re typically young. These are people who don’t think about people’s safety. They don’t think about implications. But you can say that, but they do need to pay attention to the impact of their inventions on the rest of society, and not point at people who say, “Hey, just a minute,” and say, “You’re not for free speech.” That’s not the argument going on here. It’s about safety of people to operate online in environments that are not full of the cesspool of misinformation and lies that are on these platforms.

Worland: So is it a question of safety only? Or is it also a question of safety and ethics? Is it an ethical failure?

Swisher: I’m not even going to argue ethics with them. I just stick to safety. But you know what I mean? Sure, ethically, it’s a problem. But there’s no point in arguing ethics, they don’t care. Why argue with people who don’t have them?

Worland: So you think there’s a way to put pressure, to apply pressure for them to address safety issues, but not ethical ones?

Swisher: I think it works better. You’re putting people in danger, over and over again. Now Mark did his apology about what happened in Kenosha. There’ll be another one tomorrow. Every week it’s something else. It’s QAnon. Oh, it’s Alex Jones. Oh no, it’s this. At some point, the penny has to drop that this platform is being woefully misused by hate groups. How are you going to do that? And then they’re like, “Well, it’s impossible to police.” I’m like, “Why have a platform that’s impossible to police? Why build something that makes it impossible to fix?” And then tell everyone you don’t want to be the arbiter of the truth. Why create a platform that requires an arbiter of the truth? None of it makes any sense when you actually ask them tough questions that they don’t want to hear.

Worland: But some social media platforms have started to do more to stem the flow of misinformation, particularly during the pandemic. And I do think it has something to do with the fact that it is very clearly a life-and-death issue, the spread of misinformation about COVID. Twitter is labeling manipulated media; multiple platforms have reporting options so you can flag election-related misinformation. The COVID response has been pretty decent. They did things like take down Plandemic, for example.

Swisher: Oh, thank you so much. I’m sorry, I’m not giving them kudos for that. That’s their job. That’s their job to do that. So, sure.

Worland: All right. Well, so what do you think they’ve gotten right—

Swisher: It’s like saying to me, “Thank you for putting periods in your sentences when you write columns.” “Oh, you’re welcome. Thank you. I worked hard to put those periods in the sentences.” It’s your job. It’s their job to do this, to make these platforms safe.

Worland: But would they think that that’s what their job is? How would they define what their job is?

Swisher: To make a lot of money and not care about violating people’s privacy. That’s one thing, I think that’s their job. It seems like they think that’s their job. It’s this idea that the community will police itself, when it’s so obvious that it won’t. They’d say their job is to just create tools and that they have no responsibility for what happens when people use those tools. They’d say that.

Worland: So what can we do as individuals? You’re obviously doing a lot with your columns and your vocal opposition. The majority of our listeners are probably on Facebook. And it’s a huge sacrifice to go off Facebook, because you might be losing-

Swisher: They use it for information.

Worland: And they lose all those photos and that whole history on their timeline of important memories. They’re part of groups that maybe they wouldn’t be able to connect with if they’re not on Facebook. So what do you advise the average person to do in the face of this?

Swisher: I think you can start to voice your opposition to that. You should make your opinions known to them, make your opinions known to legislators who are trying to figure out rules of the road for these companies in some way. Use your voice to talk about it on Facebook. Point out false information, you know what I mean? But just keep at it in these forums. The only way to push back a lie is to call it out continually. And if you see something, you say something.

Worland: Yeah, the research shows that misinformation travels much, much further than the correction of that misinformation. And often in correcting the misinformation, you’re restating the misinformation in the first place. It is a slippery slope and it’s tricky to handle. So aside from just shutting down social media companies, what do you think-

Swisher: I wouldn’t shut them down. I would shame them. Shame them, I’m not going to shut them down, I’m not China. You know, I just think they should have their own rules of the road, but legislators should start to think about fines. Fine them if something happens. Hold them responsible and accountable, just the way chemical companies are responsible. Prove it’s addictive, show that they are selling people an addictive … How do we take cigarette people to account? How do we take chemical companies to account? How do we take Wall Street firms to account? Not very well. But they did stop once the public got educated and involved in making rules of the road. There’s nothing wrong with using these services when they’re at their best. Everything’s wrong when they get to run amok and run it themselves and then abrogate responsibility for running it and saying, “It’s up to you.” It’s like literally, it’s The Purge, is what they’re doing.

Worland: You’ve written extensively about the lack of regulation on tech companies in general and obviously social media. I mean, how can they hold these tech companies to account if there’s nothing to hold them to account to?

Swisher: That’s right, we need to pass laws. Look, the ones who’ve gotten really wealthy in this economy are all the tech companies. Look at the numbers right now, the top companies that have made more money in this pandemic are all tech companies and all the owners of the tech companies. One, start to tax them properly so that they can pay for a great and vibrant FTC to regulate them properly. Stop buying their line that they can’t be innovative if they aren’t allowed to do whatever they want. People can be innovative with guardrails, they can be. It’s just an argument to do what they want, to take what they want, and to get richer than ever, so they can hire more lobbyists to stop people from passing decent laws.

Why don’t we have a national privacy bill, like every other democratic country in the world has? Why don’t people have control of their information? Why don’t we have transparency? We are cheap dates to these internet companies. We are cheap dates. And we get a map, we get a dating service, I don’t know, you get to form your group on there, and they get everything else. They get your data, they get your money, they get everything. So I don’t know why anybody thinks that’s a good thing. So that is up to our elected officials to do something about it.

Worland: You’ve talked about how some other countries have more regulations in place. Can you talk a little bit about how Europe, for example, countries in Europe have approached regulating social media platforms and these privacy issues?

Swisher: Yeah, I don’t agree with everything. They tend to over-regulate in Europe, they tend to, but they’re down the right road, right, in doing something about it. In New Zealand, all across the world, there’s all kinds of just basic privacy rights, I think is one thing. And everyone goes crazy here when you say, “What are you doing with my information that is mine? Why are you following me around? Why are you taking information from me? Who’s protecting it?” And what they do is they rely on the fact that they hand us some free service and we’re just fine with it. We shouldn’t be fine with it. People are fine with Twinkies, but guess what? They had stuff in them that wasn’t so good for us. We might want to look at that. I don’t like wearing seat belts in cars, but they’re safer. And no one asked me to put a seat belt on a car, but I’m glad they did, right? A lot of people are glad they did. Prevents deaths.

Swisher: Not everything needs to be by acclaim. You know, going back a long, long time ago, interracial marriage, lots of people weren’t for gay marriage. We don’t have to consult with everyone on everything to make a better society. Our elected officials whom we elect should be the ones on the line for this stuff. Because these guys are unelected, they’re unfirable for the most part, you cannot fire Mark Zuckerberg, ever. He’s unfirable. Think about that, an enormous power. Just for a second, take a minute of someone who has enormous power over billions of people on the planet and is unfirable, and one of the richest people on the planet. Oh, that’s problematic. I’m not that liberal, I’m kind of in the center like a lot of people. So even someone in the center can go, “That’s not good.” Just be intelligent and demand that your elected officials put rules of the road in place like they do in Europe and other places.

Worland: Well, speaking of legislators and what they can do and how they can take action, you’ve written about how California has taken stronger steps in particular, stronger than other states. And I think California is a state that tends to regulate more in general. So can you briefly outline what the state’s doing and is it working?

Swisher: Well, it hasn’t initiated it yet. They passed it and then everyone’s been pushing back ever since. And it’s just basically a whole set of rights around your data. I don’t have every one at my fingertips, but it’s a basic privacy bill, where they have to tell you about things, they’ve got to do things. The problem with a lot of these privacy bills is only big companies can comply with them, because they’re so expensive and onerous. So that’s the difficulty, is that it does favor people with lots of lawyers or money to pay for lawyers. It tends to zero out the smaller companies that could be competitors to them. It’s that you have control over your data and how you have control and how you need to be told and what they can use and what third-party data they can sell. They’re very complex things, but we don’t have a national one like they do in other countries. The question is, is it going to have as much teeth as the California one? So California voted it in and then it took a certain time to implement and they’re still working out various parts of it, just like they are with AB5, which is another bill in California around what an employee is. A lot of this stuff tends to start in states like California. But there’s privacy bills in lots of states, I think it’s 11 or 12 or 13 states. The issue is there’s too many in too many states, which is too confusing for any small company to be able to comply with, and therefore there should be a national privacy bill.

Worland: I guess even as you’re pushing social media companies to be more accountable for the content on their platforms, aside from what you’ve already said, what do you think the average social media user can do to take advantage of the opportunities the platforms present? So even while using Facebook, is there anything somebody can do to protect themselves?

Swisher: Well, for your basics, two-factor authentication, although it’s pretty easy to hack you. So I think a couple things you can do is you don’t share things if you don’t know them to be true, right? You don’t deal with people you don’t know to be actual people. You push back on bad information when one of your friends shares it. You watch out for scams and phishing. That can happen not just on these sites, but on email and texts now and everything else. Jeff Bezos was tricked by a text, because it came from the Prince of Saudi Arabia, but he wasn’t expecting him to send him a malware thing. But you have to watch what you’re getting and give it a smell test instead of just forwarding it and passing it along. It’s a really easy thing to pass along for sure. You get why people do, because it’s so addictive, it’s like creating a slot machine in lots of ways, you share just because you’re sharing.

Worland: So we’ve obviously talked a lot about Facebook and we touched a little bit on other social media platforms, but I want to make sure that we definitely cover the threat of other social media platforms. Do you think platforms like Twitter or TikTok, are they also a threat to democracy?

Swisher: Twitter is just a noisy place the president has seemed to land in, and he uses it really effectively. I’ve written about this a lot. He uses it as a campaign, a way to cudgel people, to spread misinformation, that’s what he spends a lot of his time doing. And hatred, that’s his basic … He’s the greatest troll in history. And so, yeah, that’s dangerous, because he has a voice somewhere that gets people going, especially the media. Twitter is sort of a megaphone for him. It works to get the attention of the media elites and the others, and then it gets spread all over the place. And things spread up to him too, going both directions. So it’s what it is, you know, it’s his megaphone, essentially.

Worland: Well, Twitter has started to label his tweets though, right? So what do you think of that?

Swisher: I think it’s great. When he lies and does something dangerous, you say so. You don’t give him a pass. You don’t give him a hall pass for misusing the platform that everybody else gets thrown off for three or four days or for forever. Why can’t he live by the rules of Twitter? And he is now. I think they’re labeling it. They didn’t take it down, they labeled it. They sometimes take it down when it’s violent, and it’s been violent many times, but mostly they’ve labeled it, is all.

Worland: Well, you called him the biggest troll in history, I think. Do you think that the country needs some kind of hate speech laws, some sort of regulation around hate speech?

Swisher: There’s current laws. There’s lots of laws in that area already. I think they need to be enforced. I think there’s lots of hate crime legislation that has passed over time. Again, he can say it and then they can take it down. I don’t know why that just doesn’t work. I think it’s just that if he violates the rules of these platforms, it gets taken down or labeled. I don’t think that’s the most incredibly egregious thing you can do.

Worland: Do you think they should remove the president from Twitter the way they would other people who violate their standards?

Swisher: I absolutely do. He’d like that not to be the case. And in some cases it is notable for its news. If he can say it on a cable program, he can do it himself somewhere, on White House press releases and stuff like that. I think you amplify it when you let him break rules that other people have to follow. I don’t understand why … Nobody is above the law. It’s just a basic tenet of our country. And so he’s breaking rules on the platform, so he has to live by the rules on the platform. Again, not controversial, but people go crazy when you say he has to do that. That’s why he violates it so many times, to see how many times. They’ll see if anyone will do anything, and they won’t, they won’t do anything. That’s the dirty secret. He knows. That’s why he violates things all the time, just like having his rally at the White House. He knows he’s violating the Hatch Act, he doesn’t care.

Worland: So let’s just say that someone were to post some kind of hate or violence-inciting message on Facebook, what do you think the consequences of that should be?

Swisher: Take them, throw them off, throw them off. They take them down. They take them down. Sometimes it’s two days, sometimes it just asks you to remove a tweet. It just depends. I did one where I linked to a tweet that had someone’s phone number. I didn’t even put their phone number, but I didn’t realize the phone number was in it, because I was being careless, and they said, “Take it down or we’ll take down your account.” I was like, “Of course I will. I don’t want someone’s personal phone number to be here.” And I took it down.

Worland: So you got a warning but you weren’t forced to take a break from the platform or anything?

Swisher: No, because it was a mistake, a clear mistake. I was retweeting someone else, I wasn’t tweeting it myself. Had I done the phone number, I would have known just what I was doing, and then I should have been taken off or given a time out or whatever.

Worland: So that was seen as some sort of doxing or could have potentially been?

Swisher: Yes, yes. I think they have their rules. They have whatever rules they have in place. It’s a good one, not to put people’s personal phone numbers.

Worland: So you’ve also talked a lot about how these tech companies should be seen as media companies, treated as such, behave as such. Do you think we’re anywhere near where they will be acting as media companies and having standards?

Swisher: No.

Worland: No?

Swisher: Why would they if they have a law that says they don’t have to? There’s some good things about Section 230, 100%, which gives them broad immunity, they don’t have to act like a media company. But when you’re acting like a media company, you should get the liability of a media company. I do think that’s the case.

Worland: So for our listeners who don’t know what Section 230 is, can you just explain it?

Swisher: It gives them broad immunity. It was used for a very good reason, to allow these companies to grow without being litigated to death. If things were on their platform that were inaccurate, they couldn’t be sued for it, it wasn’t their fault. Now it’s pretty much an immunity blanket for a lot of things. And in some cases it does allow them to take crap off. If it wasn’t there, they’d take everything off. And so it would squelch speech. What you have to look out for is making a 230 for today versus trying to get rid of it. It’s not so much get rid of it as let’s reform it in a way that speaks to today’s issues.

Worland: Is there anything that we didn’t cover that you think we should mention in the context, particularly of social media’s impact on democracy?

Swisher: Well, I love tech. I think it could be used for good. The original ideas behind a lot of this stuff was for good. Let’s be clear, they were just wanting to unite people. And so that’s a really good thing. So what can we do to make it safer for people to connect versus fight with each other? And what incentives, can you architect it so people do use these platforms for good versus just constant partisan bickering or manipulation or misinformation? At their best, it’s a great way to bring people together. At their worst, it’s what we have now.

Worland: How optimistic are you that that future could be realized?

Swisher: Not at all.

Worland: So how do you keep doing what you do, covering these—

Swisher: I just keep yelling at them. I keep saying, “Fix it.” Because I think it’s fixable. It’s certainly fixable. I know it’s vast, I know it’s hard, I know it’s technical, but you think about the vast amount of information that’s being collected about this world. The amount of information that, imagine it getting in hands that were not quite as good as we might have it right now. As bad as we have it right now, they’re better than a lot of people who could get their hands on it. That’s what really worries me. That’s what keeps me up at night, is who gets their hands on this stuff.

Worland: Thanks for listening. We’ve been talking with tech writer Kara Swisher. In our next episode, we’ll talk to Deen Freelon, an associate professor in the School of Media and Journalism at the University of North Carolina. He’s an expert in analyzing large amounts of social media data. We’ll talk to him about how foreign adversaries are using social media platforms against us to sow division and chaos. Is that a fact? is a production of the News Literacy Project, a non-partisan education nonprofit, helping educators, students, and the general public become news literate so they can be active consumers of news and information and equal and engaged participants in a democracy. Alan C. Miller is our founder and CEO. I’m your host, Darragh Worland. Our executive producer is Mike Webb. Our editor is Timothy Kramer and our theme music is by Eryn Busch. To learn more about the News Literacy Project, go to newslit.org.

