Rappler CEO Maria Ressa talks to author, professor, and former journalist Anya Schiffrin, who comes down hard on Facebook for abdicating its responsibility in curbing disinformation, policing election messaging, and regulating political advertising.
Schiffrin is the director of the Technology, Media, and Communications specialization at Columbia University. She teaches courses on media, development, and innovation. Schiffrin, who spent 10 years working overseas as a journalist in Europe and Asia, also writes about journalism.
Maria Ressa: Hello and welcome. I'm Maria Ressa. Thank you for joining us. I'm really excited to speak with Anya Schiffrin, someone who has written so much and hopefully will help us find a solution to the problems that we're dealing with today. She is the director of the Technology, Media, and Communications specialization at the School of International and Public Affairs at Columbia University. Anya, it is so good to see you.
Anya Schiffrin: Yes, I'm so honored to be with you. Thank you very much for contacting me and for caring about this incredibly important problem.
Maria Ressa: It's the biggest thing we're going to have to solve next – to make sure the next decade moves well, right? So Facebook today came out with an announcement, its effort to try to deal with the elections. What did you think of it?
Anya Schiffrin: Well, Facebook, of course, is a major danger to democracy all over the world at this point. And I think many of us have been asking them to do something about election messaging and political advertising for years. Sadly, in the United States, we have a completely paralyzed system at the moment. We have a lot of dark money and campaign finance abuses. We have a Federal Election Commission that is totally paralyzed, and we don't have basic laws that everybody else has. We don't have, you know, mail-in voting or voting on weekends. In many parts of the country, there's been a systematic attempt to disenfranchise people.
So for Facebook to kind of sit back and wait for laws is a totally disingenuous position. Some of us have been saying for a really long time that they should be better than the laws. There are things that they should have been doing for years, and there are great examples all over the world that they could copy if they wanted to – European countries that haven't allowed paid political advertising for years and years, that require disclosure of who paid for the ads, that don't allow false speech or hate speech, and that have laws and courts that make these decisions.
So Facebook has just taken advantage of this legal vacuum in our country to go ahead and really further poison an already polarized discourse. The weirdest thing is, I don't even understand why Facebook takes election advertising, because they don't even make that much money off it and everybody hates them so much. So I think what they announced today is a step in the right direction, but there is much more that could be done.
Maria Ressa: It feels, in a way, like a finger in the dam, right? I mean, it's gone, it's coming down. What do you think Facebook should be doing walking into the November elections?
Anya Schiffrin: Absolutely. So I'm really happy to say that in the last few months, so many excellent reports and recommendations have come out. Nathalie Marechal and Rebecca MacKinnon have done a report on why we should ban micro-targeting of political advertising. Rebekah Tromble and a group of colleagues in Europe have put out a report called Virtual Insanity, where they talk about the kinds of disclosure that could be done. Laura Edelson at NYU and many others have been looking closely at the ad libraries and talking about how those could be strengthened. So far more disclosure needs to happen, and privacy protections too. Ann Ravel, who was a commissioner at the Federal Election Commission, has just written an op-ed on how we could expand the Honest Ads Act to broaden definitions of what is considered political messaging. We've seen that whole Stop Hate campaign, which the National Association for the Advancement of Colored People (NAACP) and many others are part of, to pull ads. And my own paper on this is coming out in a couple of weeks with the Roosevelt Institute.
I'm actually going even further than all of these excellent ideas, to what I call "beyond disclosure." So I think that disclosure is great. But then, as Joan Donovan says, Facebook discloses and then the rest of us clean up their mess. And I find that ridiculous. You know, Facebook barely gives out any information. Then they start something called CrowdTangle. Then all the journalists and all the academics start going through the data for Facebook, which Facebook probably has anyway, and say, well, you should do this or you should do that.
So what I'd like to see the platforms do is commit to a voluntary fairness doctrine where, from the get-go, they say: you know what, we're not going to profit off this stuff. "You have no right to my rally," as Jack Dorsey famously said a couple of years ago. What we're going to do is have a news feed of quality information on the topics that matter – and that can be debates and town halls provided by the campaigns, by fair commentators and respected media – and just do away with all this garbage out there.
And, you know, the platforms like to say, "we're platforms, not publishers," but that's just not true. I mean, that went away a long time ago. So why not embrace it and actually put out a quality news feed? They've already started doing those election cards, which are actually fantastic. I myself learned from Facebook that my polling place had changed last spring, which I had not realized. So let's just take that a step further and put out some quality news.
Then, of course, the other exciting development has been Australia and France requiring Facebook and Google to start paying for the news that they use. And of course, now they're lobbying and trying to get out of it. But they could do that, too. These companies are so proud of themselves; they think they've done so much for the world. Well, then do something for the world. Let's do some more things for the world. There's tons of stuff that they could step up and do. And again, why ban political ads a week before the elections? Why not ban them permanently, or two months before?
Maria Ressa: Right. And again, it's been four years. I mean, you look at Black Lives Matter four years ago versus today. This Tuesday, they talked about Russian networks getting pulled down again. From where we sit, we haven't seen any real significant change. And it certainly seems to have pulled societies further apart, created the "us against them," made facts debatable. If you don't have the facts, can you actually have integrity of elections?
Anya Schiffrin: No, you cannot have – not just elections – you cannot have a functioning democracy without baseline agreement on facts. And one of the things I've been doing this summer is writing about the 1930s, when propaganda was considered to be a huge, huge problem. And it actually was, because we saw the rise of the Nazis, we saw Stalin. And it's very, very clear – Hannah Arendt and many people have written and talked about this – that if you don't have baseline facts, if you don't have baseline understanding, your society will die. And now we've got COVID. What could be more clear than the fact that my aunt is getting three or four WhatsApp messages every single day saying it's a conspiracy, you can't get the vaccine, you can't get checked? I mean, this stuff is garbage. Trust has decayed to this point, and the articles about all of this sort of say, "Oh, it's terrible, there's all this polarization, there's all this disinformation." But we also know, to a large extent through work like yours, who is actually putting this out there. It's not some coincidence. I don't want to sound like a conspiracy theorist, but we also have to understand very clearly: who are the people putting this out there? How are they profiting? And again – I know there are academic arguments about labeling and whether that makes sense – the very minimum is to put a little border around this garbage on WhatsApp that says, hey, this started from whoever it started from. So that at least people know where it came from, because once it's been repurposed and recycled and passed along, nobody has a clue.
Maria Ressa: In terms of how the content and the technology and the targeting work – all of it is a behavior modification system, though we don't really label it that way. That book by Shoshana Zuboff last year was Surveillance Capitalism. It is designed this way, and Facebook, YouTube, Twitter – they know this. So what is preventing them from actually acting on it? Because I was much more optimistic two or four years ago. I was optimistic because these platforms moved so quickly to pivot to mobile – it took them only two years – and now it's four years later. So what is it? Is it just money? Is it a lack of understanding of the impact on democracy?
Anya Schiffrin: So while I was writing my PhD dissertation on how to fix the problem of online mis- and disinformation, nobody from Facebook gave me one interview. It was remarkable. I have spent hours and hours and hours with regulators in Brussels and France; I've gone back to them over a period of years. Nobody at Facebook would ever speak to me. And I tried everything, including, by the way, friends of friends saying, "Oh, they're so nice, so-and-so will talk to you." They just never would. So I don't have any insight.
But I think that what's happened at Facebook is a combination of a whole bunch of toxic things. And, you know, I speculate as to their motivations in my dissertation. I think, above all, they wanted to avoid regulation. So they started putting forward this idea that the solutions were things like funding fact-checking, funding media literacy training. It's easy: you pay for that, and hopefully nobody will regulate you. So I think it's partly greed, and I think it's ideology. And I think that they conveniently think everybody does this. It's like the old saying: if you have a hammer, everything looks like a nail. They were so committed to the idea that Facebook was great, that Facebook could solve problems, that social media would connect everybody, that they simply couldn't see they were actually spreading hatred and killing around the world. And then once they saw it, they couldn't really handle it, and a lot of them, like Zuckerberg, seem to have just doubled down. So the more you try to talk to them, the more you tell them, the more they don't seem able to absorb new information. And that's obviously a sign of something wrong, because smart people learn the facts and adapt their ideas, but they keep bringing out these cliches about the marketplace of ideas. Well, you know, my husband, Joseph Stiglitz, has written very clearly about how it's not a real marketplace. So has Zeynep Tufekci, so has Tim Wu – they've explained for a long time why this isn't a real marketplace of ideas, and they just persist. So I think they've just become entrenched. And, you know, I'm not their psychologist, so I don't actually know, but I will say there's clearly internal dissent; it's just a shame it doesn't seem to have reached the top. I think it's convenient for them to stick to their guns because they're making a lot of money and they keep making money.
Maria Ressa: And the market incentives actually don't discourage them – in fact, they encourage them to follow this path. So in your writing, in your dissertation, in your study – how do we solve this?
Anya Schiffrin: By the time I finished my three years of research and writing, I came to the view that all of the things people are trying are going to have to be tried. So, yes, there's a role for the fact-checking and the media literacy and the community engagement. But there's really a big role for regulation. And that will have to be on so many levels: things like banning micro-targeting, things like forcing disclosure of algorithms, as in Britain and France, things like privacy protections. And countries like Germany, that have laws making the platforms liable, or even some of the hate speech laws – I think those countries should be allowed to have them.
I mean, I know we're in a tricky position: if Brussels passes regulations, oh God, everybody worries that China, Vietnam, or Singapore is going to copy them. But those are countries that were censoring and controlling media anyway. So if democracies want to pass regulations, they're going to have to do it. Ditto all the copyright work, which is on the other side of the equation – provisioning news and providing for a supply of quality news. All those tech companies have to pony up money, and they should obviously be taxed. So I have a lot of ideas. I wish I could say there was one big grand idea. I don't. But there are dozens of things that need to be done straight away – and also, as people like former UN Special Rapporteur David Kaye keep reminding us, in line with UN international principles on freedom of expression.
Maria Ressa: Let's not even go to the Facebook oversight board.
Anya Schiffrin: Let's not.
Maria Ressa: There were several books that I thought were fascinating – Yochai Benkler's, looking at the role of not just social media but how media took the lies and amplified them, and Kathleen Hall Jamieson's book Cyberwar said very similar things, but looking at the narratives, right? Media's role. You've talked a lot about social media, but it has definitely impacted journalists and news organizations. How do you see our roles evolving, and where do they need to go?
Anya Schiffrin: Gotcha. Well, I think that journalists have done a really good job of calling out the lies, of being skeptical, and of providing quality information. So I wake up every day in admiration of people like you, your colleagues at the TV station that was forced to close in the Philippines, our colleagues in India who are struggling so much right now, and those in the US facing threats against journalists. So I think the journalists have done a fantastic job of reporting what's going on and really staying with it. I am not in the business of bashing journalists. I also think that in the US they've done a really good job of calling out the lies and now saying, actually, "Trump said this, but that's not true." So I think that's really important for the future. As you know, all of this is being done while journalists are facing financial ruin as well as physical threat. So I think the main thing we have to do straight away is get more funding to local and quality journalism around the world. Countries that have an Australian Broadcasting Corporation or a BBC should thank their lucky stars – and, sure, criticism is great, but you don't know what you've got till it's gone. I think we need to see a lot of things like some of the stuff Canada's doing, and what Victor Pickard talks about: vouchers and tax credits for media outlets.
So I think we need to see a massive influx of funding. And I think that journalists will, you know, keep doing what they're doing now. I have edited three volumes on the problem of media capture, and that is, of course, an extremely serious problem in many, many places in this world, and a growing one, where what you have are oligarchs or autocrats combining forces. Turkey, Poland, and Hungary are all examples where the media has been put in the service of the state and of rich people with interests. Dare I say, this is probably the problem in the Philippines. So I do think media capture is a terrible problem. And it hurts me when my students – say they're from Brazil – tell me: oh, the media's terrible, they do a horrible job, they're so corrupt, they're just working for the rich. Yes, that is a problem, and we have to do something about it. And again, there's lots of regulation and work that's been done on this. But I don't think we can let that distract from the real problem, which is that we need far more funding for quality information. And a lot of that is going to have to come from the tech companies.
Maria Ressa: So I agree with everything. We talked about media; let's talk about the people – the people who consume the news, the people who are manipulated on the platforms, the people who are micro-targeted. I mean, the goal of influence operations is to change the way people think and ultimately the way they act, right? And that's happened in the last four years. So how do we, in a democracy – you have Portland, you have the things that have happened in the United States, which are shocking to the rest of the world as well – how will people change what they've been, I'm going to say like Pavlov's dogs, conditioned or led to believe?
Anya Schiffrin: Yeah. So it's a three-part problem, right? I always look at the supply and the demand – the demand is the audiences, and the supply is the garbage that Fox or the tech companies are putting out. And we have to address both sides of this problem. We know perfectly well that in India, Facebook executives who are aligned with Modi refused to stop incitement from going out. We've seen examples of this over and over again. The Wall Street Journal had that article about how, within Facebook, there was a whole unit that was looking at not recommending all this polarizing material – and it got shut down. So a huge problem, we know, is that the recommendations and the algorithms – YouTube and all these places – are sending people down the rabbit hole. That's something that has to stop straight away.
Anya Schiffrin: Then the other part is, of course, loads of funding for media literacy training. People need to understand – in the schools, they have to be taught what's true and what's not true. And again, so often it comes back to the journalists: around the world, all these journalists were volunteering to go into classrooms and teach kids how to understand the news. But that should be more systematized, obviously. I've actually got a paper on media literacy coming out this fall – Peter Colin Jones wrote most of it, and I've got a bit of a contribution – especially looking at the Global South. And then the third point: you know, I'm married to an economist, so obviously there are economic and societal roots to all of this. One key point is that when you have young men with no jobs and weapons, they're going to get up to trouble, and I don't care what country you're looking at. These terrifying militias are showing up right now and beating people up – one of my college buddies was out with the Wall of Moms every night in Portland, where the militias are showing up with weapons, and the police in some places are just giving them bottles of water and thanking them. So that core army of young men with weapons: A, we shouldn't have guns like that in the hands of people to begin with, but that's a whole other story in our country; and B, those people should all be given jobs straight away. I was so touched by the prime minister of New Zealand, who says that every kid under 30 should be volunteering, taking care of somebody, in school, or working.
And we need a massive jobs corps to reach these kids, whether it's in the inner cities of New York, where I live, or out in the red states, where a lot of manufacturing and mining jobs have dried up and folks are taking drugs. They need jobs, and they need to spend less time getting riled up on social media and looking at YouTube, and more time out doing something to help society. Absolutely.
Maria Ressa: You sound like a voice of reason. Last question: what makes you optimistic that this is all going to happen in time for free and fair elections in the United States? I have a vested interest in this happening quickly. I see a runway of maybe a year or so. So what makes you want to try?
Anya Schiffrin: I just want to explain that I come from two generations of refugees on both sides of my family. My family on my dad's side left Russia in 1917 and left France in 1941, after the Nazis arrived. And my family in Spain were a military family; my grandfather fought for the republic against the fascists and couldn't leave, and they left in 1939. So both of my parents and both sets of my grandparents were refugees – and on my grandfather's side twice, and the second time was a lot of fun, I can tell you, because he was older. So I think I'm probably hyper-aware of the dangers of what used to be called propaganda – what we now call disinformation and misinformation – and the dangers it poses to society and to democracies. I'm completely aware, and have been really since 2016. This is a key personal thing I worry about. So I'm worried, extremely worried.
The one thing that gives me optimism – well, there are two things. One is that in the time I've been studying this problem, I have seen so many good solutions and good recommendations coming out from academia, from think tanks, from civil society. Four years ago, everybody threw up their hands and said, well, we don't want state censorship, we don't want corporate censorship, oh gee, what are we going to do? You know, they played into this, right? They would show up at the hearings, and Congress does not know what it's doing – a bunch of old people, they don't get it. Which is, by the way, what everybody always says. That's what the IMF used to say: oh, this is complicated. You hide behind the technical stuff so that you diminish and cut out everybody else.
So there's plenty of know-how, plenty of knowledge, plenty of solutions now, and we need to implement them. So it's the political will. And then the second cause for optimism is obviously our young people. I mean, living in New York City during the COVID pandemic and watching my students every single day take care of each other, take care of the sick, send masks, go out and do volunteer work, march, speak up, and criticize all of us about Black Lives Matter – I just think the next generation are incredible people. And I apologize all the time that we didn't fix all these problems. But absolutely, I think they are capable, and I hope that they will.
Maria Ressa: And I hope you're right. Thank you. Thank you so much.
Anya Schiffrin: Thank you so much. Really an honor to have all this time to talk to you, Maria.