Maria Ressa talks to Wired editor-at-large Steven Levy about his book Facebook: The Inside Story. Levy gives readers a front-row seat to Facebook's transformation from coolest tech kid on the block to nemesis of democracies through the journey of its founder, Mark Zuckerberg. Levy was among the few journalists who had access to Zuckerberg and has interviewed him since his college days.
Maria Ressa: Thank you so much. Hello, I'm Maria Ressa. Thanks for joining us tonight. I am so excited to speak with Steven Levy. He is editor-at-large of Wired magazine, and most recently he published the latest book on Facebook, Facebook: The Inside Story, last year. Before I met him, I read the book he published in 2011 called In the Plex, about Google. He knows a ton about Silicon Valley and what it's like to report on it. Steven, thanks for joining us.
Steven Levy: Thank you. It's my pleasure, as always. Great to talk to you. By the way, it's kind of funny you say the book came out last year. It doesn't seem like last year. It seems like 10 years ago, but it was only February.
Maria Ressa: Yeah, you're right. OK. So it is the most recent book on Facebook. And in order to do this book, you spent a lot of time at Facebook and interviewed Mark Zuckerberg nine times. What was it like?
Steven Levy: I actually was surprised. I went back and counted. I thought I'd talked to him maybe seven times. Not every time was a long, hour-long interview. But, you know, those are nine times where we talked and I had a tape recorder going. And sometimes he was just giving me a preview of some manifesto he was going to drop. So it's been interesting to cover him during the course of his career, not just in the three years that I was working on the Facebook book. I first met him in 2006, and that was before Facebook had even opened up to people beyond students. There wasn't even a News Feed at that point. And at that time, he was an extremely awkward communicator, almost a non-communicator. It was hard to get things out of him. And I learned when I was doing the book that sometimes even when he was addressing his small staff, he would get nervous. He would have to lean against a desk or something because, you know, he was just nervous talking to that small group of people he trusted. And he's grown a lot from that.
But certain things have remained stable. That's why I looked at his childhood, and even talked to his parents, to learn that the basis of his personality is someone who is very dug in and stubborn. He has what I think he really, truly believes are his principles. He does believe in taking in information. People around him sometimes call him a learning machine. But during the course of Facebook growing into a big company – and here's where I guess the "corruption of the ring" comes in – when you're with a big company and you feel that your mission itself is good, you feel that what you do to forward the mission is automatically good, even if it can be harmful. It's tough to move him from those core principles. He can be very dug in.
And I think he's just taken a punch in the past few years. He didn't expect to be vilified the way he has been. There's always been criticism of him, but he's been able to shrug it off. And it's a slow process which is still happening. I think it's maybe reaching a climactic point now, where he's looking at what he's done, and it's taking him a long time to understand what's wrong with it. The last couple of interviews I did with him, I almost consider one long interview. I talked to him twice last year, once in the spring and once in the summer, right here on the Fourth of July. And he was much more candid than I'd seen him be, not only with me but with any journalist. He was, I think, trying to come to grips with that. He still wasn't budging off it, and you could see how dug in he can get. The example you can always cite is his idea that it's okay for politicians to lie and take out ads and buy their way into your news feed with a lie. That's insane. But he sticks to it because he has his own set of rules. What happens is that the results are often something where even a casual observer can say, wait a minute, this is wrong. And at that point, Facebook often has to retreat.
And I think just now, probably in his head, he's coming to grips with the idea that, gee, maybe the reason why things are so wrong isn't that there's one tweak we have to fix, but that there's something fundamentally not right, if not broken. Other people say broken, but he wouldn't, with Facebook. From what I know of him, I think he's grappling with that now.
Maria Ressa: It's so interesting, because you were with Facebook at a period of time when he went from being a hero to a villain. You watched this transition, and you actually had tremendous access to other people around him: Sheryl Sandberg, Monika Bickert. One of the moments that stood out for me in the book was when you looked at how the company dealt with Russian disinformation, and that is news again today as the US walks into elections. You were very categorical: they knew there was Russian disinformation and they took it out. I mean, what did you see then? And how have you seen it evolve today?
Steven Levy: Right. Well, you know, they missed the Russians' deep involvement during the election. They understood that the hack of the DNC [Democratic National Committee] that the Russians did was publicized through Facebook, but they didn't take the steps necessary to look at whether the Russians were involved in the kind of fake news, so to speak – the make-believe articles from make-believe publications – that proliferated on Facebook in 2016. Facebook decided to do nothing about it because their rationalization was that it would be tilting the playing field. It wasn't until a few months after the election that one of their researchers – I describe how that happened, and who the guy was, on a little intelligence team in Facebook's D.C. office – discovered it, and they looked into it and were horrified. They did go to the federal authorities. They did not share that with the public. You could argue, and I believe, that they should have been more transparent about that. I think it was a convenient course for them to take, where they could say, oh, well, we have to keep it secret because, you know, it's national security. But that wasn't the case. There was nothing stopping them from sharing this with the general public. And they look bad for that. Now, actually, the behavior is different, isn't it? Now even Facebook understands that they have to call it out, despite the fact that the government, which they've gone out of their way not to offend, isn't eager to have that narrative publicized.
Maria Ressa: I'll bring it back again to Mark, because he's so fascinating, because he has so much power. You could argue he's the most powerful man in the world, right? I feel like in the Philippines, at least, the two most powerful men that we are trying to hold to account are President Rodrigo Duterte and Mark Zuckerberg. This is a man who is still very young, very, very smart, but very entrenched in his worldview. And he's been the poster boy for the engineering way of breaking things down into solvable problems. But as journalists, we know the world is extremely complex, and rarely can you break down truth, for example, or political systems into steps. How have you seen him grapple with this? You talked about it earlier, that he's grappling with it.
Steven Levy: Yeah, well, you lit on something when you say he's been this way all his life: he's never worked for anyone. Picture that.
Maria Ressa: Exactly.
Steven Levy: He never worked for a company. He never had a job where he had a real boss. When he was a student, he made money taking on contract work. And as we saw in one of the contracts he was working on, he felt free, basically, to screw over the people he was working with, the people who were building a product. They say that he stole it from them, and I think that's overblown. But he definitely did misrepresent the work he was doing in order to work on his own project, which was going to compete with theirs. And when I told him that, I believe he just shifted uncomfortably and said, you know, I don't think you understand. But the evidence for that is clear.
Facebook has these things called Community Standards, and they didn't arrive at them in a methodical way. It was sort of an evolutionary thing: when the people – not Mark – who had to answer emails from users started to realize that misbehavior was happening on the platform, they informally drew up a set of rules. And from that little kernel, which was like one sheet of a Word document, it expanded to something that got up to 40 official pages and thousands of pages of interpretation. Like the Talmud or something. But because this document exists, it's viewed at Facebook as almost the code that runs the thing. So if there's something that looks wrong but the code says, oh, there's nothing violating the code here, they'll let it go. And that's what happened recently, when we saw evidence that was just wrong. These militias were gathering new followers to come in and break up protests in a violent way. People reported it hundreds of times. And the people who got the reports said, gee, there's no violation of our terms here. And I think that happens again and again: we see Facebook saying, hey, no violation, we looked at it, here's our rules. But by everything we can see, that's wrong. And then often Facebook will retreat, like what happened with anti-doxxing. And you've seen it yourself in what they do in these international situations, where people in a very methodical way abuse Facebook, sometimes in ways that violate their terms and sometimes in ways the terms just don't anticipate.
So that sort of indicates there has to be some bigger thinking going on here. There's something wrong. And to this date, that's the step that Facebook hasn't taken. I think it's coming back to bite them.
And here's what's really getting Zuckerberg now: his own employees see this fallacy, and they are not standing for it. That really is something that alarms him, because all along he's been a person who thinks in long horizons. He bought this virtual reality company because he felt, 10 years down the road, this is going to be the next platform everyone uses. And Facebook almost had a death experience by being late to mobile, though they recovered very well; I tell that story. He doesn't want to be caught short again. But in order to stay relevant 5 years from now, 10 years from now, he has to have the best employees. He has to get the best talent and keep the best talent. And now the best talent is embarrassed to work for him.
Maria Ressa: Where do you see this heading, especially with elections coming up in November in the U.S.?
Steven Levy: Oh, it's a scary situation, because if you look at what's circulating on Facebook, there's still a lot of misinformation. Some of it violates their standards, and they're trying to take that down; other things don't violate their standards, but you look at them and say, that's wrong. Basically, the yardstick should be that Facebook is not a major force for divisiveness and misinformation, and Facebook would agree with that statement. But the evidence is overwhelming that it's a factor in that. And they profess to be a data-driven operation, so that's what they have to grapple with.
Maria Ressa: It's like they're not looking at their own data, or the data is geared more towards growth. Can I ask you again, because you've spent so much time with Mark: his ideas of good and evil, does he think in those terms? I talk a lot about how, when we have our values, we have to draw the line where on this side you're good and on this side you're evil. What I've read in your book is about the way he breaks everything down, and his single-minded focus on his goal, which in terms of Facebook was growth, was connecting the world. And it's only recently that he's started to talk about all the dangers of connecting 3.2 billion people. Does he set the line where on this side he's good and on this side he's evil? Has he set that for himself?
Steven Levy: Well, for a long time – they still do this – they treated it as a measurement issue. They say, "Well, really, we're only talking about a small percentage of the things on Facebook which are divisive or possibly harmful." And the things they don't get in time? "Eventually AI will address that."
And so they've thought about it more in the way they always thought about it, when Facebook was a student project and the consequences weren't as dire. It's harder to see things that way when people die. That doesn't happen every day, necessarily, but there have been cases where the consequences of things on Facebook have resulted in real harm to people. And so they have to look at those problems more like a plane crash happened than like someone's feelings got hurt down the dorm.
And I think you can't separate their self-interest from that. I have to tell you also, I enjoy talking to Mark Zuckerberg. When I talk to him, I don't feel like I'm being manipulated. I got to the point with Mark where he was showing who he is. And he's a person who is very single-minded. He wants to do good, but his ambition gets mixed up with that. And that's why I tried, very methodically, to go back and show my readers step by step where that happened.
There's one chapter about growth, and it was a great untold story, I felt, about the way the growth team operated. Now, it wasn't that he brought in someone who said, "Mark, growth has got to be number one." That was always what Zuckerberg wanted to do. Really, Facebook wasn't even two weeks old before he was planning to move it beyond Harvard to knock off the equivalent at Columbia. So growth was always big with him. But he brought on this amazing guy named Chamath Palihapitiya, who was like a madman for growth. And he bulldozed all the norms in order to do it. He had his own cadre of people; he was given the best talent in the company to go in and do crazy things to try to boost growth, things like stepping on privacy issues. And Mark let that happen. That was something where he didn't want to get to the bottom of what that group was doing.
And I think that same impulse led to saying, well, we've got this guy in Washington, this guy Joel Kaplan, and they sort of outsourced a lot of things to give Kaplan a voice. When you think about it, people are sort of waking up to this now and saying, wait a minute, why is the chief guy in charge of Facebook's lobbying and talking to politicians getting a say in what content can stay up and not stay up on Facebook? He's in the loop there, and there's no reason why that person should be in the loop. It should be based solely on what the content is, and not on political considerations.
Maria Ressa: I have to ask you about Sheryl Sandberg. Many people will say, you know, Mark was younger, very driven on the tech, but she controlled a big chunk of the company. What happened?
Steven Levy: All right. Well, you know, I actually am, in some sense, very sympathetic to Sheryl, because to me she exemplifies what people think of as the meritocracy in America.
I mean, she came from relative privilege: a lot of physicians in her family, upper middle class. She went to Harvard, worked hard. She believes that if you work hard enough, anything will come to you. You work hard, you get an A+.
And she worked for the World Bank, did some projects there that helped eradicate disease. She worked for the Treasury; her mentor became the secretary of the Treasury. She goes to Silicon Valley, works for Google, helps build their ad operation, and then takes this step up to somewhat be a mentor to Mark and to also build the ad business at Facebook, their business model. That business model is very much based on the data it has on people, as it was with Google. With Facebook, though, the data is deeper: they know all your behavior, and then they purchase information to know even more. And again, she was convinced that the mission was good.
But I have to say, it was always clear that Mark Zuckerberg was making the big decisions there. What he did was outsource the parts of the business he wasn't so interested in to Sheryl. And when the problems came up on her watch, she didn't call on him to say, "Mark, we've got a big problem. Help us out here," because of the way the company was split: all the engineering, even in the ad world, reported to Mark. And those people, that army that Mark had, had to help Sheryl's people if this was going to be addressed. As a matter of fact, the Russian involvement didn't even come to Mark's attention until the chief security officer, a guy who reported up a chain to Sheryl, skipped over her and wrote to Zuckerberg's lieutenants, saying, "Hey, there's this problem here."
So she missed that. And I know she knows that, and it hurts her. She's in this box now. I think life is probably tough for Sheryl in that sense, and in the professional sense. To be honest, I would be surprised if Sheryl's there in two years.
Maria Ressa: Wow. Can you imagine a world without Facebook? Because I don't think I can. Like, how do you see Facebook now shifting to this new world that they've helped create?
Steven Levy: Well, you know, Mark made some very smart moves – which a lot of state attorneys general, Department of Justice attorneys, and people in Europe are looking at now – in buying two other companies that he's now trying to integrate more closely into Facebook: obviously, Instagram and WhatsApp. To be honest, I think the future of those companies is more promising than Facebook's. I think the vision would be that Facebook the company would be this multi-headed operation where you would be a customer of the corporation and its various components; it would be part of your online life. But nothing is forever, really. And it's unclear whether in 20 years we'll be talking about Facebook the way we do now, or whether its power will be what it is. I don't think someone who is 20 years old will be growing old with anything like the Facebook that we know now.
Maria Ressa: Let me ask you about Google and Facebook together, given the books that you've written. Harvard professor Shoshana Zuboff wrote about surveillance capitalism: how the technology has taken our data, created markets for predicting behavior, and is selling this in advertising. How do you put all of this in context? There's been a blowback against Facebook, but Google hasn't really been caught up in that. What is Google doing well? And in terms of surveillance capitalism, being inside, do you see this changing? Do you see the business model changing to the point where technology will be a force for good?
Steven Levy: Well, it's interesting. Just a couple of weeks ago, I wrote an afterword for the Google book, because it's going to come out in paperback on the 10th anniversary of that book. At the time I wrote it, you didn't think much about Google and privacy issues. You didn't think of it in the context of what we're now calling surveillance capitalism. I think in the 10 years since, forces have moved Google in that direction. I mean, one simple thing is that when our technology is in our pocket, you don't sit there and look through those 10 blue links when you ask a question. You need an instant answer. And in order to give you that perfect instant answer, Google has to know a lot about you and your habits. And it's very enticing to say, hey, if you give us this information, we can give you the perfect answer.
And in a way, if in fact you have a choice – that's the key – that might be worth it. I think the problem is that we're asked to give up our information, and places get hold of our information even without our knowledge, without our being informed of what the tradeoff is. What Apple just did is basically say that in the next version of their operating system, they're going to tell people: hey, this app wants to get this information. Do you want to make that tradeoff? And people will say yes or no. And Facebook says that's going to cut into their ad revenue. Well, guess what? It should, because people should be making informed choices. And I lay a lot of the blame for surveillance capitalism on the fact that it's legal to do it. I don't think it should be. I think we should have strong laws to protect people. That is to say, if this information is going to be in the hands of these companies, you've got to know what's going on and make an informed choice. And you should have choices. It's a failure of our institutions that this has been allowed to happen.
Maria Ressa: Fantastic. You know, I remember when I read In the Plex, it's really the wonder of it; that's what your book awakened in me. And from the beginning with Facebook, Rappler is a fact-checking partner. We work very closely with Facebook, and I keep saying we're frenemies.
The last question I have is really a question I struggle with myself, which is: given where we are today, given the failure of legislation, given that journalism is in trouble and we are at the front lines when it's about facts, what can we do? Because I am also convinced that we need tech. What can we do to help them help themselves – enlightened self-interest? What can we do with them?
Steven Levy: I think that's a great question. I think we have to support places like Rappler that do serious journalism, that try to sift the truth from misinformation, that get to the bottom of things and don't spread the message of misinformation. Call it out when you see it. Exercise your right to vote. Have people in office who support the truth, who run governments based on the truth and don't try to deceive the people they work for. Sometimes that's not so easy in some places, as you may know. And you're absolutely correct that we have a role in this.
And in addition, we talked about Zuckerberg; you mentioned him as the most powerful person. He's uncomfortable, in some sense, with that, and he's trying to set up this oversight board to take some of the heat off. But ultimately, this is his. And I feel that if it wasn't Mark Zuckerberg, there would be a different Facebook. These kinds of companies are working in a winner-take-all ecosystem. It's not an accident that one search engine dominates, that one social network dominates.
That's something that we've allowed to happen. And if the nature of it is that it has to be that way, then we need to figure out ways to regulate it so it's not abusive. So it is up to us to put pressure on regulators to do that, but also to, you know, support the places that are going to circulate the truth. A lot of people just say, I'm going to delete Facebook, and they do. And, you know, we miss them. It's good to have one place where all your friends are. But if that's what you feel you have to do, don't look back.
Maria Ressa: It's hard. I mean, definitely in the Philippines, where so many are; we have a Filipino diaspora. We live on Facebook. Everyone who is on the Internet is on Facebook. I know I said last question, but this truly is the last question, given elections in the United States. People are going to vote. But given that facts are debatable, can you have integrity of elections without the facts?
Steven Levy: I don't think that's a great situation. I mean, really, if the election winds up being about whether we're going to choose a system where facts matter or facts don't matter, well, it's even bigger than a leader, isn't it? Things don't work out great when you throw out the facts; history teaches you that. And even leaders who scramble to the top based on lies wind up getting a comeuppance eventually. So I think I'm on the side of truth. That's my side.
Maria Ressa: Thank you so much. Guys, you've got to read Facebook: The Inside Story. Steven Levy, thanks so much. Always a pleasure. – Rappler.com