Disinformation

Rappler Talk: Sinan Aral and social media’s Hype Machine

Rappler.com
Rappler CEO Maria Ressa talks to professor, author, and entrepreneur Sinan Aral on how social media disrupts economics, the results of elections, and our lives

Rappler CEO and Executive Editor Maria Ressa talks to professor, author, and entrepreneur Sinan Aral on how social media affects the choices we make – disrupting economics, the results of elections, and our lives.

Aral is the author of The Hype Machine, in which he lays out critical ideas on how Americans can protect themselves in the 2020 election and beyond. In the book, he exposes how the tech behind social networking sites like Facebook hopes to change the way its users think and act.

Aral is the David Austin Professor of Management, IT, Marketing, and Data Science at the Massachusetts Institute of Technology (MIT), and Director of the MIT Initiative on the Digital Economy (IDE).

TRANSCRIPT

Maria Ressa: Hello and welcome. I’m Maria Ressa. We are live and, you know, we’re going to be talking about one of the topics that has obsessed me for the last four years, since Rappler came under attack. And we’re talking to a man who has spent decades studying this. Sinan Aral is MIT’s director of the Digital – and here I’m gonna get it right – Initiative on the Digital Economy. Sinan, thanks for joining us.

Sinan Aral: Thank you. Great to see you. 

Maria Ressa: So, of course, I forgot to say this, but you are also not just a professor – you’re an entrepreneur. But more than that, your book, The Hype Machine, is releasing on September 15, Sinan. And I’m so excited for this.

Sinan Aral: Yes. It’s been 20 years of research and four years of writing. I’m so excited to share it with the world and to hopefully move this conversation forward to a much better place than right now.

Maria Ressa: So one of the things that I’ve talked about on and off forever for the last four years has been how Facebook, which is essentially our internet in the Philippines – how this has helped the rise of these digital authoritarians, populist authoritarians, all around the world. How do you treat this in your book?

Sinan Aral: Well, you know, this is unprecedented. So about 10 years ago, all we had to connect humanity digitally was the phone, the fax machine, and email. And in the span of about 10 years, we essentially threw high-octane gasoline on human beings’ need to communicate and coordinate with each other on a global scale. And what I call the Hype Machine, the social media industrial complex, essentially blanketed the planet.

This brings everyone into a one-to-one, one-to-many, and many-to-many conversation online. And it is completely directed by algorithms that not only steer the evolution of the human social network, but also the flow of information through it. So in terms of its role in civil society – whether used by authoritarian governments or democratic governments, as well as by social movements in society and political campaigns – it really is the information ecosystem that determines our reality. And when you think about that for just a split second, you realize how powerful it is. Given that that is what it is – whether it’s a political campaign or an authoritarian government or a liberal democracy or a small business or an online news website, a community like Rappler – it really is the conversation.

Maria Ressa: The problem, though, is that, given that technology, right, you know, these technologies have now become a behavior modification system which could sap – some people are saying this; Shoshana Zuboff says that it takes away free will, essentially. And in that instance, if it’s doing that, how can we have democracy?

Sinan Aral: So this is an incredibly important point and one that I describe comprehensively in the book from first principles, as a scientist. So I begin with the neuroscience of social media – how it affects our brain and how that, the neuroscience of it all, translates to behavior change. And then I get into the notion of scaling mass persuasion, which is really the goal of the Hype Machine, of social media, because it’s built on a business model of advertising, which is designed to persuade people in the realm of products and services that they should be buying. But as an advertising persuasion machine, it also is available to political campaigns, to governments, and so on. And we’ve done very large-scale studies, with tens and hundreds of millions of people in them, to study how it affects behavior change. And it does affect behavior change.

And so then the question becomes – I’m not somebody who believes that social media eliminates free will. But I am somebody who believes that it has tremendous persuasive power. And it’s driven by algorithms that have specific objectives which are tied to the maximization of profit for the platforms. And all of these realities raise important questions about how we are going to adjust what I call the money, code, norms, and laws – the four levers we have to guide the evolution of social media to a place where we can avoid authoritarian governments. We can avoid genocidal propaganda. We can avoid the incredibly dramatic rise and distribution of falsity and false news in a way that alters our reality. These are incredibly important problems that we must address. And the point of the book is to discuss how we can address them.

Maria Ressa: You know, one of the most – what I felt was a brilliant graphic that you had in the book, and you encapsulated all of that in it. Let me pull that graphic up. And if you look at this, you describe the personalized mass persuasion, which is the trend at the right. And then you describe the levers – money, code, norms, and laws. But what I haven’t heard many people write about or discuss is really this Technology Trifecta, which is the process of how this controls us. Or how it could. Could you go through this? And then, you know, go through the levers and how they could impact that.

Sinan Aral: Absolutely. So this graphic encapsulates the layout of the book, in a sense. And in the middle, you have the Technology Trifecta, the three technologies that came together to create the Hype Machine that we have today – and that’s digital social networks like Facebook, Twitter, LinkedIn, WeChat, WhatsApp – and then the medium, which is the smartphone. And the important thing about the smartphone is it’s always on and always in our pocket. So it does two things: it gives us constant notifications, which change our behavior, and it collects constant data about what we’re doing, where we are, who we’re messaging, and so on. And then finally, what I call the Hype Loop, which is the central process of the Hype Machine. It is the dynamic interplay of machine intelligence and human intelligence, where the machine intelligence is algorithms designed to maximize engagement in order to maximize profit, and then human beings – our behavior and thinking – are affected by these algorithms: feed algorithms, friend suggestion algorithms, and so on. So that’s the technology that exists now.

The rise of this Trifecta, which created the Hype Machine, creates three important trends, each of which gets its own chapter in the book. One is the rise of personalized mass persuasion. So if you can think of television as mass persuasion, you can think of the Hype Machine as personalized, population-scale mass persuasion, where persuasive messages can be targeted at the individual level, but at population scale at the same time. What are the consequences of that and how do we deal with it?

The second major trend is hyper-socialization. So the Hype Machine essentially injects the opinions of our friends, family, and the crowd into every decision that we make in the world. And if you read the wisdom-of-crowds literature, what you find is that the wisdom of crowds is based on independent thinking. What the Hype Machine does is make our thinking completely interdependent – not all of us with all of the rest of us, but in these tightly-knit clusters of polarized communities that are very different from each other. Which is why we see affective polarization on the rise around the world, where political parties hate each other and different groups are witnessing very different realities.

And then the final trend is the tyranny of trends, which is this hyping up of everything. And the reason why this happens is because the platforms take things that are trending and amplify them. So algorithmic amplification of ideas, thoughts, and behavior makes everything go viral much faster than it would. And in fact, we know through decade-long studies that we published in places like Science that falsity travels farther, faster, deeper, and more broadly than the truth on social media and on the Hype Machine, as a result of all of this.

Finally, we’ve got four levers that control the Hype Machine – and I go into each one of them in detail throughout the book – in order to fix the social media morass that we find ourselves in, and that is money, code, norms, and laws. Money is the business models and the incentives created by those business models. Code is the design of the platforms and the algorithms. Norms are how we as a society choose to use the technology. And laws, of course, are the regulation around free speech, privacy, hate speech, antitrust, and the like.

Maria Ressa: It’s so fascinating – just going through the book when you do this. But let me just pull out two strands. The hyper-socialization that you talked about in the book – my way of saying it is, it’s like an “us against them” mentality built into the design of the platform. You’re playing to the crowd and you bring your tribe closer. In this situation, because it’s built into the platforms’ design, our leaders who use “us against them” – messages that use “us against them” and tribalism – they are designed to spread faster and further. Correct?

Sinan Aral: Yes. So two important design elements really are amplifying this. One is that, as I describe in the book, a simple engineering shortcut is evolving the actual human social network in polarizing ways. What do I mean? Well, friend suggestion algorithms, which are largely responsible for the connections that we make on LinkedIn, Facebook, Twitter, and the rest – it’s a hard problem. I need to find people to recommend to you to connect with. It’s difficult to search the billions of people on social media to recommend someone to you. So an easy engineering shortcut, to make those algorithms run more smoothly and efficiently, is just to look at your friends of friends. But if I recommend to you friends of friends, or I disproportionately recommend friends of friends, it’s going to enhance and amplify the clustering of society into these tightly-knit communities that are so far apart from each other but much more densely connected within. That, combined with the business model and the algorithmic amplification of engagement – where I am trying to get people to share a lot, to comment a lot, and to just get riled up, which is why I call it the Hype Machine – means algorithms favor salacious content, false content, you know, things that are blood-boiling. We find that false content, in a 10-year study of Twitter, creates reactions of much more surprise, anger, and disgust. And that is correlated with the much faster diffusion of falsity compared to truth. So these types of engineering elements of the way the machine is built are contributing to the society that we see.
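
To make the friends-of-friends shortcut Aral describes concrete, here is a minimal sketch in Python. The toy graph, the names, and the mutual-friend scoring are illustrative assumptions, not any platform's actual recommendation code.

from collections import Counter

# Hypothetical follow graph: user -> set of friends (purely illustrative data).
graph = {
    "ana":   {"ben", "carla"},
    "ben":   {"ana", "carla", "dino"},
    "carla": {"ana", "ben", "eva"},
    "dino":  {"ben"},
    "eva":   {"carla"},
}

def friends_of_friends(user, graph, top_k=3):
    """Recommend accounts two hops away, ranked by mutual-friend count.

    This is the cheap engineering shortcut described above: instead of
    searching the whole network, only candidates reachable through an
    existing friend are considered.
    """
    friends = graph.get(user, set())
    candidates = Counter()
    for friend in friends:
        for fof in graph.get(friend, set()):
            if fof != user and fof not in friends:
                candidates[fof] += 1  # score = number of mutual friends
    return [name for name, _ in candidates.most_common(top_k)]

print(friends_of_friends("ana", graph))  # e.g. ['dino', 'eva'] (tie order may vary)

Because every candidate already shares a friend with the user, recommendations like these keep adding connections inside existing clusters rather than across them, which is the polarizing dynamic Aral points to.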

Maria Ressa: Let me just ask about two stats that you did have, which are in the Science article. Lies spread six times, seven times faster than facts.

Sinan Aral: Yes, on average. That’s correct. In different categories of information, false news spreads further, faster, deeper, and more broadly than the truth – in every category of information – and political false news is the fastest, the most viral. It spreads much faster than the truth in terms of politics.

Maria Ressa: So lies laced with anger and hate spread faster and further. Is it fair to say that the social media platforms are biased against facts?

Sinan Aral: Well, I think it’s fair to say that the outcome of a machine that is designed the way it is, combined with the evolution of the human brain as it has evolved and the neuroscience that runs inside our brains as we use social media, combine to create that reality – that bias for salacious, anger-inducing, disgusting falsity spreading so much farther, faster, deeper, and more broadly than the truth on these social media platforms. And so there’s an entire chapter in the book called “Your Brain on Social Media,” which talks about the neuroscience of it all and describes why we’re so susceptible to it. I also describe the science of false news, which has a section about our susceptibility to falsity in terms of our neuroscience. All of that contributes to the bias.

Maria Ressa: And again, what’s so interesting when you’re reading the book is that it really takes the weaknesses of human nature and turns them on us, so that you become addicted. So, you know, I guess we know the problem. There’s been a lot written about it.

If you guys haven’t watched it, also just out now is The Social Dilemma on Netflix. Sinan tells you exactly why. Right. So what do we do? What can we do? And, you know, for a country where 100% of Filipinos on the internet are on Facebook, how do we save ourselves?

Sinan Aral: Well, you know, there has been a wave of techno-utopianism in books and movies, you know, kind of 10 years ago, followed by a wave of techno-dystopianism in the most recent time – about how it was going to destroy the world, delete Facebook, you know, all of these claims that the sky is falling. And what I do in this book is, one, look under the hood and really talk about the science of how it works and what’s actually known through science about how social media is affecting our world. And two, really get beyond this techno-utopianism or techno-dystopianism and say, if we roll up our sleeves, what can we do? So the entire last chapter, which is the longest chapter in the book, is dedicated to how we fix these problems. And there are a number of solutions that begin with regulation of competition, but then go on to data, Russia, false news, election integrity. Each one of these gets its own section, and there is no silver bullet. We need to create a community of design ethicists and scholars and politicians and journalists that are on the side of good, that are working together to affect the money, the code, the norms, and the laws together. No one action is going to solve the problem that we have. But I have very real, concrete solutions in that last chapter that give detail on each of those subjects.

Maria Ressa: You also talked about these clusters, right? And, you know, for us, we’ve heard of six degrees of separation in the physical world. And what’s so fascinating is to see that on Facebook, for example, you have 3.75 – is that right? something like that – degrees of separation. You described it for us earlier, but, Sinan, it was very geeky. So tell us why. Why does it matter that we’ve been clustering together tighter?

Sinan Aral: Well, just take it from the perspective of two warring factions in a country. And I say that vaguely because it applies to any country in which social media is so widely used. In the United States, we have Republicans and Democrats; in other countries there are other political parties or social groups. Well, let’s say that there is a reasonable mixing of people in general, that you have some friends that are from one side and some friends from the other. What that creates is common ground. You see the perspective of both sides. You see the facts that both sides see, and it creates common ground. And in negotiations, common ground is essential to getting positive negotiated or mediated outcomes and to having civil society.

But if you have a machine whose algorithms essentially cluster the groups further and further apart, into tightly-knit communities that are themselves very closely connected but without a lot of connections across the groups, you lose that common ground entirely. And in fact, if on top of that you have algorithms that are feeding different sets of information to these two different groups, then not only are they going to lose their common ground, but they’re also going to have different perceptions of reality entirely, which makes it almost impossible to have a conversation or a civil society in that kind of environment. That’s why understanding the science of these geeky algorithms is so essential: because it’s determining what people think and see every day.

Maria Ressa: Absolutely. And, you know, in our conversations, I feel like we live through this. You know, we’ve seen this happen in the last four years. My last question really focuses back on the U.S., because you’re walking into elections in November. How can you have integrity of elections if facts are debatable? If people are polarized, if they’re being emotionally manipulated, and if, as we know, geopolitical powers, Russian disinformation, and domestic interests are trying to manipulate you. How do you have integrity of elections?

Sinan Aral: Well, you know, I am very concerned, and I cover this in detail in the book. You listed some of the key problems. So we have tremendous affective polarization in the United States currently. There is a huge divide between Democratic perspectives and Republican perspectives, as well as views of reality. We have foreign governments meddling and interfering in our election as we speak, especially Russia. We have tremendous amounts of social movements and social unrest in the streets. We have questions about the integrity of the election, the flames of which are being fanned by information spreading on social media. So it is incredibly difficult to understand how this election is going to unfold – in terms of people’s confidence in it, in terms of people’s participation in it, in terms of the informed choices that are being made in terms of voting – with election manipulation going on from abroad, as well as hyperpolarization happening on the ground with people out in the streets and so on. The integrity of the vote is in question from all sides. It is a dramatic moment in history because it’s also one of the most consequential elections that the United States has faced in a long, long time. A generation, perhaps. And so I’m worried, and I talk at length about it in the book.

Maria Ressa: I would say it’s the most consequential globally, because where the U.S. goes, and the decisions that Silicon Valley makes, impact the rest of the world. I know I said last question, but I have to ask this: advice for other countries like the Philippines, where the Hype Machine has been turned against people on the front lines – journalists like us, human rights activists. You have an opposition politician, Senator Leila de Lima, who’s been in prison for three years now. Any advice for us?

Sinan Aral: Well, I mean, activists, journalists, and individual citizens need to demand, and governments need to provide, national commissions on technology and democracy in each country that aggregate opposing viewpoints and voices and have a common-ground-creating conversation to really institute real money, code, norms, and laws that can effect change. And what I mean is, for instance, in the United States, I watched these senators, congressmen, and congresswomen questioning Mark Zuckerberg or other platform leaders about their platforms and technology. They are not qualified in terms of the computer science or the economics to ask the right questions. We need a national commission here in the United States that involves business leaders, scientists, journalists, ethicists – very important – as well as political leaders, to come together to create common ground and to discuss these issues together. And the public, activists, journalists need to demand that that common-ground-creating conversation is institutionalized in real governmental organizations that are diversely populated with the right experts and with the right perspectives.

Maria Ressa: Can this move fast enough before we lose our rights?

Sinan Aral: You know, the speed of the Hype Machine is part of what’s so frightening about it. If you think about it, yes, Facebook was founded in 2004, but social media hadn’t spread itself over the world then as it has now – it’s come up in 8 to 10 years. And now it is the number one source of news and information for vast majorities of people. And every day we know less and less about how it’s affecting our behaviors or opinions, how the algorithms work, what it’s doing to our society. Information moves at lightning speed, and debunking moves so much more slowly than the falsity on these platforms. So the speed of the Hype Machine, the lightning quickness of the Hype Machine, is incredibly troubling. And yes, we need to act today. We need to act right now if we’re going to get control of this.

Maria Ressa: Sinan, thank you. Last thoughts? Anything that you want to add? Your book goes on sale. Guys, you gotta read it. You know, I’ve been talking about this book since I read a draft in June. That one graphic tells you everything. We’ve got to solve this problem now. Any last thoughts, Sinan?

Sinan Aral: You know, I tried to really bring the objective science, rather than the hype, to this book about the Hype Machine. I hope people read it. But I am so excited to work with all of the dedicated technologists, journalists, activists, ethicists, coders, and policymakers towards achieving the promise of the Hype Machine and avoiding its peril. That’s really the mission of this book: how do we achieve the promise? How do we avoid the peril? What can we do? That conversation begins today.

Maria Ressa: Fantastic. Thank you so much, Sinan. Sinan Aral, The Hype Machine. The link is there on the site. Guys, read it. Let’s talk. I’m Maria Ressa. Thanks for joining us.

– Rappler.com
