Right before America votes, I can’t help but think we are underestimating the impact of technology on our behavior. Behavioral research has long shown that we vote, not based on what we think, but on how we feel. Now imagine that in today’s information ecosystem, which uses neuroscience and predictive analytics to shape our behavior. It’s an extractive business model that allows “fake news” to flourish.
Technology upended journalism, pitting evidence-based news organizations against propaganda, hate speech, and conspiracy theories. Mocha Uson and QAnon won; news lost. That’s what the data shows.
For about a decade now, technology has been a tsunami, devastating and reshaping industries: music (Apple, Spotify), hotels (Airbnb), transportation (Uber, Grab), and so many more. News organizations lost their gatekeeping roles to technology. That upended the information ecosystem and the distribution of facts. It began to reshape systems of governance and change the geopolitical power balance.
The new gatekeepers had easy confidence because they cut the world into quantifiable pieces – never mind if it isn’t true, and never mind the consequences.
Move fast, break things
Decades back, I remember asking questions the engineers couldn’t answer about the impact of their choices on society: personalization; lies that you can only mute or block; the unintended consequences of “friends of friends” growth.
- Personalized recommendations moved from search results that followed you around the web for weeks to fully personalized news feeds. The questions I asked then: how can we all have our own different realities? How can news exist in a world like that? If there is no shared public reality, no shared facts, then democracy as we know it is dead.
- Lies that you can only mute or block drive me crazy as a journalist, because they stay and continue to spread: the more salacious the lie, the faster the spread. Emotions like anger and hate spread faster too. That alone showed the new gatekeepers’ abdication of responsibility and how little respect they had for the public.
This played right into long-running Russian military tactics, which the public saw in Crimea in 2014. Called the “firehose of falsehood,” this style of Russian propaganda is meant to “entertain, confuse, and overwhelm.” Fast forward a few years, and leaders of democracies elected with the help of social media, like President Rodrigo Duterte and President Donald Trump, are mimicking the same tactics.
- Every social media platform uses algorithms that recommend friends of friends to grow your network and their user base. That one seemingly minor decision, while good for growth, also built filter bubbles and divisiveness – us-against-them thinking – into the platforms distributing the news and, consequently, into our societies.
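The friends-of-friends mechanic can be sketched in a few lines. This is a toy illustration only – not any platform’s actual code; the graph and names are invented. It shows why the heuristic is good for growth yet cluster-reinforcing: candidates are ranked by how many mutual friends they share with you, so you are steered toward people already adjacent to your own circle.

```python
# Toy "friends of friends" recommender (illustrative only; the graph
# and names are invented, not any platform's real data or algorithm).
from collections import Counter

# A small social graph: ana-ben-cara form one cluster, dan-eve-finn
# another, joined by a single cara-dan bridge.
graph = {
    "ana":  {"ben"},
    "ben":  {"ana", "cara"},
    "cara": {"ben", "dan"},
    "dan":  {"cara", "eve", "finn"},
    "eve":  {"dan", "finn"},
    "finn": {"dan", "eve"},
}

def recommend(user, graph):
    """Rank non-friends by how many mutual friends they share with `user`."""
    counts = Counter()
    for friend in graph[user]:
        for candidate in graph[friend]:
            if candidate != user and candidate not in graph[user]:
                counts[candidate] += 1
    return [name for name, _ in counts.most_common()]

print(recommend("ana", graph))  # → ['cara'], from ana's own cluster
```

Because mutual-friend counts are highest inside your own cluster, repeated application densifies existing clusters faster than it bridges them – the structural seed of a filter bubble.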
These 3 practices, and many more, are part of a god’s-eye view of data, where content is neither good nor evil, and where accumulating data above all else feeds the business model of predictive behavioral analysis – the cornerstone of what Shoshana Zuboff calls “surveillance capitalism.”
One root cause
In her groundbreaking book, The Age of Surveillance Capitalism, the Harvard professor emerita argues that many of the problems we’re dealing with today – the weaponization of social media, the destruction of data privacy and security, our inability to hold technology platforms to account – are all symptoms of the same root cause: an extractive business model that insidiously manipulates us without our knowledge.
Working with Shoshana in The Real Facebook Oversight Board, I’ve put words to my experiences, like “radical indifference” – what she calls “an asocial framework” for evaluating data, because in this world all data is equal. The key is volume, not quality. This is the core of “fake news”: it strips away the branding of news organizations, the very branding that differentiates fact from fiction. In this world, lies laced with anger and hate spread faster and further than boring facts. (READ: The Spread of True and False News Online)
So what do we do about this? Our solution in Rappler is 3-fold, aligned with our founding principles: 1. Demand accountability from technology; 2. Protect and grow investigative journalism; and 3. Build community and civil society. (READ: As democracy dies, we build a global future)
This month, the Forum on Information and Democracy will be releasing 4 papers from the infodemics working group that former European Parliament member Marietje Schaake and I co-chair. We recommend policy frameworks around 4 structural challenges: the transparency of digital platforms; the meta-regulation of content moderation; platform design and the reliability of content; and mixed private and public spaces on closed messaging services.
We see this as a jumping-off point for governments, technology companies, and civil society organizations to begin a much-needed public discussion with academics, lawyers, and journalists.
Technology and the pandemic have destroyed our old world and are forcing all of us to actively create a new one. Defining the problem, bringing in all who need to help solve it – these are only the first steps to make sure we create the world we want.
Regardless of who wins the US elections, we need to act now. – Rappler.com