We had been living in a Black Mirror world; we just didn’t know it.
When we began joining social networks such as Friendster, MySpace, and Multiply in the early 2000s, and started using online tools such as Google Search, Yahoo Messenger, or Yahoo Groups, all we did was marvel at these new technologies, perhaps wondering occasionally how such amazing services could be free.
At some point, we figured it was through online ads. The more eyeballs on a website, the more money the site makes. That was not wholly incorrect. But as we slowly learned in the years that followed, and as two companies – Google and Facebook – came to dominate, we realized that the whole business was more insidious and sinister than that.
Today, there is a term that encapsulates what business the two tech giants are in, beyond the veneer of tech altruism and “connecting the world” or “organizing the world’s information” – surveillance capitalism.
Coined in 2014 by American author and scholar Shoshana Zuboff, the term describes the business model predicated on harvesting user experience through online platforms, smartphones, smartwatches, apps, and other devices, and manipulating behavior for monetization.
It is the subject of her 2019 book, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.
Having even a basic understanding of it is crucial to making sense of some of our behaviors today, as we live increasingly in the digital sphere, and vital to recognizing and fighting back against the perverse ways that today's technologies have been shaped by a harmful economic imperative.
Zuboff, in her book, repeatedly compares surveillance capitalism to industrial capitalism and industrialization, which have had devastating consequences for nature. She asks: what, then, will be the toll of surveillance capitalism, left to its own devices? (The short of it: not good.)
Here are some of the basics, drawn from her book and interviews:
Surveillance capitalism makes its money by predicting what a person will do in the future
Surveillance capitalism created what Zuboff calls a “human futures” market. In a futures market, buyers buy a product now to be delivered at a later date, with the hope that the product’s value has risen by the time it is delivered.
In a human futures market, that product is ourselves, and our behavioral data, and what we will do in the future. Essentially: what will we buy more of?
The more data that a surveillance capitalist has, the more data that it can feed to its AI systems, and the better that it can predict what we will do. The better the prediction, the more attractive it is to the real customer of Facebook and Google: the advertisers and the companies looking to make a profit.
“They compete on the basis of who has the best predictions — in other words, who can do the best job selling certainty,” said Zuboff in an interview.
To make the best predictions, surveillance capitalists need a ton of data
What does Facebook want you to do? It wants you to put up photos and videos. It wants you to check into places. It wants you to share an achievement. It wants you to let it know who your friends are. It rewards people by showing who likes and shares their posts, thus encouraging them to post more and enabling a cycle.
Google Maps shows you where you are and where your destination is. It also lets Google know where you are and the places you’ve been. Obviously, it knows what you’ve searched for, what you click on, and the Google Ads you click on.
In Google and Facebook’s ideal world, every one of our appliances is a smart device in a smart home, constantly collecting data. It’s 1984, but with private for-profit companies as Big Brother, seeing everything we do – an intrusion many have been okay with because of the promised convenience of a smart speaker, or of appliances and lights you can turn on through an app.
The more data they have, the better the prediction they can make. Zuboff said the companies need both volume of data to enable scale, and varieties of data to enable scope.
To reach volume, Facebook, for instance, wants you to stay on the platform, keep uploading, keep creating content, keep sharing. To reach variety, it wants you to do different things: upload photos, comment on a contact’s post, react with a sad face on a news post, among other things.
The person and their habits are being recorded, packaged as bits and bytes, and sold to the highest bidder.
“We are their source of raw material, nothing else. These predictions are now sold to business customers, because a lot of businesses are very interested in what we will do soon and later. Never in the history of humanity has there been this volume and variety of behavioral data, and this kind of computational power to monitor and predict human behavior from individuals to populations,” said Zuboff.
To be clear, Facebook and Google aren’t the only ones in this business but are the two biggest examples.
For years, they were able to do this under our noses
Zuboff said that the word “surveillance” in surveillance capitalism is deliberate – chosen not to be melodramatic, but to be accurate. Facebook and Google hid the nature of their business because “there would have been resistance, which produces friction.”
“Resistance is expensive,” Zuboff said.
Instead, the two companies – like other tech companies – occasionally told people that the data collection was there to make the product or experience better. It’s not, said Zuboff. “We are not their customers. We are not their marketplace.” The data serves the bottom line.
One major indication that the consumer is not the customer is what Zuboff calls the “behavioral surplus”: an excess of data being collected far beyond what is needed for mere service improvement – although service improvement is what tech companies would want people to believe the data is for.
Zuboff’s findings truly cast a dark shadow on seemingly altruistic projects such as Facebook and Google’s former and current internet-providing drones and balloons. Forays into VR devices and virtual assistants are attractive to these companies because once again, they could be a way to collect more data.
From sneakily stealing and commodifying human experiences, they move to manipulating behavior
Before we all learned what Facebook and Google were doing, the two companies never explained what was happening behind the scenes. It was likely hidden in the fine print, designed – as any other fine print in the world – to be tedious to read.
They know us; but we don’t know them. It’s a state of affairs that persists today. It is never fully explained why we see certain posts on Facebook or what videos are recommended to us on YouTube.
Eventually, according to Zuboff, the giants stepped up their game. “Eventually surveillance capitalists discovered that the best source of predictive data is to actually intervene in people’s behaviour and shape it.” They commandeer and “modify our behavior in the direction of its preferred commercial outcomes.”
With their knowledge of our behaviors and triggers, their digital architecture – from notifications we receive to the content that their algorithm delivers – can also become “the means of behavior modification with programmed triggers, subliminal cues, rewards, punishments, social comparison dynamics – all of it aimed at tuning and herding human behavior in the direction that aligns with the commercial goals of business customers.”
Pokémon GO’s dark side
One example of manipulation that Zuboff cites is Pokémon GO, a game built on augmented reality and map data.
While the Pokémon license is associated with Nintendo, Pokémon GO itself was developed by Niantic Labs, a company spun off from Google.
Players were herded to real-world locations to capture Pokémon, with Google’s Niantic Labs placing Pokémon in commercial establishments that paid for the resulting foot traffic. To players, on the other hand, it seemed like they were just playing a game.
This example, along with many others, is why you hear that expression about data – about human experience – being as valuable as oil. It is. And the companies have been allowed to drill freely.
This market, this system, is completely undemocratic, asserted Zuboff. People and their experiences have been used as raw material in a market from which they gain no benefit, their actions creepily recorded for the benefit of the highest bidders, and their behavior manipulated to perpetuate the system.
“These systems are a direct assault on human agency and individual sovereignty as they challenge the most elemental right to autonomous action. Without agency there is no freedom, and without freedom there can be no democracy.”
Zuboff equated the human futures market to slave markets, the human organs trade, and human trafficking, calling it “pernicious and violent” and “fundamentally incompatible with democracy.” Those markets have been outlawed and criminalized. The same, Zuboff said, must be done to the human futures market, and to surveillance capitalism.
Surveillance capitalism is vampiric. But people gladly invited its practitioners into their homes and into their private lives, because when the companies had first knocked, they didn’t have the courtesy to show us their fangs. – Rappler.com