
In social media’s battle for our attention, real connection becomes the casualty

Gelo Gonzales


Tristan Harris, founder of the Center for Humane Technology, talks about the current state of social media and suggests solutions

Tristan Harris, a primary driving force behind the documentary The Social Dilemma, spoke at a Rappler webinar on Tuesday, November 10.

The message remains clear: We are nowhere near a social media ecosystem that brings out the best in humanity. As it stands, platforms like Facebook, Twitter, and YouTube – despite constant updates to their looks and user interfaces – remain largely the same at the core.

“Attention casinos” is one of the terms Harris uses to describe these platforms. Like in a casino, the house, first and foremost, seeks to make money. And to do so, the platforms must keep you inside their walls for as long as they can.


The house, of course, always has an advantage: it knows its users – as many experts would say, sometimes far better than the users know themselves. It knows exactly what will keep a user on the platform: what content to show them, which friends to prioritize in the feed, among other things.

Often, the content that platforms prioritize comes from the loudest, angriest voices, or from those showing toddler-like behavior, Harris says. It’s human nature – a survival tool – to pay more attention to a ruckus than to a calm discussion, and for that reason hate and sensational lies have spread 6 times faster than facts on these platforms.

Perhaps, at those early Facebook and Twitter meetings, there was no blatant call to make hate spread faster. Yet, as we are all palpably feeling now, there were engineering decisions to maximize virality and engagement for profit, and to promise virality to users, even if that meant disregarding critical issues that would hit society like a hammer to the face. 

Harris previously worked as a design ethicist at Google, but his calls for more humane design were not taken seriously, which pushed him to leave and create the non-profit Center for Humane Technology.

Maria Ressa, who hosted the webinar, offers some hope: after hate and lies, it’s inspirational content that spreads second fastest.

Defining the problem

“We have to define the problem,” Harris says, if we have any hope of fixing the social media mess. 

In addition to the ones mentioned above, here are other key points defining the problem, according to Harris:

Social media remains exploitable

While Joe Biden won the US elections, social media remains exploitable by authoritarians – both would-be and established ones. Authoritarians and demagogues thrive on attention and on the ability to “invade the private thoughts of everyone.”

One doesn’t have to look far for an example. Look at Donald Trump and his tweets baselessly claiming fraud. Biden won, but the attention is currently on Trump, who remains loud on Twitter and Facebook, with a messaging strategy that seeks to repeat the lies until they eventually become the truth, and maybe sow doubt in non-supporters. 

Social media’s inflexibility runs counter to human nature

The current nature of social media is that your history is recorded online, making it easy to “build a mob that will hate you” based on a caricature assembled from all your past posts. Who you are on social media becomes inflexible, and that rigidity can lead people to stay firmly in their own tribes, polarized.

Partly, this relates to privacy. If your entire record is accessible online, the tendency is to stick to what the record already says, and you lose the agency to pursue what you really seek to become. 

The spread of content remains decoupled from ethics

Virality puts tremendous power in the hands of users. But currently this power is “decoupled from ethics.” The current basis for virality isn’t whether there are good intentions behind a post, but whether it strikes a chord emotionally – even if the emotions it hits are often toxic ones like hate and anger.

The fix, Harris suggests, would be to reserve virality for people who are “most compassionate, and who can show understanding of two sides of the issue.” Content that promotes healing and compassion is what social media should reward.

Harris says that “social media has to change design norms” and move to one that “will reward the better side of people.” 

For individuals, Harris advises, “Before you post anything, ask why am I posting this, and choose de-escalation over escalation. Choose healing over more conflict.”

No gatekeeping duties for social media giants

“Everyone on social media is a micro-gatekeeper.” News institutions were once the gatekeepers, guided by ethics to show people what is important and true. Social media platforms usurped that power but – designed as they are for engagement and profit – forgot that not all information is equal, and that vetted truth had to be the priority.

Flattening the curve of disinformation

Harris says what we have to realize is that there’s an information war going on. But while countries spend heavily securing their physical borders, their digital borders are wide open. Facebook opened those borders to everyone. And currently, Facebook is being tasked with securing these digital borders – a task Harris believes the company isn’t equipped to handle.

One thing that needs to be addressed fast: Facebook, a trillion-dollar company, has to spend more on safety and security. Harris likened the situation to flattening the curve of COVID-19: hospitals have only a limited number of ICU beds. Facebook, so to speak, also has very limited ICU beds for those affected by its critical issues, and it is overwhelmed.

Addiction and isolation

Addiction and isolation are effects of social media. “The more isolated you are, the more vulnerable you are to conspiracy. You can become more radicalized,” said Harris. “And you don’t want to interact with other people because you believe they are wrong, and you are correct.”

To that last point, Harris made one of the most striking statements of the webinar: “What we’re really seeking is connection.” It’s the biggest irony of social media as we know it right now – it has connected us all, but it has also left us starved for actual connection.

The reason has already been discussed: the platforms got us glued to our screens to maximize profit, never mind that many of us end up like addicts looking for the next dopamine hit, seeking attention in the form of a digital thumbs up or heart.

“We need touch. We need eye contact. Social media profits from the screen based version of that. But especially in a COVID world, you need real world connection.” 

Removing micro-targeting, political Facebook Groups

Asked what he would do if he were in Mark Zuckerberg’s shoes, Harris suggests removing micro-targeting completely, and disabling all political Facebook Groups. 

On the first point, micro-targeting relies on the enormous amounts of data social media companies collect about their users – data they then use to determine what content to serve each user to generate the most engagement.

This ties in with Ressa’s assertion that social media is a behavior manipulation device. It’s not only the platforms that know our triggers and which buttons to push to make us behave in a certain way; advertisers – political actors included – know them too.

That is why it is no coincidence that the rise of social media came with the rise of authoritarians and demagogues. There are few better tools for helping an aspiring authoritarian see exactly what people are feeling, and what message will rouse those sentiments.

Micro-targeting is a product of surveillance capitalism. The more data you have on users, the more effective and exact the message you can create. To remove micro-targeting is to overhaul the entire financial incentive powering these social media platforms. Regulation, particularly in the US, is where this overhaul may start. 

Harris doesn’t believe in breaking up the social media giants, though, because venture capitalists will only find ways to create another one. Instead, he suggests enabling competition in a regulated environment where social media platforms that deal with problems quickly and honestly are rewarded.

On the second point, disinformation didn’t stop when Facebook started promoting Facebook Groups – it only moved into these closed communities. Harris suggests banning Facebook Groups that are political in nature because, he believes, Facebook simply does not know how to run them safely.

Harris drew an analogy to Johnson & Johnson, which once pulled Tylenol off the shelves because the product had become harmful. The firm admitted that something was wrong and took the product off the market. Facebook has to do the same, Harris says: admit that there is something wrong with political Facebook Groups, and take them down.

On this solution, it’s public pressure and a push from activists that can get Facebook – which can be monolithic – to act. There are some good signs: Harris noted that Facebook and Twitter performed better during the 2020 US elections, but he criticized YouTube, which had allowed a fake election map to spread.

And it was the constant attention – from the public, from groups such as the Real Facebook Oversight Board, and from governments around the world – on these purveyors of the attention economy that finally forced these companies to act. – Rappler.com

The full episode of this Rappler Talk will air on Wednesday, November 11, at 7 pm.


Gelo Gonzales

Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.