Social media is a mess. From a purely technical standpoint, the platforms are impressive, built on the best consumer-ready web technologies. But their design logic, as ethicists and experts both inside and outside the tech industry have described, is deeply flawed and nefarious, leading to offline consequences as catastrophic as the explosion of a bomb.
Rappler CEO and executive editor Maria Ressa, speaking at a Time100 Talk, Time’s series of thought-provoking online discussions, compared data to plutonium, quoting another expert. The collection of that data, and its use as the key resource by social media platforms to keep us glued to our screens, has resulted in a metaphorical explosion, flattening democratic institutions and civil discourse.
Ressa is one of several experts who spoke at the October 20 Time100 Talk.
Alexis Ohanian, the co-founder of Reddit, who was the first to speak, talked about how the Silicon Valley growth mindset is not an insignificant part of what fueled today’s social media mess.
“Investing in safety tools and trust tools were something that the industry put off,” he said. For years, it had been all about growth and revenue, and only now, he said, are companies beginning to put more focus on those tools. Mark Zuckerberg’s old “Move fast and break things” mantra certainly put Facebook where it is right now – immensely rich, but also at ground zero for disinformation, deep societal polarization, online radicalization, and the destruction of ethics-bound journalism and publishing.
For many years, few had any idea of the vast data mining operations the tech giants were running, or of the accompanying neuroscience-powered push to keep us glued to their services. This viselike push for engagement doesn’t discriminate: if false, hateful, sensational content spreads faster than true, boring facts, then so be it. State actors and other entities know as much, contributing to the rise of the strongman leader.
Authoritarians really couldn’t have dreamt up a better tool than Facebook. Even if Facebook hadn’t intentionally designed it that way, it’s plain to see that the money incentive was too large, and the competition too fierce, for these companies to weigh the ethical consequences.
Ohanian remains optimistic. Companies, he says, are now seeing how important safety and trust tools are – or else they lose users. If moral arguments can’t force the companies to act, we’ll have to make do with economic ones.
Have the social media giants and tech giants done enough? To say with complete certainty that they have fixed things is an extraordinary claim, and extraordinary claims need extraordinary proof. The proof we still see every day is that people remain trapped in their own bubbles – often not even political ones, but friends and family drifting off to their own digital corners as algorithms continue to play people like voodoo dolls.
We are in our own bubbles, as if out in space, with only darkness in between and little understanding. It’s deep polarization and, often, personal alienation as well. Such is the power of social media that it knows you better than you know yourself, better than your closest friends and family do – and it grows more agreeable all the time.
Tristan Harris, the founder of the Center for Humane Technology, also speaking at the same talk, pointed to the “Take Control” page on the organization’s website for people looking to disconnect from what has truly been a real-life Matrix.
Harris also adds another ingredient that has led to the normalization of hate on social media platforms: the promise of virality. The social media platforms promise that the thing we post will become viral. But this distribution logic (and a dangling carrot for users), again, doesn’t discriminate. It doesn’t always reward what’s true, and what’s good for society, Harris says. It rewards content that makes people want to interact and reshare.
It’s a design logic that remains to this day, one whose holes – fake news going viral – are being plugged by independent fact-check teams and news organizations such as Rappler. But even with relatively more measures from the likes of Facebook and Twitter, the feeling remains: the dam is always in danger of breaking. Harris, in a 2019 podcast with Ressa, questioned whether 35,000 Facebook moderators or the company’s trust and safety budgets are enough for 2.7 billion users.
Safiya Noble, the author of the 2018 book Algorithms of Oppression: How Search Engines Reinforce Racism, who spoke alongside Harris at the Time100 Talk, was adamant about tech giants not being allowed to merely self-regulate.
It’s like “the fox guarding the hen house,” Noble said. She urged US citizens, first and foremost, to vote in the November 3 elections, saying that “policy shapes environment.”
People are on social media platforms, in their current form, because they have no choice, she said. These platforms have, among other things, “defunded democratic counterweights” such as journalism, resulting in a less informed citizenry. That is the environment we live in now, and public policy needs to step in. – Rappler.com