Social Good Summit 2023

Gov’t must mandate access to Big Tech data to make platforms accountable

Pauline Macaraeg

Facebook whistleblower Frances Haugen virtually joins Nobel laureate and Rappler CEO Maria Ressa, Andrew Keen, and Adrienne Heinrich in a panel discussion during the Social Good Summit in Taguig City on September 16, 2023.

Angie de Silva/Rappler

Nobel Peace Prize laureate Maria Ressa and Facebook whistleblower Frances Haugen say there is a need for appropriate government structures governing Big Tech

MANILA, Philippines – Data transparency from Big Tech companies controlling our social media platforms is needed to make the ecosystem sustainable, tech experts said during the 2023 Social Good Summit on Saturday, September 16.

In a panel discussion, Nobel Peace Prize laureate and Rappler CEO Maria Ressa, Facebook whistleblower Frances Haugen, tech entrepreneur Andrew Keen, and Aboitiz data innovation head Adrienne Heinrich shared their views on the global tech landscape and discussed possible solutions to the ethical and sustainability problems brought about by the rapid development of artificial intelligence.

The panel was moderated by veteran journalist and Rappler columnist John Nery.

Ressa and Haugen pushed for data transparency from Big Tech companies, saying that there can only be accountability if the public knows how the technology behind the products they use works. They explained that Big Tech platforms are getting away with the manipulation of the public sphere because they are not required to disclose the harms they know about.

“There are many things that are structurally embedded into the technology companies that should be illegal, but we don’t know because we don’t have access to the data. That’s what the governments have to do,” Ressa said.

Haugen said that with legally mandated transparency, Big Tech companies would no longer be able to hide the costs of bad behavior, and the impact on the companies would not be just reputational.

“Right now, we don’t get to use any of the benefits of the free market to try to innovate our way out of these problems,” Haugen said. “The only information (publicly-traded) Big Tech companies are required to publish is on the profit and loss numbers…. But once you have data out, you can start doing things.”

The surveillance capitalism model

Ressa referred to the current model as surveillance capitalism, a term coined by scholar Shoshana Zuboff. This model mines a user’s personal data without consent, allowing tech companies to build digital clones of users, from which they profit through targeted advertising.

“Surveillance capitalism hides and allows the impunity of manipulation. It isn’t just politics, because you can get away with as much as you can get away with – which means the politicians will push impunity as far as they can go,” Ressa said.

Ressa said that the lack of transparency from Big Tech companies hinders users from knowing how they are being targeted.

She contrasted surveillance capitalism with the old model of advertising, in which everyone saw the same advertisement. That is no longer the case today, since Big Tech companies use people’s data and digital clones to target users in a personalized manner.

“Ideas like naming and shaming – which is what civil society groups used to do – these don’t work anymore, because we’re being so manipulated in other ways. It’s the manipulation of the public sphere that really needs to stop,” Ressa said, highlighting the urgency of the situation because it might have a big impact on the elections worldwide in 2024.

Using generative AI to turn tech for good

Meanwhile, Heinrich and Keen pushed for the careful integration of AI in trying to find solutions.

“Putting AI in place will not correct the existing bias,” Heinrich said, citing an Apple Card case study in which algorithms discriminated on the basis of gender. However, she said that there is a need for a healthy balance between optimism and caution when it comes to using AI and society’s adoption of it.

Heinrich, a data scientist, said that the explainability and transparency of technological developments are crucial to making innovations ethical and to helping people understand the consequences that come with them.

For Keen, partnering with smart algorithms is a possible solution to enrich the future. He said that people should start seeing AI as a partner that will enable humans to become better storytellers.

“The way the future could be enriched in the age of AI is by partnering with it to tell stories. There is a fetishization of the idea of stories, and some of it is a bit banal, but this technology would allow us to tell stories, not truths, not determine the lies of other people, not use it to point fingers,” he said.

While the future looks grim, Haugen ended on a positive note. She said that in order to collectively move faster and address the harms brought by Big Tech, technology can be put to good use to bring more people into the discussions on how to govern AI systems.

“We still have a lot of headroom to be able to articulate what harm it does, the areas of opportunity,” Haugen said. – Rappler.com

Pauline Macaraeg

Pauline Macaraeg is a digital forensics researcher for Rappler. She started as a fact checker and researcher in 2019, before becoming part of Rappler's Digital Forensics Team. She writes about the developing digital landscape, as well as the spread and impact of disinformation and harmful online content. When she's not working, you can find her listening to podcasts or K-pop bops.