Democracy & Disinformation

How to solve information chaos online? Experts cite these structural solutions

Camille Elemia


CHAOS. The Forum on Information and Democracy lays out policy recommendations for states and tech companies on how to stop information chaos and protect democracies.

Illustration by Alejandro Edoria

The international group Forum on Information and Democracy proposes solutions to problems with private messaging services, platforms’ transparency, and their content moderation policies, among others

The international organization Forum on Information and Democracy (FID) laid out policy recommendations for states and tech companies on how to stop information chaos, protect democracies, and uphold human rights worldwide.

In a 128-page report, the group identified 4 structural challenges and proposed concrete solutions for each of the following:

  • platform transparency
  • content moderation
  • promotion of reliable news and information
  • private messaging services

The report, which detailed 12 main recommendations and a total of 250 proposals, was produced by a team of rapporteurs and the FID working group on infodemics, co-chaired by Rappler CEO Maria Ressa and former member of the EU Parliament Marietje Schaake.

Christophe Deloire, the forum’s chairperson, said “a structural solution is possible to end the informational chaos that poses a vital threat to democracies.”

“The exercise of human rights presupposes that democratic systems impose rules on the entities that create the standards and the architectures of choice in the digital space,” Deloire said.

“Social media, once an enabler, is now the destroyer, building division—‘us against them’ thinking—into the design of their platforms…. It’s time to end the whack-a-mole approach of the technology platforms to fix what they have broken,” Ressa said.

“The past years have offered a wake-up call for those who needed it…. Without explicit and enforceable safeguards, the technologies promised to advance democracy will prove to be the ones that undermine it. It is now vital that democracy is made more resilient,” said Schaake.

Since 2019, at least 37 countries, mostly in Europe, have signed the FID- and Reporters Without Borders-led International Partnership on Information and Democracy, which calls on platforms to uphold their responsibilities. The nations also vowed to ensure their legislation and policies promote a healthy digital space that “fosters access to reliable information” and upholds freedom of expression.

In Asia, only India and South Korea have so far signed the declaration.

In summary, here are my 4 key takeaways from the report:

The need for a human rights-centered approach to tech

The UN Guiding Principles on Business and Human Rights (UNGPs) impose on business enterprises the responsibility to respect human rights, including but not limited to the right to freedom of expression and information, in places where they operate.

This is especially applicable to platforms and their business models. Coined in 2014 by American author and scholar Shoshana Zuboff, the term surveillance capitalism describes the business model predicated on harvesting user experience through online platforms, smartphones, apps, and other devices, and manipulating behavior for monetization. (READ: What you need to know about surveillance capitalism)

The issue of human rights is also relevant to platforms’ content moderation. At present, platforms can arbitrarily impose policies that are out of sync with international human rights law, with no regulatory body to check them.

Disinformation and misinformation propagate lies and incite hate and conflict against individuals and groups of people. International human rights law, the report argued, could provide a universal framework for defining problematic content and addressing it. (READ: With anti-terror law, police-sponsored hate and disinformation even more dangerous)

Transparency

Platforms should be transparent with users, vetted researchers, civil society, and regulators about their algorithms, content moderation, policies, terms and conditions, content targeting, and social influence building – functions that affect how the public views the world and processes information.

Transparency is also needed to determine if platforms are abiding by their own policies and responsibilities.

The information provided by platforms must also be open to audit by regulators and vetted researchers to verify that companies operate as they claim. The participation of civil society would also be critical here.

The platforms should also be upfront when it comes to their conflicts of interest, in order to stop commercial and political interests from influencing the information space.

However, to prevent a repeat of the Cambridge Analytica misuse of Facebook data, the report suggested “differential privacy” as an option, wherein data can be analyzed accurately at the population level without exposing any individual’s confidential information.

“Differential privacy addresses the paradox of learning nothing about an individual while learning useful information about a population,” the report said.
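
To make the idea concrete, here is a minimal sketch of the Laplace mechanism, the textbook building block of differential privacy. This illustrates the general technique only; it is not a mechanism specified in the report, and the function and parameter names are hypothetical.

    import numpy as np

    def dp_count(records, predicate, epsilon=1.0):
        """Differentially private count of records matching `predicate`.

        A count query has sensitivity 1 (adding or removing one person
        changes the result by at most 1), so Laplace noise with scale
        1/epsilon masks any individual's contribution while keeping the
        aggregate statistic accurate.
        """
        true_count = sum(1 for r in records if predicate(r))
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    # Example: publish roughly how many users engaged with a post without
    # revealing whether any specific user did.
    engaged = [True, False, True, True, False, True]
    print(dp_count(engaged, lambda e: e, epsilon=0.5))

A smaller epsilon adds more noise and thus stronger privacy; regulators and vetted researchers would still see useful population-level numbers while individual records stay protected.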

Regulation

To ensure platforms are abiding by their own policies and transparency requirements, the group proposed public regulation.

As a starting point, the report suggested the transparency regulation models in Europe and some of the current proposals in the United States.

But when it comes to content moderation, the group cautioned against public regulation for fear it might lead to censorship. Seemingly recognizing at-risk nations like the Philippines, the report said “government demands can be as problematic as company policies in some jurisdictions.”

Regulators, too, are not foolproof. The report suggested democratic safeguards against potential abuses or malpractices by governments and the regulators themselves. These include publicly disclosing the number and nature of personal data requests made to companies and the content they sought to have taken down, among others.

The organization said it can play a leading role in inventing new models of public regulation and co-regulation for the much-needed global governance framework.

Accountability

The report proposed legally binding transparency requirements to address online content moderation and disinformation issues.

While the group admits this won’t end all the problems, “it is a necessary condition to develop a more balanced equilibrium of power” between the platforms and democratic societies. After all, it is commensurate with the power that platforms hold over information ecosystems.

Proposed sanctions for non-compliance range from large fines and mandatory publicity to administrative penalties.

The report also proposed the creation of a Digital Standards Enforcement Agency to enforce safety and quality standards in the digital sphere. Its proposed powers include the authority to prosecute non-compliant offenders, to enforce professional standards in software engineering (since these engineers build the platforms), and to issue non-compliance orders, among others.

This could be a contentious issue, but the organization said it could launch a feasibility study on the implementation of the agency.

Here are the 12 main recommendations of the working group to states and tech companies:

Public regulation to impose transparency requirements on platforms

  • Transparency requirements should relate to all platforms’ core functions in the public information ecosystem: content moderation, content ranking, content targeting, and social influence building. 
  • Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.
  • Sanctions for non-compliance could include large fines, mandatory publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country’s market.

A new set of baseline principles on content moderation

  • Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law.
  • Platforms should assume the same kinds of obligations in terms of pluralism that broadcasters have, like the voluntary fairness doctrine.
  • Platforms should expand the number of moderators and spend a minimal percentage of their income to improve the quality of content review, particularly in at-risk countries.

New approaches to the design of platforms

  • A Digital Standards Enforcement Agency to enforce safety and quality standards of digital architecture and software engineering. FID could launch a feasibility study on how such an agency would operate.
  • Conflicts of interest of platforms should be prohibited, to avoid the information and communication space being governed or influenced by commercial, political, or any other interests.
  • A co-regulatory framework for the promotion of public interest journalistic content should be defined, based on self-regulatory standards such as the Journalism Trust Initiative; the use of friction to slow down the spread of potentially harmful viral content should be added. (READ: Increasing sharing friction, trust, and safety spending may be key Facebook fixes)

Safeguards should be established for closed messaging services when they take on the logic of a public space

  • Limit some functions to curb the virality of misleading content; impose opt-in features for receiving group messages, and measures to combat bulk messaging and automated behavior (a toy sketch of one such limit follows this list)
  • Platforms should inform users of the origin of the messages they receive, especially those that have been forwarded
  • Platforms should reinforce notification mechanisms for illegal content reported by users, as well as appeal mechanisms for users who were banned
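
The report does not prescribe a specific mechanism, but forwarding caps of this kind already exist in practice; WhatsApp, for example, limits how widely a highly forwarded message can be re-shared and labels it for recipients. Below is a minimal, hypothetical Python sketch of such a limit. All names and thresholds are illustrative assumptions, not taken from the report.

    # Toy forwarding limit in the spirit of WhatsApp's publicly known caps
    # on highly forwarded messages. All names and thresholds are hypothetical.
    FORWARD_CAP_PER_MESSAGE = 5   # max chats one forward may target at once
    HIGHLY_FORWARDED_HOPS = 5     # hop count that triggers stricter limits

    def forward_message(hops: int, target_chats: int) -> tuple[bool, str]:
        """Return (allowed, label) for a forwarding attempt.

        `hops` is how many times the message has already been forwarded;
        `target_chats` is how many chats it is being forwarded to now.
        """
        if hops >= HIGHLY_FORWARDED_HOPS and target_chats > 1:
            return False, "highly forwarded: one chat at a time"
        if target_chats > FORWARD_CAP_PER_MESSAGE:
            return False, "forward cap exceeded"
        # Label forwards so recipients can tell the sender is not the origin,
        # in line with the report's origin-transparency recommendation.
        if hops >= HIGHLY_FORWARDED_HOPS:
            label = "Forwarded many times"
        elif hops > 0:
            label = "Forwarded"
        else:
            label = ""
        return True, label

    print(forward_message(hops=0, target_chats=3))  # (True, '')
    print(forward_message(hops=6, target_chats=4))  # (False, 'highly forwarded: one chat at a time')

The design point is friction rather than censorship: the message is never inspected or blocked for its content, only slowed down once its spread pattern suggests bulk or viral distribution.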

– Rappler.com


Camille Elemia

Camille Elemia is a former multimedia reporter for Rappler. She covered media and disinformation, the Senate, the Office of the President, and politics.