The international organization Forum on Information and Democracy (FID) laid out policy recommendations for states and tech companies on how to stop information chaos, protect democracies, and uphold human rights worldwide.
In a 128-page report, they identified 4 structural challenges and proposed concrete solutions for each.
The report, which detailed 12 main recommendations and a total of 250 proposals, was produced by a team of rapporteurs and the FID working group on infodemics, co-chaired by Rappler CEO Maria Ressa and former member of the EU Parliament Marietje Schaake.
Christophe Deloire, the forum's chairperson, said “a structural solution is possible to end the informational chaos that poses a vital threat to democracies.”
“The exercise of human rights presupposes that democratic systems impose rules on the entities that create the standards and the architectures of choice in the digital space,” Deloire said.
“Social media, once an enabler, is now the destroyer, building division – ‘us against them’ thinking – into the design of their platforms…. It’s time to end the whack-a-mole approach of the technology platforms to fix what they have broken,” Ressa said.
“The past years have offered a wake-up call for those who needed it…. Without explicit and enforceable safeguards, the technologies promised to advance democracy will prove to be the ones that undermine it. It is now vital that democracy is made more resilient,” said Schaake.
Since 2019, at least 37 countries, mostly in Europe, have signed the International Partnership on Information and Democracy, led by the FID and Reporters Without Borders, which calls on platforms to uphold their responsibilities. The signatories also vowed to ensure their legislation and policies promote a healthy digital space that "fosters access to reliable information" and upholds freedom of expression.
In Asia, only India and South Korea have so far signed the declaration.
In summary, here are my 4 key takeaways from the report:
The UN Guiding Principles on Business and Human Rights (UNGPs) impose on business enterprises the responsibility to respect human rights, including but not limited to the right to freedom of expression and information, in places where they operate.
This is especially applicable to platforms and their business models. Coined in 2014 by American author and scholar Shoshana Zuboff, the term surveillance capitalism describes the business model predicated on harvesting user experience through online platforms, smartphones, apps, and other devices, and manipulating behavior for monetization. (READ: What you need to know about surveillance capitalism)
The issue of human rights is also relevant to platforms' content moderation. At present, platforms can arbitrarily impose policies that are out of sync with international human rights law, with no regulatory body to check them.
Disinformation and misinformation propagate lies and incite hate and conflict against individuals and groups of people. International human rights law, the report argued, could provide a universal framework for defining problematic content and addressing it. (READ: With anti-terror law, police-sponsored hate and disinformation even more dangerous)
Platforms should be transparent to users, vetted researchers, civil society, and regulators about their algorithms, content moderation, policies, terms and conditions, content targeting, and social influence building – functions that affect how the public views the world and processes information.
Transparency is also needed to determine if platforms are abiding by their own policies and responsibilities.
The information provided by platforms must also be open to audit by regulators and vetted researchers to ensure companies are operating as intended. The participation of civil society would also be critical here.
The platforms should also be upfront when it comes to their conflicts of interest, in order to stop commercial and political interests from influencing the information space.
However, to prevent a repeat of the Cambridge Analytica misuse of Facebook data, the report suggested "differential privacy" as an option: a technique that allows accurate analysis of data in aggregate without exposing information about any individual.
“Differential privacy addresses the paradox of learning nothing about an individual while learning useful information about a population,” the report said.
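To make that idea concrete, here is a minimal, illustrative sketch – not from the report – of the classic Laplace mechanism, the textbook way to answer a counting query with differential privacy. The function names and the `epsilon` privacy parameter are assumptions chosen for this example:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float = 1.0) -> float:
    """Differentially private count: true count plus Laplace(1/epsilon) noise.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy answer to "how many records are under 40?"
ages = [23, 35, 41, 52, 38, 67, 29]
noisy_answer = dp_count(ages, lambda a: a < 40, epsilon=0.5)
```

A smaller `epsilon` means more noise and stronger privacy; an analyst learns the approximate population-level count, but no single record's presence can be reliably inferred – which is the paradox the report describes.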
To ensure platforms are abiding by their own policies and transparency requirements, the group proposed public regulation.
As a starting point, the report pointed to transparency regulation models in Europe and some of the current proposals in the United States.
But when it comes to content moderation, the group cautioned against public regulation for fear it might lead to censorship. Seemingly recognizing at-risk nations like the Philippines, the report said “government demands can be as problematic as company policies in some jurisdictions.”
Regulators, too, are not foolproof. The report said there should be democratic safeguards against potential abuses or malpractice by governments and the regulators themselves. These include publicly disclosing the number and nature of personal-data requests made to companies and the content they sought to have taken down, among others.
The organization said it can play a leading role in inventing new models of public regulation and co-regulation for the much-needed global governance framework.
The report proposed legally binding transparency requirements to address online content moderation and disinformation issues.
While the group admits this won't end all the problems, "it is a necessary condition to develop a more balanced equilibrium of power" between the platforms and democratic societies. After all, such a requirement is commensurate with the power platforms hold over information ecosystems.
Proposed sanctions for non-compliance range from large fines and adverse publicity to administrative penalties.
The report also proposed the creation of a Digital Standards Enforcement Agency to enforce safety and quality standards in the digital sphere. Its proposed powers include the authority to prosecute non-compliant offenders, to enforce professional standards in software engineering (since these engineers build the platforms), and to issue non-compliance orders, among others.
This could be a contentious issue, but the organization said it could launch a feasibility study on implementing the agency.
In sum, the report's solutions cluster around four areas:
Public regulation to impose transparency requirements on platforms
A new set of baseline principles on content moderation
New approaches to the design of platforms
Safeguards for closed messaging services when they enter into a public-space logic
Camille Elemia is Rappler's lead reporter for media, disinformation issues, and democracy. She won an ILO award in 2017. She received the prestigious Fulbright-Hubert Humphrey fellowship in 2019, allowing her to further study media and politics in the US. Email firstname.lastname@example.org