
What’s in UNESCO’s guidelines for regulating digital platforms?

Victor Barreiro Jr.


What's needed to enable the safeguarding of human rights and freedom of expression online? What should digital platforms and civil society be responsible for?

MANILA, Philippines – During the Internet for Trust Global Conference of the United Nations Educational, Scientific and Cultural Organization (UNESCO), held from February 21 to 23 in Paris, the organization released and discussed the latest draft of its “Guidelines for regulating digital platforms: a multi-stakeholder approach to safeguarding freedom of expression and access to information.”

The document is being developed through discussions with various stakeholders and through public consultations. It aims to help member states and digital platforms review their processes and rules on technology, and to push for regulatory reform.

Below is an overview of the guidelines.

Creating an enabling environment

As UNESCO wrote in the draft guidelines, “Creating a safe and secure internet environment for users while protecting freedom of expression and access to information is not simply an engineering question. It is also a responsibility for societies as a whole and therefore requires whole-of-society solutions.”

To that end, states should be duty-bound to promote and guarantee freedom of expression and the right to access information. They should also avoid censoring legitimate content.

States should thus, in broad strokes, have the following in place:

  • States should respect Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR), which states that “any restrictions applied to content should have a basis in law, have a legitimate aim, and be necessary and proportional, ensuring that users’ rights to freedom of expression, access to information, equality and non-discrimination, autonomy, dignity, reputation, privacy, association, and public participation are protected.”
  • There should be ways to remedy breaches of rights.
  • States should ensure that restrictions imposed on digital platforms have a legitimate basis and are open, clear, and specific about the type, number, and legal basis of requests they make for content restrictions.
  • States should not take disproportionate measures – such as prior censorship or internet shutdowns – under the pretense of stopping disinformation or some other reason inconsistent with the ICCPR.
  • States should not impose a general monitoring obligation, or an obligation for digital platforms to take proactive measures in relation to illegal content. Platforms should not be held liable when they act in good faith and with due diligence to stop illegal content.
  • States should ensure staff of digital platforms are not subject to criminal penalties for alleged or potential breaches of regulations in relation to content moderation and curation.
  • States should complement regulation with pushes for media and information literacy in relation to digital platforms, so users can be empowered. The push for media and information literacy should take advantage of the expertise of media and information literacy experts, academics, civil society organizations, and access to information institutions.
  • Any regulatory system with responsibilities in this area should be independent and subject to external review, including legislative scrutiny, required reports and audits, transparency, and consultation with the various stakeholders (such as the platforms and their users).

What digital platforms should be doing

UNESCO says digital platforms should follow five key principles:

  • Platforms respect human rights in content moderation and curation, ensuring their policies are consistent with human rights standards, and that human moderators receive adequate support and protections for the work they’re doing.
  • Platforms are transparent, being open about how they operate, with understandable and auditable policies. Such transparency includes information about tools, systems, and processes for moderation and content curation, including in regard to automated processes.
  • Platforms empower users to understand and make informed decisions about the digital services they are using.
  • Platforms are accountable to relevant stakeholders – the users, the public, and the regulatory system – in implementing their terms of service and content policies. This includes giving users rights of redress against content-related decisions.
  • Platforms conduct due diligence on the effect their work has on human rights, including how their policies and practices impact or potentially create risks for human rights.

How intergovernmental organizations and civil society can help

Stakeholders engaged with the services of a digital platform – whether as a user, policymaker, watchdog, or in any other capacity – should play an active role in consultations on the operation of a regulatory system.

  • Civil society plays a critical role in understanding the nature of and countering abusive behavior online, as well as challenging regulation that unduly restricts freedom of expression, access to information, and other human rights.
  • Researchers can work on identifying patterns of abusive behavior and possible causes that could be addressed, as well as oversight. As such, independent institutions and researchers can support risk assessments, audits, investigations, and other types of reports on platforms’ practices and activities.
  • Media and fact-checking organizations’ role is in promoting information as a public good, as well as dealing with content that risks significant harm to democracy and the enjoyment of human rights on their own platforms.
  • Intergovernmental organizations should, according to their respective mandates, support relevant stakeholders in guaranteeing implementation of guidelines complies with international human rights law. These include giving technical assistance, monitoring and reporting human rights violations, developing relevant standards, and facilitating multi-stakeholder dialogue, among other initiatives.
  • The technical community, including those involved in engineering and data sciences, also has a role in understanding the human rights impacts and ethical impacts of what they are developing.

The 28-page draft document of guidelines can be read here.

The deadline for providing written inputs to the 2.0 version of the Guidelines is on March 8. Contributions are accepted via an online commenting platform, available here. – Rappler.com



Victor Barreiro Jr.

Victor Barreiro Jr is part of Rappler's Central Desk. An avid patron of role-playing games and science fiction and fantasy shows, he also yearns to do good in the world, and hopes his work with Rappler helps to increase the good that's out there.