
UNESCO Global Conference "Internet for Trust"


The UNESCO Secretariat is now developing Guidelines for regulating digital platforms: a multistakeholder approach to safeguarding freedom of expression and access to information (the Guidelines). The process began with the Windhoek+30 Declaration, endorsed by the UNESCO General Conference in November 2021, which recognized information as a public good and set three goals to safeguard that shared resource for all of humanity: the transparency of digital platforms; citizens empowered through media and information literacy; and media viability. This universal entitlement is both a means and an end for the fulfillment of collective human aspirations, including the 2030 Agenda for Sustainable Development. Information empowers citizens to exercise their fundamental rights, supports gender equality, and allows for participation and trust in democratic governance and sustainable development, leaving no one behind.

The focus of the Guidelines relates to freedom of expression in the areas of education, science and culture. Their aim is to support the development and implementation of regulatory processes that guarantee freedom of expression and access to information while dealing with content that is illegal and content that risks significant harm to democracy and the enjoyment of human rights. They may serve as a resource for stakeholders, and they will inform regulatory processes under development, which must be open, transparent, multistakeholder, evidence-based, and consistent with international human rights standards. The Guidelines will enrich and support a global multistakeholder shared space to debate and share good practices, serve as a tool for all relevant stakeholders, and add to existing evidence-based policy approaches.


The approach to regulation of digital platforms must deal with illegal content and content that risks harm to democracy, focusing on the systems and processes used by platforms rather than expecting the regulatory system to judge the appropriateness or legality of individual pieces of content. Specific decisions about the legality of specific content should remain open to review by a judicial body. Moreover, digital platforms are expected to be transparent, while key media and information literacy skills for users must be promoted. To accomplish these regulatory goals, all stakeholders involved have a role in sustaining an enabling environment for freedom of expression and the right to information, while dealing with content that risks significant harm to democracy and the enjoyment of human rights. It is therefore crucial to create a safe and secure internet environment for users, while protecting freedom of expression and access to information - a responsibility that belongs to society as a whole. Digital platforms should comply with five key principles:

  1. Platforms respect human rights in content moderation and curation

  2. Platforms are transparent

  3. Platforms empower users to understand and make informed decisions

  4. Platforms are accountable to relevant stakeholders

  5. Platforms conduct human rights due diligence, evaluating the risks and impact of their policies and practices on human rights


Intergovernmental organizations should support relevant stakeholders in guaranteeing the implementation of these guidelines. Civil society plays a critical role in understanding the nature of abusive behavior online and countering it, as well as in challenging regulation that unduly restricts freedom of expression, access to information, and other human rights. Researchers have a role in supporting risk assessments, audits, investigations, and other types of reports on platforms’ practices and activities. Media and fact-checking organizations have a role in promoting information as a public good and in dealing with content that risks significant harm to democracy and the enjoyment of human rights on their platforms. Engineers, data scientists, and the broader technical community have a role in understanding the human rights and ethical impacts of the products and services they are developing. All stakeholders should have an active role in consultations on the operation of the regulatory systems.


The guidelines are meant to be generally applicable to any system of regulation, irrespective of its specific modalities, and accept that local contexts will shape how regulation is enacted and implemented. All processes should be open and transparent, including multistakeholder consultation. The key characteristic of the independent regulator model is decision-making independence: UNESCO has highlighted that an independent authority is better placed to act impartially in the public interest and to avoid undue influence from political or industry interests. The regulatory system should primarily focus on the systems and processes used by digital platforms to moderate and curate content. It should look at how digital platforms promote freedom of expression and access to information and at the measures they have established to deal with illegal content and content that risks significant harm to democracy and the enjoyment of human rights. The regulatory system should therefore have the power to assess applications or perform inspectorial, investigative, or other compliance functions over digital platforms. Digital platforms should respect human rights and adhere to international human rights standards, and are expected to have structures and processes in place to do so. They are accountable to regulatory systems in the following areas:

  • Content moderation and curation policies. Structures and processes should be applied consistently and fairly across all regions and languages. Human content moderation: moderators should be adequately trained, sufficiently staffed, fluent in the language concerned, vetted, and psychologically supported. Accountability also covers the use of automated systems for content moderation and curation.

  • Platforms are transparent. Digital platforms should report to the regulatory system on how they fulfill the principles of transparency and explicability, and report against what they say they do in their terms of service and community standards. This covers meaningful transparency; transparency in relation to terms of service; transparency in relation to content moderation and curation policies and practices; transparency in relation to user complaints mechanisms; transparency of commercial dimensions; and data access for research purposes. Platforms should give vetted researchers access to the non-personal and anonymized data necessary to undertake research on content and understand the impact of digital platforms.

  • Platforms empower users. Platforms should demonstrate how users can report potential abuses of their policies, have the means to understand local context and local conditions when responding to user complaints, and ensure that their systems are designed in a culturally sensitive way. This includes media and information literacy; language and accessibility - ensuring that content that risks significant harm to democracy and human rights is not amplified by automated curation or recommendation mechanisms simply because those mechanisms lack linguistic capacity; and children’s rights.

  • Platforms are accountable to relevant stakeholders. This includes the use of automated tools, and user appeal and redress - the appeals mechanism should follow the seven principles outlined in the UN Guiding Principles on Business and Human Rights for effective complaints mechanisms: legitimacy, accessibility, predictability, equitability, transparency, rights-compatibility, and continuous learning.

  • Platforms conduct human rights due diligence. This includes human rights safeguards and risk assessment: platforms should conduct periodic risk assessments to identify and address any actual or potential adverse impact of their operations on human rights. Risk assessments should also be undertaken to protect the exercise of speech by minority users and to protect journalists and human rights defenders. It includes specific measures to fight gendered disinformation and online gender-based violence - women in public life are targeted by disinformation, fake stories, sexual harassment and threats. To fight gendered disinformation and online gender-based violence, digital platforms should conduct annual human rights and gender impact assessments, use privacy-enhancing technology to identify algorithmic amplification of gendered disinformation, create engineering teams of men and women who are trained to develop algorithmic solutions to different forms of gendered disinformation, and develop inclusive, structured community feedback mechanisms to eliminate gender bias in generative AI. It also includes specific measures for the integrity of elections: digital platforms should have a specific risk assessment process for any election event, such as considering the users and levels of influence of advertising. Finally, it includes specific measures in emergencies, conflict, and crisis: platforms should have risk assessment and mitigation policies in place for emergencies, crises, conflicts, and other sudden world events where content that risks significant harm to democracy and the enjoyment of human rights is likely to increase and where its impact is likely to be rapid and severe.


Digital platforms have empowered societies with enormous opportunities for people to communicate, engage and learn. They offer great potential for communities in situations of social or cultural vulnerability, democratizing spaces for communication and giving diverse voices opportunities to engage with one another. But because key risks were not taken into consideration earlier, those risks have also grown over recent decades. The goal of the guidelines is to support the development and implementation of regulatory processes that guarantee freedom of expression and access to information while dealing with illegal content and content that risks significant harm to democracy and the enjoyment of human rights. The present draft guidelines were the basis for the dialogue taking place during the Internet for Trust Global Conference.


On the same topics, the Final Declaration of the XXI Infopoverty World Conference, which took place on December 3, 2021 at the UN Headquarters and online on UN Webcast, advanced the idea of a Regulator,


one able to ensure respect for human rights, the enhancement of cultural identities, and a standardization that avoids any discrimination and is inclusive of all social components beginning with the most disadvantaged. Further, this Regulator must be able to provide services to those in need in terms of affordability and sustainability for all. A Regulator which, we hope, the gathering of Nations can establish, and equip with adequate instruments.


- Final Declaration, 21st Infopoverty World Conference “How to build a fairer and more inclusive Digital Society?”
