By Mia Hem
Digital platforms have revolutionized global interactions, shaping social, cultural, and political dynamics. They connect people, foster discussions, and hold potential for positive change. However, the rise of misinformation, disinformation, and hate speech has tainted the digital landscape, posing significant threats to society. To address these challenges, the United Nations proposes a Code of Conduct for Information Integrity on Digital Platforms.
While digital platforms have amplified marginalized voices and united global movements, they have also enabled the rapid spread of falsehoods and hate speech. This darker side threatens peace, equality, and environmental well-being. Misinformation and disinformation, often spread by anonymous actors, can incite violence and undermine established facts, and together with hate speech they endanger democratic institutions, human rights, and the achievement of the Sustainable Development Goals. Disinformation has also become a lucrative business, involving both mainstream and "dark" public relations firms, while the attention economy model of digital platforms, driven by engagement algorithms, rewards and amplifies harmful content, including misinformation and hate speech.

This makes clear the need for regulation grounded in human rights. Various countries have introduced legislation to regulate digital platforms, but some laws infringe on protected speech and human rights, and governments and political figures have exploited concerns about information integrity to restrict access to information and manipulate platforms. Regulation must therefore carefully balance legality, necessity, and proportionality under human rights law. Information integrity should be promoted on the basis of international norms and standards, including human rights law, sovereignty, and non-intervention in domestic affairs. Freedom of expression, encompassing the right to seek, receive, and impart information through any media, is a fundamental human right that must be maintained and strengthened in a way that does not exploit users.
Misinformation and disinformation also have detrimental effects on health, climate action, democracy, gender equality, security, and humanitarian response. False information during crises like the COVID-19 pandemic can endanger lives. Hate speech and weaponized information incite violence and prolong conflicts. Climate mis- and disinformation impedes necessary action on the climate emergency. Disinformation erodes trust in institutions, interferes with electoral processes, and targets marginalized groups. Meanwhile, the decline of independent media exacerbates its spread, leading to information pollution and the loss of reliable news sources.
So, how can we overcome such concerns? The European Union has adopted the Digital Services Act and the Code of Practice on Disinformation to combat illegal content and address disinformation, but gaps remain in policy, transparency, and implementation. Algorithms that prioritize engagement often steer users toward polarizing or provocative content, and moderation mechanisms face challenges of their own, including outsourcing, lack of resources, and mistreatment of moderators. Advertisers should avoid funding harmful content, while independent media and fact-checking initiatives play a crucial role in countering misinformation. The implications of emerging technologies, such as generative artificial intelligence and deepfakes, must also be anticipated.

Against this backdrop, the United Nations proposes a Code of Conduct to address mis- and disinformation and hate speech online. It is built upon principles such as commitment to information integrity, respect for human rights, support for independent media, increased transparency, user empowerment, strengthened research and data access, scaled-up responses, stronger disincentives, and enhanced trust and safety. The Code would be implemented at the national level, drawing on recommendations such as refraining from disinformation and hate speech, protecting user rights, promoting transparency, enabling user empowerment, and investing in research. It emphasizes the ethical use of artificial intelligence, and it recognizes that global cooperation is essential to address these challenges effectively.
Digital platforms hold immense potential for positive impact, but the proliferation of misinformation, disinformation, and hate speech threatens societal well-being. The United Nations' proposed Code of Conduct for Information Integrity on Digital Platforms aims to counter these challenges. By upholding human rights, promoting transparency, empowering users, supporting independent media, and anticipating emerging threats, the Code seeks to strengthen information integrity and combat harmful content. Global cooperation is vital to protect the integrity of information in the digital age.