Photo: OHCHR

The Association for Progressive Communications (APC) welcomes the report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression on content regulation in the digital age, a topic of great concern to APC and our members. On the one hand, there is a great deal of “noise” in the mainstream media about so-called “fake news”, the spread of “extremist” content and various forms of unlawful speech online, along with frustration that platforms are not doing enough, quickly enough, to address it. On the other hand, we observe a fairly rushed response from platforms to increase in-house content moderation. The experience of human rights defenders and activists suggests that platforms are removing content in a manner that appears to reflect political bias.

The topic is also important to APC as we continue to seek solutions for combating online gender-based violence that are consistent with international human rights standards. Too often, the response to offensive and dangerous, though lawful, expression is censorship, in the form of takedowns, blocking, filtering or the criminalisation of content. Censorship is increasingly implemented by private actors, with little transparency or accountability, and disproportionately impacts groups and individuals who face discrimination in society – in other words, the very people and groups who look to social media platforms to amplify their voices, form associations and organise for change. For civil society, and for multistakeholder forums that deal with content regulation in the digital age more broadly, this is a useful moment to assess the strengths and shortcomings of both excessive state regulation and inadequate self-regulatory regimes in protecting the wide range of rights that internet users around the world have come to rely on, online and offline.[1]

As recognised by the Human Rights Council, the internet is an enabler of human rights. Social media platforms are increasingly where people turn to find information on which to base their opinions; to express themselves; to associate and organise with others; and to protest. They enable participation in public life, often for people who would not otherwise be able to take part. Yet the companies that operate these platforms regulate content in ways that lack clarity and consistency, and they often violate rights without accountability or remedy for users.

We value that the report examines the obligations and responsibilities of both states and companies. We commend the Special Rapporteur for addressing this issue in a way that is rooted in international human rights standards and reflects the lived experiences of people around the world, particularly groups at risk and people who are marginalised on the basis of sexual orientation or gender identity, or because of their cultural, linguistic or political contexts.

In many respects, the report’s strength lies in the questions it raises, but these questions also require a response from the actors concerned with online freedom of expression.

For example, the report recommends that companies use human rights law as the authoritative global standard for ensuring freedom of expression on their platforms, not the varying laws of states or their own private interests. We agree strongly with this stance. What still needs to be elaborated are the practical steps that companies can take to implement this recommendation, and how the UN can support that process. It is also necessary to consider how implementation of this recommendation can be followed up on and monitored effectively.

We strongly endorse the report’s recommendation that companies that run online content platforms be more transparent and publicly accountable. We support the recommendation that they, in collaboration with other stakeholders, establish a “Social Media Council” – originally suggested by ARTICLE 19 – to facilitate greater transparency and accountability. We look forward to further discussion on how this Council can operate in practice, and what immediate next steps are needed to establish it.

APC welcomes the Special Rapporteur’s reminder to states of their obligations to uphold rights in the context of online content platforms. In particular, we value that he reminds states to avoid delegating regulatory functions to private actors and to ensure that all state requests follow due process and meet the established conditions of legality, necessity, proportionality and legitimacy. The Special Rapporteur recommends that states “create an environment where companies are incentivised to uphold human rights principles.” Achieving this may require regulation, and would certainly require discussions among states, companies and civil society, and perhaps investors as well. The Special Rapporteur’s guidance in this area would be very welcome.

In many respects, the report should be seen as marking the beginning of a process that will require concerted effort from civil society organisations, companies and states. APC calls on states and companies to engage positively with the report and to be guided by it in designing and carrying out human rights impact assessments of their policies and practices. APC also calls on existing global and regional multistakeholder processes to develop solutions that are framed by and strengthen human rights frameworks, and commits to contributing to this process.

[1] From APC’s contribution to the call for input by the Special Rapporteur.

See also:

Reorienting rules for rights: A summary of the report on online content regulation by the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression

Content regulation: State responses to report on freedom of opinion and expression