Definition

Platform responsibility and accountability are terms that evolved in the context of the expanding role that social media platforms play in managing, moderating and curating online content and in removing user accounts. Not holding the carriers or hosts of online content legally liable for the content their users create and share is important to ensure freedom of expression and the free flow of information, a position reflected in the Manila Principles on Intermediary Liability. The notion of platform “responsibility and accountability” is not a retreat from this position. It implies that even if platforms are not to be held legally liable for the content they make available, they do need to take responsibility and be accountable for their own actions to manipulate, rank, filter, moderate and take down content or user accounts. Platform responsibility and accountability are also necessary for these intermediaries to live up to their responsibilities under the UN Guiding Principles on Business and Human Rights.

The problem

People use social media platforms to form opinions, express themselves, associate and organise with others, and protest. These platforms enable participation in public life, often for people who would otherwise not be able to take part. When platforms remove content or user accounts, they can constrain activism and silence dissent. Bias in how content is moderated – or not moderated – can contribute to, for example, ethnic violence offline. Misogynistic and anti-feminist speech makes online spaces hostile and unsafe for women. Disinformation is moderated in opaque ways, undermining trust in democratic processes. The companies that operate these platforms regulate content in ways that lack clarity and consistency, and they often violate rights without offering users accountability or remedy.

The change we want to see

Companies should use international human rights law as the authoritative global standard for ensuring freedom of expression and other rights on their platforms, not the varying laws of states or their own private interests.

How APC works on this issue

By contributing to policy processes and raising awareness. We will also engage critically with social media platforms, calling on them to align their terms of service (ToS) and community guidelines with international human rights standards; to be more transparent and consistent in applying their ToS and community guidelines; to provide users with access to remedy; and to be more responsive to users outside of North America and Europe, as well as those in positions of marginalisation or vulnerability. The idea of a “social media council”, proposed by ARTICLE 19, can facilitate greater transparency and accountability on the part of platforms. It would also spare civil society from having to negotiate with “one platform at a time” and from having to sign non-disclosure agreements.

Regional implications

Platforms are global and all APC regions are affected. However, civil society actors in some regions, for example Palestine, have been affected more profoundly by actions to control content taken by both governments and platforms. In addition, regions and communities whose primary language is not one that Facebook prioritises are more affected, as users there may be unable to navigate flagging systems, or may have their content taken down because content moderators lack knowledge of their language.

Some spaces and institutions to engage with
  • The United Nations Human Rights Council and its special procedures

  • Direct interaction with platforms through bodies such as the Facebook Oversight Board

  • The global Internet Governance Forum (IGF) and national and regional IGF initiatives (NRIs)

  • Digital rights events and conferences such as RightsCon

  • Content regulation discussions at the national level

Read more

Reorienting rules for rights: A summary of the report on online content regulation by the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression

Content regulation in the digital age: APC submission to the United Nations Special Rapporteur on the right to freedom of opinion and expression

APC issue paper by Dr. Mathias Vermeulen: “Online content: To regulate or not to regulate – is that the question?”

APC input to the public consultation on the Santa Clara Principles on Transparency and Accountability in Content Moderation