
Dear [Representatives from Facebook, Google, Microsoft, and Twitter],

We, the undersigned organizations, are writing in response to the call by the Global Internet Forum to Counter Terrorism (‘GIFCT’) for expressions of interest to join its Independent Advisory Committee (IAC). As human rights and civil liberties organizations, many of us have engaged with your companies directly and through GIFCT convenings over the past few years in the spirit of open and honest exchange, with the goal of promoting fundamental human rights and ensuring accountability for governmental and corporate actors alike.

In that same spirit, we write today to share some of our key concerns about the IAC specifically, and the growing role of GIFCT more broadly in regulating content online. Many of our organizations have discussed these concerns with you over the past few years, including our deep skepticism about the creation of a shared hash database and the risks involved in content removal coordination among companies; the lack of clarity over how GIFCT defines or distinguishes “terrorism,” “violent extremism,” “extremism,” and support for or incitement to them; and the increasing reference by governments to GIFCT as a quasi-official body. Unfortunately, we have yet to see GIFCT genuinely address these issues. In fact, unless GIFCT significantly changes course now to address the concerns we lay out below, we believe the participation of civil society on the IAC will be window-dressing for the real threats to human rights posed by GIFCT. For these and other reasons, our organizations will not apply for membership in the IAC at this time.

Extralegal censorship from government involvement in GIFCT

We have always been concerned that GIFCT—even though framed to us as a voluntary, industry-only entity—would ultimately be vested with some kind of governmental authority or otherwise entangled with state actors. This appears to have happened with the formalization of the IAC and its inclusion of governments as members. Governments will almost certainly use their influence in GIFCT to further leverage member companies’ “Community Guidelines” and content moderation policies as a way to secure global removal of speech.

This would not only significantly undermine formal mechanisms to hold governments and companies to account; it would also inevitably lead to greater censorship of protected speech, hinder independent journalism and research, and bury or destroy evidence that could support war crimes prosecutions. We know from years of experience and evidence that GIFCT members already remove significant amounts of protected speech under their “Community Guidelines” by using broad definitions of what constitutes, for example, “support for violent extremism,” as well as by relying upon lists issued by national governments to determine affiliation with terrorist organizations. We are deeply concerned that increasing government influence on “Community Guidelines”-based removals will result in further weakening of users’ freedom of expression.

Increasing scope and use of the shared hash database

Even before GIFCT launched in 2017, the creation of the shared hash database in December 2016 prompted many of our organizations to raise concerns about the existence of a centralized resource focused on content removal across platforms. Such a centralized repository, based on the contributions of individual companies according to their own idiosyncratic definitions of “terrorist” content, risks creating a lowest-common-denominator definition of “terrorism” and perpetuates the incorrect notion that there exists a global consensus on the meanings of “terrorist” and “violent extremist” content. Moreover, we are concerned that the definitions and taxonomy used to populate the database have been applied in a discriminatory manner. While GIFCT’s profile was raised significantly by the Christchurch Call—itself a reaction to a white supremacist attack on Christchurch’s Muslim community—we have not seen any indication that GIFCT’s focus goes beyond what it considers to be Islamist-linked violent extremist or terrorist content.

While we understand that each participating company retains the right to make individualized decisions about whether to remove any particular post from its service on the basis of its own definition of these terms, in practice we are concerned that small companies will use the database to automate removal because they do not have the resources to carry out individualized reviews. We also understand that even the largest companies sometimes automate content removal decisions. This has resulted, and will continue to result, in the destruction of evidence of war crimes and the stifling of critical expression, including speech challenging government policies, corporations, and violent extremism. Even when humans are involved in reviewing content, companies’ moderation systems have often proven unable to distinguish between material that constitutes incitement to terrorism and legitimate reporting on human rights abuses.
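To make the mechanism we describe concrete, the following is a minimal, purely illustrative sketch of hash-based automated removal, written in Python under the assumption of a simple exact-match fingerprint set; the names shared_hash_database and moderate_upload are hypothetical, and nothing here describes GIFCT’s actual, non-public implementation.

    # Purely illustrative sketch of automated, hash-based removal; an
    # assumption-laden toy, not GIFCT's actual (non-public) design.
    import hashlib

    # Hypothetical shared database: hex digests of files flagged by member companies.
    shared_hash_database: set[str] = set()

    def fingerprint(data: bytes) -> str:
        """Compute a hash of an uploaded file."""
        return hashlib.sha256(data).hexdigest()

    def moderate_upload(data: bytes) -> str:
        """Remove any upload whose hash already appears in the shared database,
        without any individualized review of the content itself."""
        if fingerprint(data) in shared_hash_database:
            return "removed"    # automated takedown, no human review
        return "published"

    # Once one company contributes a file's hash, every platform consulting the
    # shared set will remove identical copies on sight.
    shared_hash_database.add(fingerprint(b"example flagged file"))
    print(moderate_upload(b"example flagged file"))                      # removed
    print(moderate_upload(b"a report quoting or documenting the same"))  # published

Because only fingerprints circulate in such a system, a match tells the receiving platform nothing about why, or by whom, the original item was flagged, which underlies the transparency concerns we set out below.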

Persistent lack of transparency around GIFCT activity

Many of our organizations have called on GIFCT and its member companies to increase public transparency about GIFCT’s membership, activities, and relationships with governments. We welcomed the first GIFCT transparency report, published last year, but GIFCT must publish more detailed and meaningful information on a regular basis, particularly as it formalizes its relationship with government officials. We emphasize that transparency to the IAC alone is insufficient; decisions made by GIFCT member companies affect individuals and communities around the world.

In addition to our concerns about the very existence of the shared hash database, many of our organizations have repeatedly highlighted shortcomings in transparency around it. Anyone outside the GIFCT member companies has little visibility into what content is represented in the hash database. We understand that the shared hash database is a collection of hashes and not itself a repository of content that could be reviewed. This, however, does not answer the underlying concern: GIFCT is maintaining a shared content removal resource that cannot be objectively evaluated to determine whether, for example, protected speech is being censored, or evidence of war crimes or other valuable material is being destroyed.

Independence and role of NGOs

We also have several concerns relating to GIFCT, the IAC, and our role as NGOs. We deeply value our independence and our ability to speak publicly on a host of issues, including human rights, the rule of law, governance, and the impact of technology on society. Sitting with governments on the IAC could compromise our ability to do so, as some NGOs may receive funding from governments on the IAC or face surveillance or threats of reprisals from them.

Another concern is that the power dynamics between government officials, including law enforcement, and individuals representing civil society organizations will place our interactions on inherently unequal footing. Our experience with multi-stakeholder initiatives that involve the private sector, civil society, and governments is varied, but when companies act in an opaque and deferential manner towards government officials in these contexts, governments can wield extraordinary influence on the outcomes of these initiatives. Indeed, this has been our experience with GIFCT thus far: governments have been directly involved in the negotiations about the future of the GIFCT, while civil society has been relegated to a barely consulted afterthought. This dynamic can make it very difficult to prevent, modify, or reverse decisions that harm human rights or to hold governments accountable for the policies or actions they promote. We are concerned that the IAC structure will only increase governments’ influence over GIFCT.

In our view, for GIFCT to be regarded as a credible entity that seeks to protect human rights, its members should at the very least:

  • Conduct and share publicly an independent assessment of the risks to freedom of expression and other human rights that stem from GIFCT, including those related to its not-for-profit status. This assessment should include:

    • A thorough analysis of the legal landscape(s) in which GIFCT will operate, including laws that may compel GIFCT to disclose user data;

    • An analysis of the human rights risks of:

      • Government pressure and influence on participating companies to remove lawful content under their content moderation policies;

      • Use of overly broad and discriminatory criteria for removing “terrorist” or “extremist” content;

      • Use of hash-matching as a means for identifying content for automated removal or human review;

      • The lack of transparency about the contents and operation of the hash database;

      • GIFCT’s potential relationships with governments and other entities; and

    • A plan setting out how GIFCT will mitigate any identified risks, including through a credible and effective process to ensure that actions can be remedied if they harm rights.

  • Commit to accepting an independent, external audit or review of the content represented in the hash database, and take whatever steps are necessary to make such a review possible, including creating a continually updated repository of the material the hash database reflects, or asking member companies to maintain such repositories individually.

  • Prioritize making information publicly available about GIFCT’s practices, including the operation of and content reflected in the hash database.

We share these concerns and recommendations with you in a spirit of candor and with the goal of continued dialogue. We know these are complex challenges and that the global environment for protecting freedom of expression and preserving an open Internet is more fraught than ever.

Sincerely,

Access Now
Amnesty International
ARTICLE 19
Association for Progressive Communications (APC)
BlueLink Foundation
Center for Democracy & Technology
Committee to Protect Journalists
Dangerous Speech Project
Derechos Digitales
Electronic Frontier Foundation
Human Rights Watch
International Commission of Jurists
Privacy International
Ranking Digital Rights
Rights Watch (UK)
SMEX
Syrian Archive
WITNESS