The UN Human Rights Council (HRC) will hold its 50th session in Geneva from 13 June to 8 July 2022. As this latest session begins, APC reiterates the importance of supporting and strengthening the Council as a key mechanism for advancing human rights online. APC considers the HRC sessions an important opportunity to influence the development of international standards on human rights in digital contexts and to raise awareness of violations of human rights online in specific countries. APC’s priorities at this 50th session of the HRC include the following:
- Gender-based violence online, in particular the issue of gendered disinformation and violence against women journalists.
- Freedom of expression and association.
- The human rights impact of the tech sector.
The full programme of work and all the reports that will be presented can be accessed here.
The main Twitter hashtag for the session is #HRC50, and plenary sessions will be live streamed and archived here. We are @apc_news and we'll be tweeting under #InternetRights, #HumanRightsOnline, #FeministInternet and other thematic hashtags.
Mandate holder nominations and renewals
The new mandate holders who will be appointed during this session include the special procedures on freedom of religion or belief, the right to education, and torture and other cruel, inhuman or degrading treatment or punishment, as well as members of the working groups on enforced or involuntary disappearances and on human rights and transnational corporations (more information here).
APC and its partner and member organisations will be calling on the HRC to renew the mandate of the Independent Expert on protection against violence and discrimination based on sexual orientation and gender identity during its 50th session. Please contact us should you wish to join the effort to #RenewIESOGI.
Resolutions related to human rights in the digital age
APC will be following the negotiations around two main resolutions during HRC 50:
- The Core Group members (Brazil, Canada, Fiji, Namibia, the Netherlands and Sweden) will be presenting a substantive resolution on freedom of expression. The Core Group first introduced this resolution on freedom of opinion and expression (44/12) two years ago, at HRC 44; its theme was access to information, and it was adopted by consensus on 16 July 2020. The Core Group is now working on a new resolution focusing on the importance of digital literacy for freedom of expression.
- Another important resolution will be the one on the rights to freedom of peaceful assembly and of association (led by the Czech Republic, Indonesia, Lithuania, Maldives and Mexico), which should refer to the importance of the internet for the exercise of these rights, including access by all, and should include the renewal of the mandate of the respective Special Rapporteur.
Reports related to human rights and digital technology
Below, we list and summarise some of the key findings of the reports that will be presented during this session that touch on the issue of human rights and digital technologies. APC will be taking part in some of the Interactive Dialogues with relevant special procedures.
Education
In her new report, Special Rapporteur on the right to education Koumbou Boly Barry addresses the risks and opportunities of the digitalisation of education and their impact on the right to education. She insists that serious thought should be given to the place and content of digital education, its meaning and efficiency, and its impact on the health and education of children and other learners.
Education must address the essential features of availability, accessibility, acceptability and adaptability. The introduction of digital education may enhance these features or jeopardise them, depending on the context and policy measures accompanying that process.
The digitalisation of education opens up new policy choices for governments. In the context of limited budgets and austerity measures, one of the greatest challenges seems to be finding the balance between investment in the human factor, namely teachers and in-person schooling, and investment in digital technologies. According to the Special Rapporteur, however, this is a false alternative, as digitalising education must be accompanied by significant investment in the human factor, most particularly teachers, who remain key to the implementation of the right to education.
She also calls on decision makers at all levels to understand the profit-driven agenda of digital technology lobbyists and companies, who push for the rapid introduction of digital technologies in schools, and how this can negatively affect education systems for the benefit of a few.
For the Special Rapporteur, while it would be unfair to highlight only the problems, stakeholders must keep in mind that technology that is not regulated according to international human rights principles can lead to harmful dynamics. Believing that digital technology will trigger a fundamental transformation of education systems and solve all problems is expecting too much from technology, which needs active and intentional steering to produce positive changes for a better implementation of the right to education.
Under international human rights law, a number of important provisions must be respected and implemented, relating to states’ obligation to allocate the maximum of their available resources towards ensuring free, quality education, the rights to non-discrimination and equality, the prohibition of retrogressive measures, and the requirement that limitations on human rights must be legal and proportionate to a legitimate aim. Human rights within the education sphere, such as the right to privacy, must be respected.
Extreme poverty and human rights: Non-take-up of rights in the context of social protection
In his report to HRC 50, the Special Rapporteur on extreme poverty and human rights, Olivier De Schutter, addresses social protection and the challenge of “non-take-up”, and how the automation of social benefits can affect this issue.
For him, social protection is an investment that societies make to enhance resilience against shocks, create an inclusive economy and achieve multiplier effects for the realisation of human rights. Despite its potential, however, social protection benefits often go unused even though they are designed to protect individuals throughout their lives, a phenomenon known as “non-take-up”.
Within this context, the automation of benefits can reduce the administrative complexity for potential recipients and increase take-up. However, automation also carries several risks for the most excluded and vulnerable groups. In particular, mechanisms that seek to ensure the automatic provision of social protection benefits tend to rely on existing administrative data, automatically conferring a benefit or automatically identifying those eligible for a benefit based on the beneficiary being listed in a certain registry. The aims are laudable: to simplify applications and disbursements, and to ensure that administrative bodies do not require potential claimants to provide documents that another part of the administration already holds. However, those not legally registered owing to their administrative situation may not benefit from automation, which results in a paradoxical situation whereby the most vulnerable groups – people unregistered at birth, undocumented migrants, individuals without a fixed address or informal workers, among others – run the greatest risk of being excluded.
Moreover, he highlights, poverty is a dynamic condition and administrative registries may not always provide fully up-to-date information, taking into account certain life events leading to destitution. Automation is therefore desirable, provided specific care is taken that it does not lead to such exclusions and that claimants can demonstrate their eligibility through means other than their inclusion in certain databanks.
The digitalisation of processes for claiming social protection benefits may exacerbate the digital divide and may lead to more, not less, uncertainty for vulnerable groups. It can also discourage people from applying because of the reliance of online procedures on algorithms designed to detect fraud, even unrelated to the claiming of the benefit itself. “Welfare surveillance”, as it is known, may thus deter individuals from applying for benefits they would otherwise be eligible to receive.
Freedom of opinion and expression
The report to be presented by the Special Rapporteur on freedom of opinion and expression, Irene Khan, examines the opportunities, challenges and threats to media in the digital age. According to her, longstanding problems of violent attacks on and legal harassment of journalists with impunity, censorship of content and manipulation of regulatory authorities have been entrenched, aggravated and augmented by digital technology. Notable new manifestations include gender-based online violence, targeted surveillance of journalists, legislation restricting information online, “media capture” by state or corporate interests, and viral disinformation campaigns that undermine public trust in independent journalism. The challenges are multiple, complex and often interconnected.
The Special Rapporteur highlights the problem of online attacks against women journalists as one of the most serious contemporary threats to their safety, gender equality and media freedom. She notes that vicious, coordinated, highly sexualised and malicious attacks often target women from religious and ethnic minorities or gender non-conforming people.
Another important aspect raised in the report is the targeted electronic surveillance of journalists and how it poses a challenge to investigative journalism, puts the confidentiality of journalistic sources at risk and exposes both journalists and their sources to increased physical harm.
The challenge posed by the spread of fake news laws is also addressed, with the Special Rapporteur expressing concern that these laws generally fail to meet the three-pronged test of legality, legitimate aims and necessity set out in international human rights law.
In her recommendations, among others, the Special Rapporteur calls on states to refrain from compelling digital companies to restrict or remove journalistic content without judicial due process. As part of transparency reporting, digital companies should inform the public and the media about content restrictions requested by states.
Elimination of discrimination and violence against women
The report of the Working Group on discrimination against women and girls on girls’ and young women’s activism highlights that digital gender-based violence and harassment add a further layer of challenges to girls’ and young women’s activism.
Digital technologies may be used to blackmail, control, surveil, coerce, harass, humiliate or objectify girl and young women activists, including by resorting to “deep-fake” pornographic content and death threats. As a result, many victims of these practices limit their online activities, leading to self-censorship; endure stigma in their families and communities; or flee online spaces altogether. The majority of the young women and girls consulted for the report had experienced some form of targeted and gendered online abuse, including threatening messages, sexual harassment and the sharing of private images without their consent.
The Working Group points out that attacks against girl and young women activists are often orchestrated with the aim of discrediting and delegitimising them and exposing them to ridicule, contempt or defamation. In some cases, their families may prohibit them from continuing their activism because of the reputational damage that may follow. In certain countries, their very presence on social media may constitute a great risk to girls’ and young women’s personal integrity.
Large-scale data collection and algorithm-driven analysis targeting sensitive information create new threats for activists, particularly those from lesbian, gay, bisexual, transgender, intersex and queer+ communities.
Meanwhile, the Special Rapporteur on violence against women, its causes and consequences, Reem Alsalem, dedicated her report to HRC 50 to the issue of violence against Indigenous women and girls. In it, she explains that while the situation seems to be improving, there is still a lack of comprehensive, comparative and disaggregated data and statistics on violence against Indigenous women and girls at the domestic, regional and international levels. This renders it difficult to determine the full extent of violence against women, its manifestations and its consequences. In turn, this presents obstacles to developing evidence-based policies and plans to prevent gender-based violence against Indigenous women and girls and provide effective support and protection.
Terrorism and human rights
This session, the Office of the United Nations High Commissioner for Human Rights (OHCHR) is presenting a report to the HRC focusing on the impact of counter-terrorism measures on the enjoyment of the rights to equality and to non-discrimination.
A number of states have incorporated surveillance measures as a central part of their responses to terrorism. Many states have significantly expanded the powers of law enforcement and security agencies to conduct targeted and bulk or mass surveillance. They have taken advantage of new technologies, and have increasingly made demands of and imposed obligations on private companies, including internet service providers, search engine providers and social media companies, to facilitate data collection and to take specific steps to moderate content online. The report states that while the interception of communications provides a valuable source of information by which states can investigate, forestall and prosecute acts of terrorism and other serious crime, such measures often risk contravening international human rights law, specifically the rights to privacy and to non-discrimination.
The OHCHR considers that digital technologies are playing an increasing role in the fight against terrorism. At the same time, these technologies have significant human rights impacts, including on the enjoyment of the right to non-discrimination. For example, automation to facilitate data gathering and processing in mass surveillance programmes can amplify the discriminatory impacts of surveillance. The use of biometric recognition technologies, such as facial recognition, including in the fight against terrorism, entails significant risks of profiling on the basis of race, ethnicity and religion. Because of these risks, the High Commissioner has called for a moratorium on the use of facial recognition technologies in public spaces. Furthermore, private surveillance technologies, reportedly marketed for use to combat terrorism and serious crime, have been used to target people of minority backgrounds, journalists, human rights defenders, political opponents and dissidents. This situation has led the High Commissioner and the Special Rapporteur on the right to freedom of opinion and expression to call on states to implement a moratorium on the sale and transfer of such surveillance tools until compliance with human rights standards can be guaranteed.
The surveillance and moderation of content online are both key issues that highlight the role that private actors have come to play in counter-terrorism efforts.
New and emerging digital technologies, business and human rights
In its resolution 47/23, the Human Rights Council requested the OHCHR to prepare a report on the practical application of the Guiding Principles on Business and Human Rights to the activities of technology companies.
The report recalls that the Guiding Principles set out the distinct but complementary roles of states and companies in preventing and addressing human rights harms associated with business activity. They comprise three separate but mutually reinforcing pillars:
- Pillar I. The state duty to protect against human rights abuses by third parties, including businesses, through appropriate policies, regulation and adjudication.
- Pillar II. The corporate responsibility to respect human rights, by not infringing on the rights of others, and to address adverse impacts on human rights related to their activities.
- Pillar III. Access to remedy for victims of corporate-related human rights abuse through judicial or non-judicial mechanisms.
Under Pillar I, as part of their regulatory and policy functions, states are called on to adopt a “smart mix” of voluntary and mandatory measures to require companies, including in the technology sector, to respect human rights. The report reviews some concrete examples of such measures, including the enactment of legislation that will effectively require companies in all sectors to communicate about the presence and effects of human rights due diligence policies and systems. This and other developments, in particular proposals related to mandatory human rights due diligence requirements for companies, will have implications for how technology companies design, develop and sell products and services, for example by mandating greater transparency about the human rights impacts and mitigation measures put in place.
Under this same Pillar, the report highlights states’ duty to protect human rights when they act as an economic actor, including requiring human rights due diligence when financing, supporting or owning a business or when outsourcing or contracting in relation to public services or procuring goods and services. The need for policy coherence in regulations is also stressed, as well as stakeholder engagement. The OHCHR refers to concerns that some regulatory attempts intended to address adverse impacts on users of digital products and services might instead make it more difficult for intermediaries to respect the rights of their users, because of broad and vague definitions and scope; excessive penalties, including significant liability for company personnel; and requirements to remove content under strict timelines or by means of automated tools without attention to necessary safeguards.
The issue of due diligence is broadly covered under Pillar II, where the OHCHR also refers to the importance of adequate remedies. In accordance with the Guiding Principles, if harm has occurred in connection with a business’s activities, products or services, the company is expected to engage in remedial action, either through its own remediation mechanism or by participating in one. According to the report, when companies have in place credible and effective mechanisms for stakeholders to raise grievances, this can enhance the robustness of a company’s efforts to identify and assess human rights impacts.
Pillar III guidance refers to the responsiveness of judicial mechanisms to cases of human rights harms arising from the use of technologies and the role of state-based non-judicial mechanisms, such as products standards authorities, licensing authorities, regulators responsible for the implementation of data protection laws, information and privacy commissioners, state ombudsperson services, public health and safety bodies, professional standards bodies and national human rights institutions.
Privately operated mechanisms are also mentioned, in particular their extreme diversity in terms of their design and operation, the ways they can engage with affected individuals and communities, and the kinds of remedies they offer.
The report recommends broader attention to the “remedy ecosystem” for remedying technology-related harms. That is, to ensure that people affected by business-related human rights abuses have a “realistic and readily identifiable remedy pathway”, greater attention needs to be given to the ways in which different remediation mechanisms and processes interact, in order to highlight areas where greater coherence and interoperability between different types of processes (for example, judicial and non-judicial) may serve to enhance access to remedies by affected people and groups.
A number of concrete recommendations are presented at the end of the report, addressed to both states and technology companies.
Panel and other discussions to be held at the 50th session of the Human Rights Council
APC will follow three main panels and other discussions planned for this session:
- The annual full-day discussion on the human rights of women, to take place on 27 June (as per HRC resolutions 6/30 and 47/15).
- The panel discussion on the adverse impact of climate change on the full and effective enjoyment of human rights by people in vulnerable situations, to be held on 28 June (as per HRC resolution 47/24).
- The high-level panel discussion on countering the negative impact of disinformation on the enjoyment and realisation of human rights and on ensuring a human rights-based response, to take place on 28 June (as per HRC resolution 49/21).