Controversial Cases on AI in Republic of Korea

In South Korea, there have been cases where automated algorithms and AI have raised concerns about negative impacts on human rights. In particular, the opacity and discriminatory effects of recruitment AI have been debated. Issues such as the privacy violations and hate speech of the AI chatbot Lee Ruda have also raised concerns. The algorithm manipulation of Kakao Taxi and the Ministry of Justice’s unauthorized provision of travelers’ facial information for training its immigration identification and tracking AI have likewise sparked considerable social debate. In some cases, regulatory agencies such as the Korea Fair Trade Commission and the Personal Information Protection Commission have intervened and imposed administrative sanctions, but the government and some members of the National Assembly have continued attempting to deregulate under the guise of protecting and fostering the domestic AI industry.


Introduction

  • Korean big tech companies, often referred to as “indigenous portals” in Korean society, have been rapidly dominating the market with artificial intelligence (AI) products such as home appliances and automated algorithms. However, there has been no effective legal intervention to prevent their negative impact on the market and on fundamental rights, including the right to privacy.

    • The Korea Fair Trade Commission (hereafter “KFTC”) has been trying to regulate the unfair use of proprietary algorithms by big tech companies such as Naver and Kakao, but investigations normally take a long time and violations are difficult to prove. In 2020, the KFTC determined that NAVER Shopping’s self-preferencing conduct was illegal and imposed a fine of KRW 26.6 billion. This was the first case to apply the Fair Trade Act to unfair conduct carried out through an online platform’s algorithmic manipulation, but NAVER filed a lawsuit against the decision, which, as of 2023, is pending before the Supreme Court.

    • The Act on Promotion of Information and Communications Network Utilization and Information Protection has protected personal information since 1999. However, it has mainly focused on the leakage of personal information, while misuse for other purposes, such as companies’ collection and use of behavioral information, has been relatively broadly allowed.

    • In 2011, the Personal Information Protection Act (hereafter “PIPA”) was enacted as the basic law for personal information protection, and the Personal Information Protection Commission (hereafter “PIPC”) was established. However, in 2020, in response to the needs of the new technology industry, the so-called “Data 3 Acts”, three personal data protection laws, were amended to relax personal data protection regulations, making it more difficult for data subjects to exercise their rights over pseudonymised personal data.

  • Major jurisdictions such as the European Union and the United States are pursuing legislation to regulate high-risk AI.

    • Currently, no legislation exists in the Republic of Korea to prohibit or regulate high-risk AI, nor are there specific requirements for transparency and accountability in public procurement of AI for citizens.

    • The Republic of Korea has laws that prohibit discrimination based on specific characteristics such as gender, disability, and age. However, there is no comprehensive anti-discrimination law, so the standards for regulating AI bias and discrimination remain unclear. The Constitution and the National Human Rights Commission Act declare the prohibition of discrimination in principle and provide avenues for relief, but it is uncertain whether they can effectively regulate discrimination by AI. In addition, there is no legal framework to restrict AI bias and discrimination that affects a specific group of people rather than a specific individual.

  • Public institutions have been introducing automated algorithms and AI, some of which are high-risk AI. However, there is no legal system in place to ensure non-discrimination, legality, due process, and redress of rights.

    • In 2018, the Korea Student Aid Foundation, a quasi-governmental organization that provides student loans to university students, analyzed the factors that affect student loan delinquency through a “Decision Tree Analysis” and published a report on “Characteristics of Student Loan Delinquency”.

      • The publication of this analysis brought attention to the issue of discrimination against young people based on their salary level or university.

      • However, in Korean society, education and region have historically been significant grounds for discrimination. Pattern analysis that does not take this into account may stigmatize young people from certain groups and lead to further bias and discrimination when used in decision-making, including in financial services. A minimal sketch of how such an analysis can surface sensitive attributes follows below.
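To illustrate the risk described above, here is a minimal sketch of a decision-tree analysis in Python. The data is synthetic and the feature names (salary, univ_tier, loan_amount) are hypothetical assumptions for illustration; this does not reproduce the Korea Student Aid Foundation’s actual model or variables.

```python
# Minimal sketch: how a decision tree fitted on delinquency data can end
# up splitting on socially sensitive attributes. All data is synthetic and
# the feature names are hypothetical; this is not the Foundation's model.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 1000

# Hypothetical features: annual salary (KRW millions), a numeric
# "university tier" (1 = most prestigious), and loan amount.
salary = rng.normal(30, 8, n)
univ_tier = rng.integers(1, 5, n).astype(float)
loan_amount = rng.normal(10, 3, n)

# Synthetic label deliberately correlated with salary, so the fitted
# tree will pick salary up as its main split criterion.
delinquent = (salary + rng.normal(0, 5, n) < 25).astype(int)

X = np.column_stack([salary, univ_tier, loan_amount])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, delinquent)

# The printed rules expose which attributes drive the splits; when those
# attributes are proxies for class or region, reusing the tree in
# decision-making carries the discrimination risk described above.
print(export_text(tree, feature_names=["salary", "univ_tier", "loan_amount"]))
```

When a sensitive attribute, or a proxy for one such as university tier, dominates the printed split rules, reusing the tree for decisions such as loan screening would systematically disadvantage the groups it singles out.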

    • In 2019, the Seoul city government pushed to introduce so-called “robot investigators” using AI, but the project was halted after the PIPC deemed it illegal.

      • Officials in the city of Seoul, acting as special judicial police officers, investigate cases related to ‘crimes against the people’s livelihood,’ including those in food, healthcare, trademarks, loans, door-to-door sales, and real estate sectors, and send cases to prosecutors.

      • The robot investigator automatically collects and categorizes tens of thousands of online posts, both public and private, on the premise that such crimes are often committed through social media. In practice, this means that even non-Seoul residents who post something on social media that includes “Botox” or “special offer for newlyweds” will have their posts collected and reviewed for criminal relevance, as the sketch below illustrates.

      • The PIPC found that the robot investigator’s operation resembled an ‘online stop-and-frisk’ and determined that it constituted an unlawful collection of personal information without a legal basis.
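The following is a minimal sketch, in Python, of the keyword-driven collection pattern described above. The Post structure, keyword list, and sample posts are assumptions for illustration only, not the actual system’s design.

```python
# Minimal sketch of keyword-based post collection, the pattern attributed
# to the "robot investigator". Keywords and posts are illustrative only.
from dataclasses import dataclass

@dataclass
class Post:
    author_region: str  # often unknown or unverifiable in practice
    text: str

# Keywords associated with suspected "crimes against the people's livelihood".
KEYWORDS = ["botox", "special offer for newlyweds"]

def collect(posts: list[Post]) -> list[Post]:
    # Matching is on text alone: residency, context, and intent are never
    # checked, so posts by non-Seoul residents with no criminal relevance
    # are swept up as well.
    return [p for p in posts if any(k in p.text.lower() for k in KEYWORDS)]

posts = [
    Post("Busan", "Clinic ad: cheap Botox this week!"),
    Post("Seoul", "Special offer for newlyweds on apartment loans"),
    Post("Jeju", "Went hiking today, great weather"),
]
for p in collect(posts):
    print(p.author_region, "|", p.text)
# Both matching posts are collected regardless of region or actual criminality.
```

Because the match depends only on the text, over-collection is built into the design rather than being an occasional malfunction, which is what the PIPC’s ‘online stop-and-frisk’ analogy captures.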

    • In 2021, the city of Bucheon, Gyeonggi-do, developed a facial recognition tracking system that recognizes and tracks the faces of COVID-19 cases and their contacts in real time across all public CCTV cameras in the city and automatically collects their cell phone numbers from nearby base stations. However, implementation was suspended after the controversy was reported in foreign media.

    • The General Act on Public Administration, enacted in 2021, provides that “an administrative authority may impose a disposition using a fully-automated system (including systems in which artificial intelligence technologies are employed): Provided, That the same shall not apply to dispositions imposed at its discretion.”(Article 20) This enables fully automated administrative disposition using AI.

      • The PIPA, as amended in 2023, establishes a provision on data subjects’ rights regarding fully automated decisions and provides for the right to refuse such decisions or to request an explanation when they significantly affect the data subject’s rights or obligations (Article 37(2)). However, this provision does not cover automated dispositions by administrative authorities as allowed under Article 20 of the General Act on Public Administration. This exclusion creates legal ambiguity regarding the exercise of data subjects’ rights in such cases.

      • In 2023, a taxi with a maximum speed of 110 km/h was ticketed by the police for traveling at 142 km/h. A media investigation revealed that the automated enforcement equipment introduced by the local police department was faulty and had measured the speed of a vehicle in the next lane. It is estimated that there may have been more victims in the two years since the equipment was introduced.

      • Experts point out that errors in equipment and measurement methods should be monitored regularly; a minimal plausibility check of the kind such monitoring could include is sketched below.

      • AI used by the police to issue tickets, for example by measuring speed and recognizing license plate numbers, is considered high-risk.
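As one illustration of the monitoring experts call for, here is a minimal sketch in Python of a plausibility check on individual readings. The record fields, including a registered maximum speed obtained from a plate lookup, are assumptions for illustration, not an actual enforcement system’s schema.

```python
# Minimal sketch of a plausibility check for automated speed enforcement
# readings. The record fields are assumptions for illustration, not an
# actual enforcement system's schema.
from dataclasses import dataclass

@dataclass
class SpeedRecord:
    plate: str
    measured_kmh: float
    registered_max_kmh: float  # vehicle's physical maximum, from a plate lookup

def is_plausible(rec: SpeedRecord, tolerance_kmh: float = 5.0) -> bool:
    # A reading above the vehicle's physical maximum (plus a small
    # tolerance) points to an equipment or lane-assignment error and
    # should be routed to human review instead of an automatic ticket.
    return rec.measured_kmh <= rec.registered_max_kmh + tolerance_kmh

# The taxi case from the text: a 110 km/h vehicle "measured" at 142 km/h.
rec = SpeedRecord(plate="12GA3456", measured_kmh=142.0, registered_max_kmh=110.0)
if not is_plausible(rec):
    print(f"Flag {rec.plate}: {rec.measured_kmh} km/h exceeds vehicle maximum "
          f"{rec.registered_max_kmh} km/h; route to manual review.")
```

A check like this would have caught the taxi case automatically; the point is that such basic validation and regular auditing were evidently absent for two years.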

Below, we present in more detail some of the cases that have sparked controversy in Korean society, particularly around high-risk recruitment AI, general-purpose AI chatbots, platform labor, and AI for immigration control.

Read the full research here.

This publication was produced with the support of an APC subgrant, made possible by funding from the Swedish International Development Cooperation Agency (Sida).