On April 11, 2025, Billboard reported that eight individuals had been arrested in connection with the creation and distribution of deepfake content targeting the K-pop groups BTS and LE SSERAFIM. The arrests resulted from a collaborative investigation between HYBE and the Northern Gyeonggi Provincial Police Agency (NGPPA).
The suspects are accused of generating and disseminating AI-manipulated videos and images that misappropriated the likenesses of HYBE artists. In response, HYBE CEO Lee Jae-sang emphasized the company's commitment to safeguarding artist rights, stating:
“HYBE will respond firmly to crimes that infringe on our artists’ portrait rights and reputations, based on a zero-tolerance and no-settlement policy. We will continue to monitor and take legal action to eradicate such serious crimes.”
As a leading force in the K-pop industry and the home of acts such as BTS, SEVENTEEN, and LE SSERAFIM, HYBE has been proactive in addressing the surge of deepfake content. In February 2025, the company signed a memorandum of understanding with the NGPPA to expedite investigations and launch a hotline for reporting these offenses. The partnership is intended to protect artists from online exploitation and streamline legal action against offenders.
The Deepfake Dilemma: A Growing Concern in K-pop
The K-pop landscape faced a disturbing crisis in 2024, when over 200 artists, predominantly female idols, became victims of AI-generated sexually explicit images. This alarming wave intensified discussions around digital privacy rights and the safety of public figures, and it highlighted a troubling rise in cybercrime.
The arrests of the eight suspects signal a positive shift in combating the deepfake epidemic that has plagued South Korea since last year. Ho-seung Kim, NGPPA’s district chief, noted:
“Deepfake crimes that exploit the vulnerabilities of celebrities—who often find it difficult to report such offenses themselves—are on the rise. Deepfake is a serious type of crime that can destroy the daily lives of victims, and crimes targeting public figures are no exception.”
Deepfake technology leverages artificial intelligence to superimpose individuals' faces onto explicit imagery without consent. According to a 2023 study by the cybersecurity firm Security Hero, South Korea is the nation most affected by deepfake pornography, with K-pop celebrities and actresses featured in 53% of such content.
Moreover, an analysis by The Straits Times identified some 95,820 deepfake pornographic videos featuring K-pop artists and other women across ten websites. An additional 85 channels dedicated to deepfake content were found on major video platforms such as YouTube, Vimeo, and Dailymotion.
This deepfake crisis has not spared any major K-pop agency, with renowned groups like TWICE, NewJeans, BLACKPINK, and IVE also targeted. A particularly distressing incident involved unauthorized deepfake images of IVE members Jang Wonyoung and An Yujin being mistakenly posted on the group’s official Weibo account, prompting widespread public concern and an official response from their management, Starship Entertainment.
In light of recent events, multiple K-pop companies have publicly committed to cracking down on deepfake producers. JYP Entertainment, which represents artists like TWICE and ITZY, announced plans to gather all relevant evidence and pursue rigorous legal action against offenders. Similarly, YG Entertainment, the agency behind BLACKPINK and BabyMonster, is actively monitoring online platforms to eliminate illicit content while leveraging all available legal recourse.
In September 2024, President Yoon Suk-yeol called for a comprehensive investigation into the surge of deepfake pornography, noting that minors made up about 60% of the victims. As a result, law enforcement has escalated efforts to combat digital sex crimes. The National Police Agency reported that since September 2021 it has conducted 515 undercover investigations into illegal deepfake pornography, leading to the arrest of 1,415 individuals.