
S. Korea to expand scope of deepfake sex crime undercover probes

Producing or distributing deepfake porn may lead to up to 7 years in prison

University students hold signs calling on the Education Ministry and universities to protect the victims of deepfake sex crimes and roll out support measures for them during a rally held near Gwanghwamun Square in central Seoul on Oct. 18. (Newsis)

In an effort to clamp down on the rapid spread of deepfake sex crimes here, investigative authorities will be allowed to go undercover to catch digital sex offenders in cases where the victims are adults, and online platforms will be able to block sexually explicit content before regulatory review, the Korean government said Wednesday.

Rolling out a comprehensive package of strengthened measures against AI-generated sexual abuse, the government announced plans to revise the Act on Special Cases Concerning the Punishment of Sexual Crimes so that undercover investigations can be used to collect evidence against online sexual predators in crimes targeting adults. The procedure, however, will require a prosecutor's request and court approval to ensure it is legally justified.

Currently, undercover investigations are limited to deepfake sex crimes targeting children or teenagers.

This marks the first pan-governmental response plan, following concerns that related institutions lacked a coordinated strategy for tackling deepfake sex crimes in the country.

Under the plan, the government will amend the Telecommunications Business Act to allow platforms hosting deepfake material to block such images and videos first and then request a review by the Korea Communications Commission (KCC). To minimize harm and protect victims, the platform will have to remove the illegal content within 24 hours of the broadcasting regulator's request.

People caught possessing, purchasing, storing or even viewing deepfake porn content will face up to three years in prison or a fine of up to 30 million won ($21,592).

A person who creates or edits sexually explicit deepfake videos or images and distributes them online can be punished with up to seven years in prison, up from five years under the previous law. Offenders can be punished even if they had no intention of distributing the material online.

Those who use deepfake sexual material involving minors for blackmail or coercion will face minimum sentences of three and five years in prison, respectively.

The Justice Ministry is also pushing amendments to the sexual crimes punishment law that would allow the confiscation of profits gained through deepfake sex crimes. The government added that it would grant leniency to individuals who self-report their involvement in such crimes.

As deepfake material circulates online, domestic and international platform operators deemed "intermediaries providing content harmful to minors," such as Telegram, could be subject to regulation for distributing open channel links to unspecified users that could give them access to explicit material.

Corrective orders and fines will be issued against value-added service providers like Naver and Meta if they fail to prevent the distribution of illegally filmed materials and deepfake content, according to the KCC. The regulator noted that in countries like France and the UK, social media companies or online service providers are responsible for managing illegal content on their platforms.

Officials also said they would use AI technology to detect deepfake content automatically in real time. Once content is detected, the system will automatically request that platform operators delete it.

Authorities also said they plan to work with platform operators to remove or block deepfake pornographic content and to set up additional hotlines for reporting illegal activity and safeguarding victims. They plan to create a website where victims can report incidents, making it easier for them to access support services.

In addition, the government plans to strengthen cooperation with overseas-based social network platform companies so that they provide subscribers' personal information in response to official requests from Korean courts and investigative agencies.

Moreover, schools and youth facilities will teach teenagers that creating, sharing or watching sexually explicit deepfake materials is a serious crime, while universities will use various means -- such as deepfake prevention booths -- to raise awareness.

A total of 781 deepfake victims asked for help from the Advocacy Center for Online Sexual Abuse Victims in the first eight months of this year, and 288 of them were minors, according to the Women's Human Rights Institute of Korea. Also, some 387 individuals had been apprehended for deepfake sex crimes this year as of late September, according to police.

By Park Jun-hee (junheee@heraldcorp.com)