
[Editorial] Deepfakes pose real threat

Spread of explicit AI-generated fake images calls for more proactive measures

Sexually explicit fake images of Taylor Swift, widely believed to have been generated by artificial intelligence tools, spread on social media last week at a dizzying pace, deeply alarming government officials, security experts, actors and many others.

The high-profile incident can be attributed to a widely anticipated side effect of generative AI, which can create fake photos from real images circulating on the internet. The technology fosters an environment in which "deepfakes," fake images or videos that appear to be real, are used to scam or threaten innocent victims.

Only a few years ago, cyber swindlers needed at least basic editing skills to produce even a crude fake image. Now, hyper-realistic deepfake photos can be created with popular AI-based image programs and a handful of relatively simple prompts.

Access to powerful AI-based image and video tools is also expanding rapidly, aided by the network effects of the internet. As a result, deepfake images spread faster than ever.

In contrast, authorities and social media companies remain slow to identify and crack down on illegally forged photos. It is telling that a single forged image of Swift had already been viewed by millions of users on X, formerly Twitter, before the account in question was suspended. Nor did the images stay on X: they quickly reached other platforms and were shared by social media users across the world.

Coincidentally, the Swift deepfake case has provided a timely lesson for Korean policymakers and for politicians competing for seats in the forthcoming parliamentary election in April, as the country’s new law banning deepfakes in campaigns came into effect Monday.

In December, the National Assembly passed revisions to the Public Official Election Act that prohibit the use of deepfake images in election campaigns from 90 days before election day.

Those who use deepfake videos, photos or even audio to boost their electoral chances could face up to seven years in prison or a fine of up to 50 million won ($37,000).

However, the new regulation has a clear and inevitable limitation in preventing the generation and sharing of deepfake images and videos. Given how quickly users on social media and local online communities share damaging images, authorities will likely find it extremely difficult to detect, track and remove fast-evolving fake images generated in violation of the law.

It’s simple math. Far too many online users are experimenting with AI-based image-generation software, while only 62 officials have joined the newly launched AI monitoring team at the National Election Commission.

Doctored images slipping through the regulatory cracks could have a make-or-break impact on candidates, especially if a deepfake video of a rival appearing to make a damaging false statement is disseminated toward the end of the campaign period.

Asking politicians and their aides not to engage in the creation and proliferation of deepfakes is not enough. Voters must also voluntarily stay away from fake content and report any problematic images or videos they may come across to the election watchdog. Major portal sites, which run large-scale online news sections, should set up an effective system to filter out deepfakes.

Aside from elections, politicians, celebrities and ordinary people are likely to face increasingly sophisticated deepfake threats as generative AI technology advances at a gallop. Today’s deepfakes may look crude or unsophisticated in a couple of years. Policymakers should brace for far more advanced fake images and videos and take proactive measures now, including adopting new AI tools capable of detecting and removing fake content.



By Korea Herald (khnews@heraldcorp.com)