South Korean lawmakers on Thursday passed a bill that criminalises the possession and viewing of sexually explicit deepfake images and videos. The new legislation introduces strict penalties, including imprisonment and hefty fines, in response to growing public outrage over the distribution of such content.
The move follows widespread criticism of Telegram group chats where illegal deepfake videos were created and circulated, prompting calls for stronger legal measures. Under the bill, individuals found purchasing, saving, or viewing sexually explicit deepfakes could face up to three years in prison or a fine of up to 30 million won ($22,600).
Heavy penalties for deepfakes
Until now, producing sexually explicit deepfakes with the intent to distribute has been punishable by up to five years in prison or a fine of 50 million won ($37,900) under South Korea’s Sexual Violence Prevention and Victims Protection Act. The new law raises the maximum sentence to seven years in prison, regardless of intent to distribute, once it takes effect.
The bill now awaits final approval from President Yoon Suk Yeol before it can be enacted.
Rising crimes spur crackdown
South Korea has seen a significant rise in deepfake sex crime cases in recent years. The Yonhap news agency reported on Thursday that police have already handled more than 800 cases involving sexually explicit deepfakes this year alone, a staggering increase from 156 cases in 2021. Most victims and offenders in these cases are teenagers, police revealed.
In response to the growing threat, authorities have also launched an investigation into Telegram, focusing on whether the encrypted messaging app played a role in facilitating the distribution of illegal deepfake content.
Legislation against deepfakes
The surge in deepfake crimes is not unique to South Korea. Countries worldwide are grappling with the implications of this technology. In the United States, Congress is currently debating legislation aimed at giving victims of non-consensual deepfakes the right to sue, and at criminalising the publication of such material. One proposal would also compel tech companies to remove explicit deepfake content swiftly.
Earlier this year, social media platform X (formerly Twitter) temporarily blocked searches for Taylor Swift after fake sexually explicit images of the pop star spread across the platform, highlighting the global scale of the issue.