(Scatter Lab)
South Korean civic groups on Wednesday filed a petition with the country’s human rights watchdog over a now-suspended artificial intelligence chatbot, citing its prejudiced and offensive language against women and minorities.
An association of civic groups asked the National Human Rights Commission of Korea to look into human rights violations in connection with the chatbot Lee Luda, which was developed by local startup Scatter Lab.
The groups, which include the People’s Solidarity for Participatory Democracy and Lawyers for a Democratic Society, also demanded changes to current laws and urged institutions to prevent human rights violations stemming from abuse of AI technologies.
“The Lee Luda case not only constitutes a violation of the human rights of individuals, but also represents how abuse of AI technologies can have a negative impact on human rights,” the groups said in a statement.
The social media-based AI chatbot Lee Luda, which was designed to speak like a 20-year-old female university student, attracted more than 750,000 users with its realistic and natural responses. The bot learned from some 10 billion real-life conversations between young couples drawn from the country’s popular messenger app KakaoTalk.
However, the service was suspended less than a month after its launch, as members of the public lodged complaints over Luda’s use of hate speech towards sexual minorities and the disabled in conversations. Some also alleged that male users had manipulated the chatbot into engaging in sexual conversations.
The company also faced suspicions of possible violations of privacy laws in the way it collected personal data from its users, with many complaining that their real names and addresses had popped up in conversations with Luda.
The company apologized over the matter, saying that it failed to “sufficiently communicate” with its users.
Delivering policy recommendations to the human rights watchdog, the groups called for an overhaul of relevant institutions and regulations to prevent violations of privacy and freedom of expression through the abuse of AI technologies and algorithms.
“Korea is adopting new technologies in commercial sectors without question in the name of the Fourth Industrial Revolution, and there is neither legislative nor administrative basis to protect citizens’ rights,” they said.
The groups also asked the NHRCK to recommend that victims whose personal data had been used without consent in the Luda case be compensated.
By Ock Hyun-ju (laeticia.ock@heraldcorp.com)