
Effectiveness of parental control apps thrown into doubt

Mobile carriers, government regulators urged to strengthen filtering system to better protect children from harmful content amid privacy dispute

(Graphic: 123rf)

A growing number of clashes between parents and their children revolve around the use of parental control apps that filter out obscene content on online platforms, but there are concerns the apps may not be working as intended.

The prolonged COVID-19 pandemic has forced students to study from home through smartphone apps, prompting parents to install filtering apps to block harmful online content.

Parental control apps can be installed on computers and mobile devices, using a comprehensive blocklist of sites to prevent minors from accessing inappropriate content. Major parental control apps like Family Link and Mobile Fence include extra functions, such as location tracking, a record of sleeping hours and app installation controls.

Many parents worry that their children could fall prey to criminal activities instigated online. Public awareness of the way vulnerable internet users are targeted has risen since the high-profile “nth room” sex crime case sent shock waves throughout Korean society last year. The suspects reportedly used social media to lure in unsuspecting victims and ran secret chat rooms on Telegram to sell the obscene materials they produced.

But children claim parental control apps infringe upon their privacy. An online petition was filed against such apps, and in March the National Human Rights Commission of Korea decided that monitoring with such apps violates children’s human rights, touching off a round of disputes about the pros and cons of tracking mobile activities. Aside from the privacy issue, industry watchers have said there are big loopholes in the related government policy.

Under the Telecommunications Business Act, mobile carriers are required to provide tools to block harmful media and information, including pornographic sites.

But critics say questions remain over whether such preventive apps are working properly, as most underage users do not install them and the government is not channeling enough resources into strengthening the protective measures.

Experts pointed out that the apps are bound to have limitations, as they block only sites that have already been reported as harmful to children.

To expand coverage, the Korea Communications Standards Commission is preparing to unveil a standard video database to help track and block obscene videos by collecting and analyzing their digital features. But this type of filtering identifies only exact copies of videos already in the database, which means site operators can bypass it by making simple modifications to a video.
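The weakness of exact-copy matching is easy to illustrate. The short Python sketch below assumes, purely for illustration, that the database stores cryptographic hashes of known video files; the commission has not detailed its actual method. Any change to the file, however small, produces a different fingerprint, and the modified copy slips past the filter.

    # Illustrative sketch only: why exact-copy matching is easy to defeat.
    # Assumption: the blocklist stores SHA-256 hashes of known video files.
    import hashlib

    def file_fingerprint(data: bytes) -> str:
        """Return an exact-match fingerprint of a video's raw bytes."""
        return hashlib.sha256(data).hexdigest()

    # A video already registered in the blocklist database.
    blocked_video = b"...original video bytes..."
    blocklist = {file_fingerprint(blocked_video)}

    # The same video after a trivial change, such as re-encoding or
    # appending a single byte.
    modified_video = blocked_video + b"\x00"

    print(file_fingerprint(blocked_video) in blocklist)   # True  -> blocked
    print(file_fingerprint(modified_video) in blocklist)  # False -> slips through

Matching on perceptual features rather than exact file contents would be more robust to such edits, but that is a design choice the commission has not publicly committed to.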

Another aspect to consider is the enforcement of the regulations.

KT, SK Telecom and LG U+, which allow subscribers to download and install their parental control apps, are supposed to notify parents or legal guardians when the apps are deleted or remain unused for 15 days. The Korea Communications Commission is also required to monitor whether the carriers send out the obligatory notifications on time.

According to data disclosed by Rep. Jung Pi-mo at the National Assembly last year, an average of 38 percent of underage mobile users had installed parental control apps. By carrier, 54.8 percent of SK Telecom's underage subscribers had installed its app, followed by KT with 24.7 percent and LG U+ with 17.7 percent.

“The Korea Communications Commission said it’s conducting regular checkups, but the low installation rate raises questions about whether mobile carriers and the government are doing their jobs properly,” said Rep. Jung, a member of the Science, ICT, Broadcasting and Communications Committee at the National Assembly.

In addition to issues related to parental control apps, the government faces a Herculean task in sorting through the huge number of harmful sites reported by parents, civic groups and individuals.

It takes time to review obscene sites and add them to the database, and it is widely accepted that many harmful sites escape the scrutiny of parents and government officials. Because the apps depend on the government's database, filtering is not comprehensive even when they are installed, experts have pointed out.

Some software developers argue that it is time to apply filtering technology based on artificial intelligence, as automated solutions can be updated quickly and reduce the burden of tracking sites that are inappropriate for minors.

Korea’s internet giants Naver and Kakao are using AI solutions to block harmful content, as required by the revised law that makes it mandatory for platform operators to take steps to filter out obscene sites.

Naver’s filtering system, updated to its second version this month, classifies harmful content into four categories before filtering out the offending sites. Naver said the system is about 98 percent accurate in identifying problematic content.

Kakao applies deep learning technology to remove obscene content from its news, blog and online TV services.
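For readers unfamiliar with how such systems make a call, the Python sketch below shows the general shape of a classify-then-filter step: a model assigns scores to a handful of categories, and content is blocked, allowed or routed to human review depending on the top category and its confidence. The category names, threshold and review step are assumptions for illustration, not details of Naver's or Kakao's actual systems.

    # Illustrative sketch only: a four-category content classifier feeding a
    # filtering decision. Labels and threshold are hypothetical.
    from typing import Dict

    CATEGORIES = ("normal", "suggestive", "adult", "illegal")  # hypothetical labels
    BLOCK_CATEGORIES = {"adult", "illegal"}
    CONFIDENCE_THRESHOLD = 0.8  # low-confidence items go to human review

    def classify(scores: Dict[str, float]) -> str:
        """Pick the highest-scoring category from a model's output."""
        return max(scores, key=scores.get)

    def decide(scores: Dict[str, float]) -> str:
        """Return 'block', 'allow' or 'review' from category and confidence."""
        label = classify(scores)
        if scores[label] < CONFIDENCE_THRESHOLD:
            return "review"
        return "block" if label in BLOCK_CATEGORIES else "allow"

    # Example scores a deep-learning model might emit for one image or frame.
    print(decide({"normal": 0.05, "suggestive": 0.10, "adult": 0.83, "illegal": 0.02}))  # block
    print(decide({"normal": 0.55, "suggestive": 0.30, "adult": 0.10, "illegal": 0.05}))  # review

The practical advantage over a reported-site blocklist is that the model can score new content it has never seen before, rather than waiting for a site to be reviewed and added to a database.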

By Yang Sung-jin (insight@heraldcorp.com)