
Facebook unveils report on enforcement of its community standards

US social networking giant Facebook Inc. said Thursday it has released an updated report detailing its efforts to proactively detect and remove content that violates its community standards on both Facebook and Instagram.

The report contains data and metrics showing which categories account for the most violations of Facebook's rules and the number of cases that breached those standards.

The violating content spans a wide variety of categories, including child nudity and sexual exploitation, terrorist propaganda, illicit firearm and drug sales, and suicide, according to Facebook.


"Our content policy team is a global network who work 24 hours and 365 days to monitor the potentially disturbing content," Yoo Dong-yeong, an official at the Facebook Asia Pacific regional office, told reporters in Seoul.

Facebook said some 15,000 reviewers assess potentially harmful and user-reported content in more than 50 languages, including Korean.

Facebook said its ability to spot and remove potentially harmful content across both its websites and apps has vastly improved.

On Facebook, 99 percent of removed content was "proactively detected," meaning it was flagged by the company's systems before any user reported it; the remaining 1 percent came from user reports. On Instagram, 94.6 percent of removed content was proactively detected.

Some 2.5 million posts concerning suicide and self-harm were taken down from Facebook between the fourth quarter of last year and the first quarter of this year, while 845,000 pieces of similar content were removed from Instagram, according to Facebook.

Yoo said Facebook removes content that depicts suicide or self-injury, including graphic imagery that experts say could lead others to engage in similar behavior. (Yonhap)
