YouTube removed more than 58 million videos and 224 million comments in the third quarter alone for violating the site’s policies.
Hundreds of thousands of the videos were taken down because they failed to comply with YouTube’s child safety rules. YouTube also removed videos containing nudity and extremist content, a report by the Daily Mail stated.
According to YouTube, much of the content it removed this quarter was spam, which accounted for nearly 80 percent of channel take-downs. Per the report, the video-streaming unit of Alphabet Inc’s Google released the figures on Thursday to show that it is doing its best to suppress problematic content.
As the report detailed, government officials and interest groups in the United States, Asia, and Europe have been pressuring platforms like YouTube, Facebook, and other social networking services to promptly identify and take down content of an extremist or hateful nature that is likely to incite violence.
According to a proposal by the European Union, fines should be imposed on online services if they don’t take extremist material down “within one hour of a government order to do so,” the report stated.
The report also quoted an anonymous official from India’s Ministry of Home Affairs who said on Thursday that “social media firms had agreed to tackle authorities’ requests to remove objectionable content within 36 hours.”
In 2018, YouTube started issuing quarterly reports on its policy-enforcement efforts. As in the present quarter, the vast majority of content taken down during the first two quarters was spam, per YouTube.
YouTube detects objectionable content through its automated detection tools that quickly identify nudity, spam, extremist material, and any other content that violates YouTube’s guidelines.
The Daily Mail report also said that YouTube removed 10,400 videos in September this year, 90 percent of which contained content related to violent extremism, while some 279,600 videos were taken down because they violated YouTube’s child safety guidelines. The videos in question, however, had received fewer than 10 views when they were taken down.
The report said that although YouTube is taking inappropriate content down, it still faces bigger challenges with material that promotes hateful rhetoric or dangerous behavior, as the automated detection technologies for those policies are currently less effective.
The video-streaming site, therefore, relies on users to report potentially harmful comments and videos, which means such content has often already received many views before being flagged as inappropriate.