Some of the world’s most prominent companies are pulling their advertising after learning their ads were appearing on potentially exploitative videos of children on YouTube, USA Today is reporting. A Wired article by K.G. Orphanides brought these videos back into public conversation, noting that popular videos on the platform showing children in suggestive positions had accumulated hundreds of thousands of comments.
Video blogger Matt Watson also posted a video investigating the scandal, showing how YouTube’s algorithm recommends similar videos after a user views one of that nature. In addition, the comments sections appear to be a breeding ground for pedophiles, with users posting suggestive responses and even trading information with one another. YouTube says it is taking action by disabling comments on tens of millions of videos featuring minors, removing thousands of inappropriate comments on such videos, and terminating more than 400 channels over inappropriate comments. The platform is also reporting illegal comments to the National Center for Missing & Exploited Children and has disabled auto-complete suggestions that made it easy for users to discover such content.
While this is a good start, Haley Halverson, the vice president of advocacy and outreach at the National Center on Sexual Exploitation, still says that YouTube is “monetiz[ing] videos that eroticize young children and that serve as hubs for pedophiles to network and trade information and links to more graphic child pornography.” AT&T, Epic Games, Disney, Nestle, Hasbro, and Kellogg are a few of the many companies pulling their advertisements from the site following these revelations. Many of the companies, AT&T, Epic Games, Kellogg, and Hasbro among them, have released official statements saying they will keep their ads off the site until the issue is under control.
Along with updating its community guidelines to specify how it catches and removes inappropriate content, YouTube has reportedly reached out to advertising agencies to inform them of the company’s plans to hold channel operators accountable for moderating the comments on their uploaded videos.
“There’s more to be done, and we continue to work to improve and catch abuse more quickly,” YouTube said in a statement.
Unfortunately, given the massive size of the site, it may not be realistic for YouTube to catch all of the exploitative content. A Video Advertising Bureau study, “Risky Business: Exploring Brand Safety on YouTube,” pointed out as much.
“The sheer number of videos results in a platform that is very long tail, consisting of thousands of channels of largely user-generated content with varying degrees of brand safety,” the report said. “The threat of inappropriate content also extends to YouTube’s premium content given the lack of transparency and relative creative autonomy of top influencers.”