The health and nutrition company GNC has suspended advertising on YouTube after some of its ads were seen on videos that seemed to show children in sexually exploitative positions.
The company made the announcement after a viral video exposed a network of videos of children — many uploaded by the children themselves — that were cobbled together to highlight instances where they were in revealing positions. The videos included children showing off gymnastics moves, stretching as part of physical therapy, and sitting with friends while wearing revealing clothing. The comment sections were filled with sexually explicit comments and reportedly included users trading links to child pornography.
The video caused an uproar online, with many calling out companies whose advertisements ran on the videos. On Tuesday, GNC responded.
“Thank you for reaching out,” the company wrote on Twitter. “We have a strict advertisement placement policy to ensure our ads do not run with harmful or unsafe content. We have paused advertising on YouTube while we investigate further.”
The video that sparked the uproar had been posted by a YouTuber who goes by MattWhatItIs, exposing what appeared to be a worldwide network of videos exploiting children and sharing sexually explicit content. The video showed how the site’s recommendation algorithm could be triggered to surface almost nothing but questionable content, with users from across the globe posting videos of children in sometimes explicit poses.
Many of these videos were filled with users posting timestamps that jumped directly to parts of the video where the children were in revealing positions. Commenting on some of these videos had been disabled, which suggested that YouTube was aware of the problem, yet the videos themselves were not taken down.
The video drew widespread attention, reaching the top of Reddit and trending on Twitter under the hashtag #YouTubeWakeUp.
I’m absolutely DISGUSTED at YouTube for ignoring the sexual exploitation of minors on it’s platform!!! Watch this video for proof that its algorithm facilitates pedophilia #YOUTUBEWAKEUP https://t.co/Jkwjppvu2
— Amanda Weisbrod (@amanda_weisbrod) February 18, 2019
YouTube had been called out before for allowing sexually exploitative and harmful videos on its platform, including some that featured violence and sexual content aimed at children. As BuzzFeed News noted, the site overhauled its safety and oversight practices in 2017, hiring 10,000 more moderators to oversee content.