Epic Games Pulls Fortnite Ads Following YouTube Pedophile Sex Ring Discovery (VIDEO)
It wasn’t long ago that a shocking discovery exposed a bombshell hiding within the depths of YouTube: the platform was inadvertently playing host to a legion of pedophiles. Videos across the platform were found to feature predatory child exploitation, with YouTube seemingly doing little to address the alarming situation. Now that the controversy has gone viral, advertisers have begun to take notice, among them Fortnite publisher Epic Games.
In a statement to The Verge, Epic has announced their decision to remove video ads from YouTube in protest of these recent findings. Specifically halting “pre-roll” advertising, the company states they’ve reached out to Google and YouTube to investigate these malicious videos and “eliminate this type of content from their service.”
“We have paused all pre-roll advertising. Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service.”
The unsettling development originates from an extensive exposé by YouTuber Matt Watson (a.k.a. MattsWhatItIs), in which he details an apparent glitch in YouTube’s “Recommended Videos” algorithm that allows a culture of child exploitation and pornography to fester right under the platform’s nose. The video, seen below, demonstrates how a simple, innocuous YouTube search can easily yield unsavory results, namely a bevy of content sexualizing underage children.
Youtube is Facilitating the Sexual Exploitation of Children, and it’s Being Monetized (2019), posted by u/Mattwatson07 in r/videos
Watson’s video has been circulating on Reddit over the last few days, with the post quickly gaining thousands of upvotes and shooting to the front page of the site. The post was accompanied by a message detailing the extent of his discovery, including the presence of monetized ads appearing alongside many of these questionable videos.
“Youtube’s recommended algorithm is facilitating pedophiles’ ability to connect with each-other, trade contact info, and link to actual child pornography in the comments. I can consistently get access to it from vanilla, never-before-used Youtube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks. I have made a twenty [minute] Youtube video showing the process, and where there is video evidence that these videos are being monetized by big brands like McDonald’s and Disney.”
The exposure of YouTube’s massively compromised system looks to have struck a chord with redditors as well as advertisers, as notable brands have now seen their products indirectly associated with pedophiles. Since companies wouldn’t want their wares tainted by such depraved content, it’s likely that other advertisers will follow Epic’s lead until YouTube resolves this long-standing problem.
In the meantime, YouTube appears to finally be taking some action against this predatory behavior. A YouTube spokesperson told The Verge, “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments. Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
What do you think? Was Epic right in pulling their Fortnite ads off YouTube? Let us know in the comments below and be sure to follow Don’t Feed the Gamers on Twitter and Facebook to be informed of the latest gaming and entertainment news 24 hours a day!
Eric Hall
Phone-browsing Wikipedia in one hand and clutching his trusty controller in the other, the legendary Eric Hall spreads his wealth of knowledge as a writer for Don't Feed the Gamers. Be sure to catch his biweekly "Throwback Thursday" segment for a nostalgic look at trivia from the past.