Fortnite pauses YouTube ads in response to pedophile network

Epic Games joined several other YouTube advertising partners in pausing their pre-roll advertisements on Google's video service. For the time being, Fortnite ads are not playing on YouTube.


Epic Games is not afraid of taking principled stands, something that has become increasingly apparent as the success of Fortnite has grown the company's clout. Today, the company announced that it will pause advertising on YouTube following reports of a pedophile network actively engaging in the comments sections of monetized videos.

Epic Games is joined by Disney and Nestle in its decision to pause ads on YouTube. Reports circulating across the Internet today describe a disturbing and seemingly coordinated effort by pedophiles to target videos featuring children. Family videos posted to YouTube often feature a child who is not fully clothed, and predators have been leaving comments that are frequently nothing more than timestamps directing other viewers to moments of child nudity in the videos.

Further reports describe videos of young girls playing Twister, doing gymnastics, eating popsicles, or playing in a pool being swarmed by a horde of these degenerates. YouTube users report comments containing time codes of crotch shots, solicitations to exchange phone numbers, and promises to swap videos on other platforms (Snapchat, WhatsApp, Kik).

A spokesperson for Fortnite developer Epic Games detailed the decision to pause all pre-roll YouTube advertising, saying, “through our advertising agency, we have reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service.”

YouTube declined to comment on any specific advertisers, but did state that, "any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments."

It is important to understand that the content creators involved are not the people at fault. YouTube comments are not known for their moderation, and these reports are a stark example of the challenges Google faces as the platform becomes more influential in the world of marketing and advertising.


Asif Khan is the CEO and majority shareholder of Shacknews. He began his career in video game journalism as a freelancer in 2001. Asif is a CPA and was formerly an investment adviser representative. After much success in his own personal investments, he retired from his day job in financial services and is currently focused on new private investments. His favorite PC game of all time is Duke Nukem 3D, and he is an unapologetic fan of most things Nintendo. Asif first frequented the Shack when it was sCary's Shugashack to find all things Quake. When he is not immersed in investments or gaming he is a purveyor of fine electronic music. Asif also has an irrational love of Cleveland sports.

From The Chatty
  • reply
    February 20, 2019 3:25 PM

    Asif Khan posted a new article, Fortnite pauses YouTube ads in response to pedophile network

    • reply
      February 20, 2019 3:42 PM

      I had been busy the last few days and only heard about the "CP" deletion issue (targeting Pokemon Go or Club Penguin videos), but I finally watched Watson's video on this.

      Wow, that's... really bad. I doubt this was an intentional result of YT's recommendation system, which is based on what other users that watched similar videos would subsequently watch, but still... that's a crazy crazy wormhole.

      • reply
        February 20, 2019 4:56 PM

        I just listened to Watson's video, this is fucked up. I can see exactly how the wormhole thing functions. You take a type of video that would be incredibly unpopular with the general public, i.e. "kids doing literally anything," and you get a whole group of sick fucks who go on YouTube and watch those and only those videos. YouTube's recommendation system sees that and groups them all together, because they are never going to get grouped into anything else; they are of no use or interest to anyone else.

        You can almost replicate this effect by watching a video of any other stupidly unpopular hobby. Watch a video about making miniatures for model train displays, and welcome to the endless world of model train fanatics in the sidebar unless you actively watch other stuff. It's YouTube's recommendation system working as intended: you showed interest in something, here's everything we could find that might be similar. Only these people have co-opted that effect to, in a way, flash-mob-view videos they want added to their pool, and chances are there aren't enough other people watching these videos to ever cause them to get lumped into anything other than a pedo network.

        Completely sick.

    • reply
      February 20, 2019 4:14 PM

      Just opened up an incognito tab to check whether this "underage softcore" recommendation wormhole still happened. Searched "bikini haul", clicked through to a couple videos, then started seeing vids of very young girls and clicked through a couple of those.

      Yep. Nothing but underage softcore in the recommendations. Click on a video of a young girl enjoying a lollipop, and you'll get recommendations for little girls enjoying various different phallic foods.

      Really fuckin' weird. Like holy shit, what a PR nightmare. How has this not been fixed yet, even in some kind of slapdash bandaid fix while they work out a more permanent solution?

      All I'm seeing is that the comments are often disabled. Like great, now the pedophiles can't post timestamps and network with each other, but you don't need networking when YOUTUBE IS STILL RECOMMENDING AN ENTIRE CATALOG OF UNDERAGE SOFTCORE!!!!!!

      • reply
        February 20, 2019 4:16 PM

        Assuming that YouTube recommendations work in a manner similar to Google's search engine, it seems like it would be readily possible for a network of people all agreeing to link to or watch each other's videos to feed the algorithm biased information and throw it off.

    • reply
      February 20, 2019 4:23 PM

      I had to parse that title three times. THEN I still had to read the article. I remember that story a few months back where someone was posting videos of their teenage daughters (and their teenage friends), basically just hanging out on camera, and the channel was way more popular than it should have been and I couldn't figure out why, then I glanced at the comments and oooooh those dudes.

    • reply
      February 20, 2019 4:28 PM

      Had no idea this was a thing. Wish I still didn't, but glad people are bringing attention to the issue.

    • reply
      February 20, 2019 4:35 PM

      It’s a seriously fucked up situation and it needs to be Google’s top priority to resolve

    • reply
      February 20, 2019 4:40 PM


      • reply
        February 20, 2019 4:44 PM

        Seems like a good time to link this again, the answer apparently is they don’t.

        • reply
          February 20, 2019 5:05 PM

          This guy finds the weirdest, most fucked up, depressing shit on the internet and produces some great content about it... with the most annoying editing ever.

          • reply
            February 20, 2019 5:08 PM

            It’s strangely entertaining, cringe inducing, and informative.

      • reply
        February 20, 2019 4:54 PM

        It's designed, from the ground up, wrong.

        Never assumes bad intentions, just like PageRank never assumed SEO.

        The next generation of social software will be very different, basically assuming ISIS pedos trying to influence GOP voters at every turn.

        Problem is that ads need virality, but that virality is the attack vector for stupidity.

        Ad-free, heavily moderated, expensive networks are a potential future. I pay for ad-free YouTube and would pay more for an algo-free one - the recommendation sidebar is utter shite.

        • reply
          February 20, 2019 6:28 PM

          It should be possible to perform something like graph analysis to identify a group of highly interconnected videos, and then spot-check those videos and their comments to determine if they are part of this wormhole, without losing the benefits of the recommendation system (which nearly always shows me a positive cute animal video after watching a depressing political video).

          But that requires a lot more human involvement which I don't feel YouTube or Google really want to do.
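The graph-analysis idea above can be sketched in a few lines. This is a toy illustration, not how YouTube actually works: all data, names, and the size threshold are hypothetical. It builds a co-watch graph from per-user watch histories and flags small, isolated connected components, the pockets of videos that never get linked to anything else, for human review.

```python
from collections import defaultdict
from itertools import combinations

def co_watch_graph(histories):
    """Build an undirected co-watch graph: an edge links two videos
    whenever the same user watched both."""
    edges = defaultdict(set)
    for watched in histories.values():
        for a, b in combinations(sorted(set(watched)), 2):
            edges[a].add(b)
            edges[b].add(a)
    return edges

def isolated_clusters(edges, max_size=5):
    """Return connected components small enough to suggest an isolated
    pocket of videos that only one group of users ever watches."""
    seen, clusters = set(), []
    for start in list(edges):
        if start in seen:
            continue
        # Depth-first traversal to collect the whole component.
        stack, component = [start], set()
        while stack:
            node = stack.pop()
            if node in component:
                continue
            component.add(node)
            stack.extend(edges[node] - component)
        seen |= component
        if len(component) <= max_size:
            clusters.append(component)  # flag for human spot-checking
    return clusters

# Hypothetical watch histories: users u1-u3 only watch an isolated pocket,
# while u4-u7 watch a broad mix of news, cat, and sports videos.
histories = {
    "u1": ["v1", "v2", "v3"],
    "u2": ["v2", "v3"],
    "u3": ["v1", "v3"],
    "u4": ["news1", "news2", "cats1"],
    "u5": ["news2", "cats1", "cats2"],
    "u6": ["news1", "cats2"],
    "u7": ["news1", "sports1", "sports2"],
}
flagged = isolated_clusters(co_watch_graph(histories), max_size=3)
```

Here the v1/v2/v3 pocket gets flagged while the larger mixed-interest component does not. A real system would need density measures and far better signals than component size, which is exactly why the human spot-checking step above matters.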

      • reply
        February 20, 2019 7:11 PM

        optimize your ML algorithm for more than just engagement

      • reply
        February 21, 2019 5:34 AM

        Temporarily at least disable the recommendation engine

    • reply
      February 20, 2019 4:51 PM

      I was reading that something much worse happened to Bing a few months ago.

      Apparently certain search terms would return tons of images, plus Bing would offer up additional searches to try.

      This on top of the tumblr scandal too. Sites are struggling with this stuff right now.

    • reply
      February 20, 2019 4:52 PM

      I can see that eventually they won't allow videos with anyone under 18. Which would be awful.

      • reply
        February 20, 2019 5:04 PM

        Really? Would YouTube be awful without kids on it? That sounds great actually, could we have online multiplayer games without kids too?

      • reply
        February 20, 2019 5:08 PM

        No, what I can see is that conservative groups will LEVERAGE THE SHIT out of this situation and bring around some draconian censorship laws

        • reply
          February 20, 2019 11:55 PM

          aaaaaand anything else even vaguely related to anything that benefits big money will be shoehorned into the bill, but if you don't support it then THE DEMOCRATS ARE SIDING WITH THE PEDOPHILES!

      • reply
        February 20, 2019 10:00 PM

        That would never happen.

      • reply
        February 21, 2019 7:12 AM

        Or just disable the comment section for any video with anyone under 18 in it.

    • reply
      February 20, 2019 6:59 PM

      This is tough to wrap my head around in terms of applying a fix. It might be something we need to live with like taking my kid to the park. But maybe, I haven't understood the situation? Help

      The content creators aren't posting pedophilia. Fact?

      The pedophiles are gaming the algorithm to filter for non-pedophilia content that other pedophiles have found arousing, thereby dumping the searcher in a relevant "wormhole". However, this act makes the previously non-pedophilia content suddenly pedophilia because a pedophile watched it? Is that right?

      In other words, a gross freak jerked off to normal activities that I can witness in everyday life, e.g. a kid eating a popsicle, kids dancing like adults, etc. And that means content providers and creators must write algorithms that predict and prevent deviants and predators from getting a crazy jerk off they're going to find anywhere they look?

      • reply
        February 20, 2019 9:58 PM

        That's what I was thinking last night. How do you fix this? Does it need a fix? I don't need to watch any of these videos, but I'm sure most of it is normal kid stuff, and then you probably have parents that are aware of the situation and exploit their kids.

      • reply
        February 20, 2019 11:59 PM

        I mean, if they just made a big deal out of selectively disabling recommendations for conspiracy videos, this seems like it should be the same sort of thing. Make a tag for the kind of recommendation category of "very young girls being provocative" and fucking disable that shit, at least broadening the recommendations so if you're at "girl eating popsicle" maybe you get recommended pizza reviews or a family vlog.
