Why Are There Fake Cartoons On YouTube – And What Can Parents Do About It?

Spoof Peppa Pig and Paw Patrol videos are scaring children.

It’s estimated that 300 hours of video are uploaded to YouTube every single minute – a staggering amount of, let’s be honest, often exceedingly bland content that gets zero views. But there’s entertaining stuff in there too – stuff that appeals to tired parents wanting to occupy their kids for a while.

But among the too-long clips of not-cute-enough pets, unspectacular walkthroughs of video games and underwhelming “unboxings”, there’s a phenomenon that seems designed specifically to upset children: fake cartoons.

These disturbing videos are made to appeal to kids or are served to them algorithmically. The phenomenon has been dubbed “Elsagate”, due to the large proportion of videos featuring the Frozen princess – but Peppa Pig seems to be an often-abused character, too.

Christian Rummel via Getty Images

So, What’s Going On?

A child looks up their favourite cartoon on YouTube – Peppa Pig, let’s say – and ends up horrified by something hideous showing up in the search results or on autoplay. It isn’t new – parents reported this happening in 2017 – but it’s gained fresh attention on social media recently after one mum raised the alarm.

Chelsea Ross said she handed her son a tablet and, a few minutes later, found him watching Peppa Pig have a drug overdose. Her Facebook warning, posted on 24 February, has had nearly 30,000 shares in the past week.

Writer James Bridle’s lengthy Medium post on the subject in November 2017 puts it clearly: “To expose children to this content is abuse.” He wrote: “This is a deeply dark time, in which the structures we have built to sustain ourselves are being used against us — all of us — in systematic and automated ways.”

What’s Hidden In Fake Cartoons?

The videos often begin with legitimate (and probably stolen) content from a famous show, then, after some time has elapsed, switch to inappropriate, violent or sexual footage. Disturbing instructions for self-harm have been found buried in videos. Many retain the original audio while presenting disturbing content on-screen, so as not to alert parents.

There are also graphic parodies of well-known cartoons that open like a standard episode before segueing into newly created footage of violence and disturbing behaviour. One such video found by Wales Online, for instance, featured Paw Patrol characters committing suicide by overdosing on pills.

Other videos feature popular characters played in low-budget live action, with poor fancy-dress costumes and adult themes – like the one reported by Forbes entitled ‘Spiderman Watching Under Anna’s Skirt’. And there’s also computer-generated stuff, where algorithms that mash together popular search terms end up producing bizarre content that’s disturbing for children.

Content of this kind is viewable on YouTube’s website and app. It has also been found in the YouTube Kids app, which aims to be a safe space restricted to child-friendly content.

What Is YouTube Doing About It?

YouTube announced last month that it is working on improving its recommendations algorithm, so viewers aren’t led to inappropriate videos.

In 2017, it updated its policies to make “videos depicting family entertainment characters engaged in inappropriate behaviour” ineligible for advertising money, in a bid to dissuade people from making such videos. Of course, this only deters content made for financial reasons, not videos made deliberately to horrify.

YouTube does not allow users under 13 to create or own accounts. Where it identifies that an account may be run by someone under this age, the account is routed to its “Underage Accounts review process”: it is either terminated, or the owner is asked for legal verification that they’re over the relevant minimum age.

In response to the recent bout of parents worrying about the fake Peppa Pig cartoons, a YouTube spokesperson said in a statement to HuffPost UK: “We work to ensure the videos in YouTube Kids are family-friendly and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video.

“Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed. We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognise there’s more work to do.”

What Can Parents Do About It?

Within the YouTube and YouTube Kids apps, parents can set usage timers and block individual channels. Reporting inappropriate videos to YouTube leads to them being checked (by a real person rather than an algorithm), and any found to violate the site’s community guidelines are removed. There is also a subreddit where concerned parents share their discoveries and work on further solutions.

There are other small, practical things parents can do, according to CNET: disable the search function, which shrinks the pool of videos your child can be recommended, and create a ‘whitelist’ – a hand-picked list of videos and channels you’ve approved for your child to watch.

But, if we’re honest with ourselves as parents, we all know what the solution ultimately is – resisting the urge to hand your kid a tablet and let them amuse themselves unsupervised for hours. And there’s always the telly. CBeebies might not be everyone’s cup of tea, but it’s unlikely to traumatise anyone.