Do you know what your children are watching on YouTube Kids? Don’t just answer “cartoons” or “Peppa Pig.” What are they really watching? Have you stopped to look at the titles of the videos? Or watched any of the skits being performed?
If you’re not actively monitoring your children’s use of the platform, chances are they are seeing something you don’t want them to see. Far from the safe environment of children’s entertainment that it’s supposed to be, YouTube Kids is awash in disturbing, surreal, and even traumatizing material.
How Did This Happen?
In November 2017, writer James Bridle published an essay that went viral detailing the dark underbelly of YouTube Kids. In it, Bridle cites examples of nightmarish videos featuring beloved cartoon characters buried alive, live-action performers acting out bizarre kidnapping scenes, and a fixation on themes of pregnancy and medical procedures.
Disturbing content to be sure, but what made these videos stand out wasn’t just their troubling themes but their prevalence. They were not some aberration on the service, not the occasional mean-spirited prank uploaded by a troll like you might expect. No, these bizarre videos make up a huge swath of YouTube Kids material, they are only proliferating, and far from being obscure oddities, they include some of the most viewed content on the platform.
How did this happen? Bridle describes it as the “industrialized nightmare production” of YouTube Kids: a combination of algorithmic gaming (creating content that will be automatically picked up and featured by YouTube’s automated recommendation system), a monetization system that rewards raw view counts over any other metric, and machine production.
That’s right, machine production. One of the most unsettling things about these videos is that outside of the videos featuring live actors, the vast majority of them are created by machines. There is little to no human involvement behind the production of this material and it’s one of the chief reasons this kind of content is so prolific on the service.
When you think of animation, you think of the outstanding work of Disney or DreamWorks, with large teams of animators lovingly hand-drawing each cel or manipulating each model. That’s not the case with YouTube material made specifically to manipulate the recommendation algorithm and proliferate en masse. These videos are assembled using automated templates based on whatever is currently getting clicks. Popular themes and characters are either fed into a system or picked up automatically, and videos are generated with the actions and characters entirely swappable or customizable depending on what the video needs.
Think of it like having a big list of verbs and nouns. The animation is the verb; the characters are the nouns. When the system notices an uptick in the popularity of videos about playing catch, it pulls out its pre-made animation clip of models playing catch (often purchased as a ready-made package) and maps it onto two popular characters, such as Spider-Man and Elsa. It adds some generic music, machine-generates a nonsense title built from keywords YouTube’s algorithm will notice and respond to (“PLAYING CATCH – Elsa and Spider-Man Kids Learning Video”), and then automatically uploads it to the channel. Nobody will have seen the video before it uploads. While someone on the other end hopes to extract ad revenue from the clicks, there is no judgement or discernment as to what should be seen, because nobody is at the wheel. It’s just machines making material based on material made by other machines.
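To make the verb-and-noun assembly line concrete, here is a deliberately simplified sketch of the logic. Everything in it is invented for illustration: the clip library, the character list, and the title template are hypothetical stand-ins, and real content mills are far more elaborate. But the core loop is the same: detect a trending action, swap in popular characters, and stamp out a keyword-stuffed title with no human review.

```python
import random

# Hypothetical library of pre-made animation clips (the "verbs"),
# keyed by the trending action each one depicts.
CLIP_LIBRARY = {
    "playing catch": "clips/playing_catch.anim",
    "learning colours": "clips/learning_colours.anim",
}

# Popular characters (the "nouns"), in practice scraped from
# whatever is currently trending on the platform.
TRENDING_CHARACTERS = ["Elsa", "Spider-Man", "Darth Vader", "Shrek"]

def assemble_video(trending_action):
    """Generate a video spec with no human judgement involved."""
    clip = CLIP_LIBRARY[trending_action]          # pull the ready-made animation
    cast = random.sample(TRENDING_CHARACTERS, 2)  # swap in two popular characters
    # Keyword-stuffed title aimed at the recommendation algorithm, not at humans.
    title = (f"{trending_action.upper()} - "
             f"{cast[0]} and {cast[1]} Kids Learning Video")
    return {"clip": clip, "cast": cast, "title": title}

spec = assemble_video("playing catch")
print(spec["title"])
```

Run this twice and you get two “different” videos with different character pairings from the same template, which is exactly why taking down any single video accomplishes so little.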
Setting aside the obviously unsettling content like violence or medical procedures, even generic content like nursery rhymes and singalong videos becomes warped and unhealthy under this automated system. Because the automatic video assembly system is always looking to maximize hits and visibility, it will fold as many themes and ideas into a video as possible to juice its ranking in the system. You won’t just have Spider-Man and Elsa playing catch; you’ll have nursery rhyme baby Elsa singing a song to Darth Vader, who in turn will play “Family Fingers” with Woody, Buzz, and Shrek, before showing back-to-back-to-back sequences of all dozen or so characters playing catch and identifying the colour of each ball they pass back and forth. It’s a nightmarish mish-mash of ideas and characters overlapping one another. Then, if that video does well, other automatic assembly systems will pick up the same idea, swap out the characters or add yet another bizarre element, and on and on it goes; the snake eating its own tail forever.
What is being done about it?
We all roll our eyes looking back at stodgy old-timers insisting jazz music was ruining the minds of youth, or that television was going to make the Baby Boomer generation illiterate. Technology and entertainment will always be scapegoated to a certain extent by the preceding generation that doesn’t “get it.” But this? This is different. This legitimately does seem dangerous. Who knows what kind of effect repeated exposure to this kind of sludge could have on a young mind? For obvious reasons, there are no long-term studies related to this phenomenon so there is no way to know with certainty.
What’s worse, there is no filter for what is or isn’t appropriate for children beyond YouTube’s manual reporting features, which allow individual users to flag suspicious videos. That is already an imperfect system for handling harmful material aimed at adults, but with material specifically made for toddlers and pre-schoolers, nothing will ever be reported unless a parent happens to notice what a video actually contains (easy enough to miss if a kid is just sitting quietly with a tablet). Even when videos are reported and taken down, there is channel after channel dedicated to creating this kind of content as fast as possible without any human involvement. For each video taken down, a dozen come to take its place. Understanding that, it’s easy to see how YouTube Kids has become absolutely flooded with this content.
Since James Bridle’s article hit last November and gained some mainstream attention, YouTube pledged to clean up the Kids section. They promised they would adjust their automatic sorting and recommendation system to steer children away from problematic content and make content milling operations less rewarding. While many of the most egregious videos listed in the article were indeed removed, and a few new reporting features were introduced, YouTube has failed to meaningfully follow through on their word.
According to a Wired article published last March (warning: disturbing content), YouTube Kids is worse than ever. The article details videos with even more extreme and disturbing content. The writers didn’t need to dig deep to find these videos, either: all of them were accessible simply by starting from a legitimate children’s channel such as CBeebies and following the recommended suggestions that appeared below a video. A few clicks, and they were watching videos of Paw Patrol characters attempting suicide.
Exasperatingly, these videos were not only available but insanely popular, often with millions of views to their name. To YouTube’s credit, each video the Wired team reported was promptly taken down, but what good does that really serve? By the time they were discovered and reported, these videos had already been viewed millions of times, and there are hundreds or even thousands of equally distressing videos on the service ready to take their place. There is no swimming against the tide.
How to Protect Your Young Children
Until YouTube makes fundamental changes to the way the YouTube Kids service works, the situation will not improve. And unfortunately, for as embarrassing and contentious as this problem has been for YouTube, they are not incentivized to take the steps needed to make the platform safe for children. Because YouTube relies on a never-ending supply of independent content creators regularly uploading content, it can never verify each individual creator or video. Similarly, it is dependent on its automated sorting and recommendation features. They can tweak how those work here and there, but the system will always be manipulated by channels seeking to maximize views and ad revenue.
With this in mind, the simplest and best advice is not to let small children view YouTube on their own. Only allow them to watch YouTube content under direct supervision, so you can personally intervene if the system recommends an unsettling video.
Of course, one of the attractive features of YouTube Kids is that it can serve as a distraction for a child. If you’re a busy parent who just needs 20 minutes to cook a meal or do some work, having an automated video stream that can keep them busy for a little while is a Godsend. But you need to be able to trust that the content won’t harm them.
There are a number of paid subscription services that are curated and specifically designed to be safe for children. Netflix has a kids section that restricts access to all other content while serving up hundreds of kid-friendly shows and movies. Best of all, it doesn’t cost anything to access beyond your Netflix subscription. PBS KIDS Video and DisneyNOW are also good alternatives.
If you can’t let go of YouTube Kids (perhaps there is a creator or set of videos your children enjoy), you can help shield them by creating playlists of videos you’ve already reviewed and know to be safe. In fact, by subscribing to YouTube Red, you can even download playlists of videos and store them to your device. That way when your children want to browse YouTube but you’re busy, you can set the device to airplane mode, put on the playlist, and know that they can only see videos you’ve already approved.
It’s a little extra work but well worth the effort to avoid the worst of what YouTube Kids has to offer.