TL;DR: No, YouTube is not safe, and responsible parents should pay attention to what their children watch on YouTube.


In the age of online, user-produced digital media, the debate about how to keep children safe from inappropriate content is a complicated one. It has intensified recently after some disturbing incidents on the Google-owned YouTube Kids app left some parents wondering whether the app really is kid-friendly.

YouTube Kids is an app meant for children under 13, where they can watch their favorite videos. It works much the same way as regular YouTube, but with stricter algorithms and monitoring that, at least in theory, allow only kid-friendly content onto the platform.

Malik Ducard, YouTube’s global head of family and learning content, has said that all videos are automatically scanned when they’re uploaded to the YouTube Kids app and monitored thereafter. Ducard admits that the system is not perfect, so he encourages parents to report videos they deem inappropriate. When a video is reported, it goes to one of many dedicated YouTube Kids team members, who review it and decide whether it should be removed from the app. While some parents have shown deep concern over the kinds of videos that have slipped through the cracks, Ducard points out that less than 0.005 percent of the millions of videos viewed in the app were removed for being inappropriate.

However, for some parents, the algorithm and the YouTube Kids team are not getting the job done well enough. For example, Staci Burns, a mother in Fort Wayne, Indiana, told The New York Times that her three-year-old son was watching videos of the popular cartoon PAW Patrol on the YouTube Kids app when he suddenly complained that the monsters were scaring him. It turned out that what her son had been watching was not the original cartoon but a knock-off version in which the characters were hypnotized by a demon-possessed doll and committed suicide. The video may have slipped past the filters because, to the algorithm, it looked like the original cartoon.

There are other instances of knock-off videos of popular cartoons, such as ones showing Peppa Pig being tortured at the dentist or drinking bleach. Others show characters committing lewd acts, such as urinating on each other.

The question has now become how we can prevent things like this from happening. As mentioned before, it’s not as if Google is neglecting the problem; the company is actively working to keep inappropriate media out of the reach of young children. But on a platform built essentially on user-created content, keeping out every piece of inappropriate material becomes very difficult, regardless of Google’s genuine effort.

The problem is compounded when you consider YouTube’s other algorithm, the one that essentially makes YouTube function: the search algorithm. It works like this: when someone types a word or string of words into the search bar, the algorithm pulls up results based on keywords in video titles and descriptions that match the search. Some keywords rank more highly than others because they are searched for so often, which means videos using those keywords may see more play time. Videos that get viewed more often generate more revenue, so there is a strong incentive to use those keywords.
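To make this concrete, here is a minimal toy sketch of keyword-based ranking. To be clear, this is not YouTube’s actual algorithm; the titles, keyword weights, and scoring are invented purely to show the mechanism, including how a keyword-matching knock-off can rank right alongside the genuine video.

```python
# Toy sketch of keyword-based search ranking -- NOT YouTube's real algorithm.
# Hypothetical popularity weights for frequently searched terms.
KEYWORD_WEIGHT = {"peppa": 5.0, "pig": 4.0, "episode": 2.0, "full": 1.5}

def score(query: str, title: str) -> float:
    """Sum the weights of query words that also appear in the title."""
    shared = set(query.lower().split()) & set(title.lower().split())
    return sum(KEYWORD_WEIGHT.get(word, 0.5) for word in shared)

titles = [
    "Peppa Pig Full Episode",                 # genuine upload
    "Peppa Pig Full Episode Scary Dentist",   # keyword-matching knock-off
    "Learning Shapes for Toddlers",           # unrelated video
]

query = "peppa pig full episode"
for title in sorted(titles, key=lambda t: score(query, t), reverse=True):
    print(f"{score(query, title):5.1f}  {title}")
```

Run it and the knock-off scores exactly as well as the genuine episode, because the ranking only sees the overlapping keywords, not what the video actually contains.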

This practice is known as SEO (search engine optimization) and functions the same way on YouTube as it does on Google. On Google, many companies try to game the search algorithm so they appear in the top positions of search results. Since YouTube is the second most popular search engine in the world, surpassed only by Google itself, many YouTube users take the same SEO approach on the video platform to place highly in searches. This also appears to be happening in the YouTube Kids app.

In an article on Medium, writer James Bridle points to the keyword system as a partial cause of all the weird and sometimes disturbing things that YouTube Kids turns up. He mentions videos with titles he calls “word salads”: mass jumbles of keywords. Bridle says these titles are created to capture as many highly ranked keywords as possible, so that the videos turn up frequently in search results and autoplay lists and generate revenue. He takes it one step further, arguing that, in some cases, the video content no longer determines the title; instead, the keyword-heavy title determines the content of the video. So if the keyword salad ends up being “buried alive,” “Peppa Pig,” “Aladdin,” and “suicide,” the animator may make a video to match.
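A quick toy demonstration of the word-salad effect, with invented titles and queries: a title stuffed with unrelated high-traffic keywords gets a hit from many different searches at once, while an honest title matches only one.

```python
# Toy demo of "word salad" titles -- invented examples, not real data.
def overlap(query: str, title: str) -> int:
    """Count query words that appear in the title."""
    return len(set(query.lower().split()) & set(title.lower().split()))

salad  = "Peppa Pig Aladdin Buried Alive Finger Family Learn Colors Surprise Eggs"
honest = "Peppa Pig Goes to the Dentist"

for query in ["peppa pig", "aladdin", "finger family song",
              "learn colors", "surprise eggs opening"]:
    print(f"{query!r}: salad={overlap(query, salad)}, honest={overlap(query, honest)}")
```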

The consequence is that even videos that are purposefully disturbing, such as Peppa Pig being tortured at the dentist, can rank highly alongside safe videos. And because young children are not equipped to tell the difference between, say, a real Peppa Pig episode and a disturbing knock-off, or to understand why their favorite characters are behaving strangely, even more videos slip through the YouTube Kids app’s cracks. For this reason, the question of whether YouTube Kids really is kid-friendly is a very valid one.

With more and more calls from parents to better safeguard their children, the YouTube team has taken steps to bring that 0.005 percent figure down even further, including frequently launching upgraded versions of the YouTube Kids app with tighter security. The latest iteration added the ability for parents to create kids’ profiles and enable parental controls per child, which means different rules can be set according to each child’s age. Parents can now also block certain content, and even turn off the search function so the app only plays what is suggested or has already been played.

But Google’s attempt to staunch the flow of inappropriate media on the YouTube Kids app is a deeply flawed one. At the core of the problem lies one of YouTube’s own stated values: “Freedom of Opportunity: We believe everyone should have a chance to be discovered, build a business and succeed on their own terms, and that people—not gatekeepers—decide what’s popular.”

While this is a very noble value, it’s one that pertains to adults, not children. And because the YouTube Kids app is still hinged upon this core value, the app is more geared toward protecting the ability of adults to “be discovered, build a business and succeed on their own terms” than toward keeping children safe from disturbing content.

And the problems with user-generated content are not limited to parenting. YouTube is flooded with hate-speech videos and is said to be one of the main channels through which terrorists deliver propaganda to recruit new members. In fact, many advertisers recently pulled their campaigns off YouTube because their ads were being shown alongside hate-speech videos.

But while adults can process and analyze hate-speech videos that appear in their feeds, young children, as mentioned before, cannot distinguish appropriate content from content that may traumatize them.

And consider this: YouTube has over 1 billion hours of video watched per day. Doing some back-of-the-napkin math, if 0.005 percent of that is inappropriate content, we are talking about 50,000 hours of improper content on YouTube every single day. Even taking into account that not all videos are watched by children, that still leaves a lot of room for a child to accidentally stumble into an inappropriate video.
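Here is that arithmetic spelled out, with the article’s figures as the assumptions (and the removal rate standing in as a proxy for the rate of inappropriate content):

```python
# Back-of-the-napkin math. Assumptions: ~1 billion hours watched per day,
# and 0.005 percent of it inappropriate (using the removal rate as a proxy).
hours_watched_per_day = 1_000_000_000
inappropriate_rate = 0.005 / 100           # 0.005 percent as a fraction

print(hours_watched_per_day * inappropriate_rate)  # -> 50000.0 hours per day
```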

And this is what we don’t want. We don’t want to be on the defensive when it comes to our children’s safety and development. We want to be proactive. It’s not good enough to report and blacklist a video once a child has already watched it. We should be whitelisting videos before they even turn up in front of our children’s eyes.

The fundamental problem is that Google as a company believes that algorithms can solve all problems. I agree that algorithms are very efficient and help companies scale operations while keeping costs low. A fully automated algorithmic approach to search makes total sense, because the consequences of a bad search result now and then are manageable: if Google’s search algorithm missed what you searched for 0.005 percent of the time, you would hardly notice, let alone mind, a strange result every now and then. However, 50,000 hours of improper video per day can cause psychological harm to many children.

The ideal solution would be for Google to commit to curating all content on the YouTube Kids app. This would mean that, by default, the YouTube Kids team would only allow preapproved channels, such as ones owned by reputable kids’ brands like Disney, to be watched by children. Then parents wouldn’t have to worry about a thing.

At the very least, Google could make it easy for parents to whitelist, instead of blacklist, content on the app. This way, parents could choose which channels they think are appropriate for their children and set the parental controls to allow videos only from those channels. Anything else would be blocked.
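The difference is easy to see in a few lines of Python. This is a conceptual sketch, not a real YouTube Kids API, and the channel names are made up: a whitelist denies unknown channels by default, while a blacklist lets them through until someone reports them.

```python
# Conceptual sketch of whitelist vs. blacklist filtering -- not a real
# YouTube Kids API; channel names are invented.
APPROVED = {"Disney Junior", "Sesame Street"}       # parent-chosen whitelist
BLOCKED  = {"Creepy Knockoff Cartoons"}             # reactive blacklist

def whitelist_allows(channel: str) -> bool:
    # Deny by default: unknown channels stay blocked until approved.
    return channel in APPROVED

def blacklist_allows(channel: str) -> bool:
    # Allow by default: unknown channels play until reported and blocked.
    return channel not in BLOCKED

for channel in ["Disney Junior", "Brand New Unvetted Channel"]:
    print(channel, "| whitelist:", whitelist_allows(channel),
                   "| blacklist:", blacklist_allows(channel))
```

Under the blacklist model, the brand-new unvetted channel plays for a child the moment it’s uploaded; under the whitelist model, it never appears until a parent approves it.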

But until Google fixes YouTube’s parental controls to allow whitelisting instead of blacklisting, our recommendation is to completely restrict unattended use of YouTube and the YouTube Kids app. Instead, we recommend that parents watch videos alongside their children. That way, if a video pops up that seems unsuitable, parents can skip it and even report it. This helps keep YouTube Kids more kid-friendly and also creates a space for you and your children to spend time together.

It is certainly unfortunate that Google has not taken the necessary steps to effectively protect children from inappropriate content, but it is important to remember that you do have the power to decide what your child sees and doesn’t see and to keep them safe from unwanted content by watching videos together.

To learn more about this subject, read our article with reasons why parents should pay attention to children’s digital consumption. To stay up-to-date with news regarding parenting, subscribe to our blog.


Photo by Ludovic Toinel
