Amazon, Apple, Facebook, and Google are making sure the parents of their users are happy. Are you doing the same?


In our increasingly technological world, it’s no surprise that the next generation is more and more drawn to technology. As children watch their parents, their older siblings, and maybe even their peers, turn to technology to solve problems and interact with the people in their lives, it’s easy to understand why children would want to do the same.

In fact, about 93% of 6-to-12-year-olds in the United States have access to a tablet or smartphone, and 66% have a device of their own. In a collaboration between Facebook and the National PTA, three out of every five parents surveyed said their kids under the age of 13 use messaging apps, social media, or both, while 81% reported that their children started using social media between the ages of 8 and 13.

Without a doubt, this kind of technology can be very useful. The truth is that many of us would be unable to manage the complexities of social life and business without our devices. But with young children having so much access to screens and social media, a line is being crossed. In a recent article, we wrote about how smartphones can make you less smart. Among other things, studies have shown that the mere presence of a smartphone can reduce our attention and cognitive function. Is this what we want for our children?

It may or may not be surprising that young children are so tech-savvy. What is certainly surprising, though, is how many children under the age of 13 are using social media when these sites prohibit users under that age.

This prohibition is meant to comply with the Children’s Online Privacy Protection Act (COPPA), which prohibits the collection, use, or disclosure of personal information from and about children without verifiable parental consent. COPPA and its European counterpart, the GDPR, were created to protect children’s online privacy.
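For developers, the practical upshot is straightforward: if a user might be under 13, personal data should not be collected until a parent has verifiably consented. The sketch below is purely illustrative; none of the types or function names come from a specific SDK, they only show the shape of such a gate.

```typescript
// Purely illustrative COPPA-style consent gate. These types are hypothetical;
// they only sketch the idea that data collection is blocked for under-13 users
// until a parent has completed a verifiable consent flow.

interface UserProfile {
  userId: string;
  birthDate: Date;
  parentConsentVerified: boolean; // set only after verifiable parental consent
}

const COPPA_AGE_THRESHOLD = 13;

function ageInYears(birthDate: Date, now: Date = new Date()): number {
  const age = now.getFullYear() - birthDate.getFullYear();
  const hadBirthdayThisYear =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() && now.getDate() >= birthDate.getDate());
  return hadBirthdayThisYear ? age : age - 1;
}

// Personal data may only be collected for users 13 or older, or for younger
// users whose parent has verifiably consented.
function mayCollectPersonalData(user: UserProfile): boolean {
  if (ageInYears(user.birthDate) >= COPPA_AGE_THRESHOLD) return true;
  return user.parentConsentVerified;
}

// Example with a made-up under-13 user: collection stays blocked by default.
const child: UserProfile = {
  userId: "u-001",
  birthDate: new Date(2015, 5, 1),
  parentConsentVerified: false,
};
console.log(mayCollectPersonalData(child)); // false until a parent consents
```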

But many parents think that social media sites don’t do enough to protect their children online. Nowadays, many children are more technologically capable than their parents, leaving parents feeling ill-equipped to deal with their children’s online lives.

In a recent post on the Facebook Newsroom, a parent disclosed some of her biggest fears and worries about her child’s online safety:

“I do feel overwhelmed, particularly because I’m not a big tech person. There’s a lot to keep up with, and I’m not keeping up with it.”

Facebook and other social media sites, many parents argue, do not do enough to block children under 13 from creating accounts. Moreover, the BBC reports that 8 in 10 parents whose children use social media sites such as Instagram or Snapchat aren’t even aware of the age restrictions. Facebook claims it deletes underage accounts in droves as soon as it becomes aware of a child’s age, but this might not be enough.

With such a low barrier to entry and no protections if a child does create an account, it’s understandable that many parents are concerned about how their children interact with their devices and what they may be exposed to online.

This is demonstrated most clearly by the YouTube Kids app, which is aimed at children younger than 13. The app is meant to be “kid-friendly” thanks to an algorithm that weeds out inappropriate content and makes it easy for parents to report anything unsavory that slips through the cracks. However, in our recent article about YouTube, it quickly became clear that, far from being kid-friendly, YouTube is rife with inappropriate content and has hardly any human oversight. Parents also have very little control over what their children watch on the app, having to resort to blacklisting content instead of whitelisting it.
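That difference matters more than it sounds. A blacklist allows everything except what a parent explicitly blocks, so brand-new, unreviewed content gets through by default; a whitelist allows nothing except what a parent explicitly approves, so unknown content fails closed. The tiny sketch below is our own illustration (the names are not YouTube’s API) and only makes that contrast concrete.

```typescript
// Illustrative contrast between blacklist and whitelist filtering.
// The video IDs and function names are made up, not YouTube's API.

type VideoId = string;

// Blacklist model: a video is playable unless a parent has blocked it.
// Anything the parent has never reviewed gets through by default.
function canPlayWithBlacklist(video: VideoId, blocked: Set<VideoId>): boolean {
  return !blocked.has(video);
}

// Whitelist model: a video is playable only if a parent approved it.
// Unknown content fails closed instead of failing open.
function canPlayWithWhitelist(video: VideoId, approved: Set<VideoId>): boolean {
  return approved.has(video);
}

const blocked = new Set<VideoId>(["knownBadVideo"]);
const approved = new Set<VideoId>(["pbsKidsEpisode"]);

console.log(canPlayWithBlacklist("brandNewUnreviewedVideo", blocked)); // true (slips through)
console.log(canPlayWithWhitelist("brandNewUnreviewedVideo", approved)); // false (fails closed)
```

For a parent, the second model is the only one where “I have never seen this video” defaults to “my child can’t watch it.”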

As parents have encountered more situations that leave them feeling incapable of protecting their children’s online lives, web services and social media sites such as Google, Amazon, Facebook, and YouTube have started stepping up their game when it comes to online safety for children under 13. In fact, not long after our research on YouTube, the company’s CEO announced plans to hire thousands of moderators to monitor and block inappropriate content.

Facebook is now launching Messenger Kids, a messaging app just for children under 13. Rather than giving the child an account of their own, the app works as an extension of the parent’s account. On Messenger Kids, only parents can add friends or delete messages, putting parents in the driver’s seat when it comes to their children’s online engagement.

Messenger Kids is a step towards giving control to parents, and we applaud that. However, the controls given to parents are still limited. Parents cannot limit the time kids spend using the app, for instance (a functionality Saferize offers). The app has also faced criticism from specialists, some of whom claim it is only a COPPA-compliant way for Facebook to tap into a new audience: users under 13. As Jeffrey Chester, executive director of the Center for Digital Democracy, put it in a New York Times article: “This is an attempt to create a feature that will help Facebook win over young people and keep their parents tied to the site.”
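To make the model concrete, here is a rough sketch of the kind of parent-managed child profile Messenger Kids describes, extended with the daily time limit the app currently lacks. The structure and names are our own assumption for illustration, not Facebook’s or Saferize’s actual implementation.

```typescript
// Illustrative sketch of a parent-managed child profile: the child has no
// standalone account, only a profile tied to the parent's, and only the parent
// can change the contact list. A daily usage cap is added to show the kind of
// control Messenger Kids does not yet offer. All names are hypothetical.

interface ChildProfile {
  parentAccountId: string;        // the child exists only under a parent account
  approvedContacts: Set<string>;  // contacts the parent has added
  dailyLimitMinutes: number;      // screen-time budget set by the parent
  minutesUsedToday: number;
}

// Only the parent who owns the profile may modify the contact list.
function addContact(requesterId: string, child: ChildProfile, contactId: string): boolean {
  if (requesterId !== child.parentAccountId) return false;
  child.approvedContacts.add(contactId);
  return true;
}

// The child may only message contacts the parent approved, and only while the
// daily time budget has not been used up.
function canMessage(child: ChildProfile, contactId: string): boolean {
  const withinTimeLimit = child.minutesUsedToday < child.dailyLimitMinutes;
  return withinTimeLimit && child.approvedContacts.has(contactId);
}
```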

Amazon is also taking action on parental controls. There have long been concerns about how children can safely interact with voice-activated systems, such as Alexa skills, while staying within COPPA guidelines. Amazon has tackled this problem by introducing new kids’ skills and requiring parents to complete a verification process before their children are allowed to use them. In line with COPPA, Amazon will refrain from collecting children’s data through kids’ skills. But again, there has been some criticism of the way Amazon has implemented its parental controls: since parents only have to complete the verification process once, children thereafter have access to every kids’ skill.

So while Alexa does comply with COPPA in that it doesn’t store children’s data or advertise to them, it offers very little oversight and gives parents very little control over what their children interact with in the kids’ skills universe. If Amazon were to use Saferize, parents would have control beyond the initial verification process: instead of allowing access to every kids’ skill available, they could make sure their kids only interacted with skills they had specifically approved.
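The difference between the two approaches fits in a few lines. In Amazon’s current model, one verification flips a single switch for the entire kids’ skills catalog; with per-skill approval, each skill needs its own opt-in from the parent. The sketch below is an illustration of that idea only, not Amazon’s or Saferize’s actual API.

```typescript
// Hypothetical sketch: one-time household verification vs per-skill approval.

interface HouseholdControls {
  parentVerified: boolean;        // set once, after the initial verification
  approvedSkillIds: Set<string>;  // per-skill approvals granted by the parent
}

// Roughly the current model: once the parent has verified, every kids' skill
// in the catalog is reachable.
function canUseSkillOneTimeVerification(controls: HouseholdControls, _skillId: string): boolean {
  return controls.parentVerified;
}

// Per-skill approval: verification alone is not enough; the parent must have
// approved this specific skill.
function canUseSkillPerApproval(controls: HouseholdControls, skillId: string): boolean {
  return controls.parentVerified && controls.approvedSkillIds.has(skillId);
}

const household: HouseholdControls = {
  parentVerified: true,
  approvedSkillIds: new Set(["bedtime-stories"]),
};

console.log(canUseSkillOneTimeVerification(household, "randomNewSkill")); // true
console.log(canUseSkillPerApproval(household, "randomNewSkill"));         // false
```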

But perhaps the most advanced parental control system on modern devices is Google’s Family Link. It allows parents to control which apps their kids use and how long they use the device. They can even choose a turn-off time for the device and control it remotely.

So what do all these companies have in common? They have realized that giving control to parents is not just the legal thing to do; ultimately, it is also the right thing to do from a moral standpoint. Moreover, these companies have realized that making parents happy is a smart business decision. If parents are unhappy with an app when it comes to their children’s safety, they might ban the app altogether or share their negative experience in app store reviews and with other parents. Better to make parents feel comfortable by keeping children safe.

With all of that being said, as a publisher or developer, what are you planning to do to keep children safe, keep parents comfortable, and stay within the legal guidelines for children’s online safety?

For this, we turn back to Facebook, whose study revealed its greatest takeaway: parents just want to feel they’re in control.

You don’t need to be a huge corporation with billions of dollars at your disposal to develop a parental control system like Google’s. Saferize will do this for you. Learn how, with Saferize, any company can easily provide full-fledged parental controls for its apps with minimal effort.
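To give a sense of what “minimal effort” can mean, here is a purely hypothetical sketch of an app routing a child-facing feature through an external parental-control check. It does not show Saferize’s real SDK, endpoints, or rule format; every name below is made up for illustration.

```typescript
// Purely illustrative integration with a third-party parental-control service.
// The class, method names, and rules are hypothetical and do not represent
// Saferize's actual SDK or API.

interface ParentalRules {
  dailyLimitMinutes: number;
  contentWhitelist: Set<string>;
}

// Stand-in for a remote parental-control service that a parent configures
// from their own dashboard.
class ParentalControlClient {
  constructor(private readonly apiKey: string) {}

  // In a real integration this would call the service over the network; here
  // it returns a hard-coded example so the sketch is self-contained.
  async fetchRules(_childId: string): Promise<ParentalRules> {
    return {
      dailyLimitMinutes: 60,
      contentWhitelist: new Set(["math-games", "story-time"]),
    };
  }
}

// The app asks the service before unlocking a feature for the child.
async function canOpenFeature(
  client: ParentalControlClient,
  childId: string,
  featureId: string,
  minutesUsedToday: number,
): Promise<boolean> {
  const rules = await client.fetchRules(childId);
  return minutesUsedToday < rules.dailyLimitMinutes && rules.contentWhitelist.has(featureId);
}

// Example usage with made-up identifiers.
const client = new ParentalControlClient("demo-api-key");
canOpenFeature(client, "child-123", "story-time", 25).then(console.log); // true in this sketch
```

The point is the shape of the integration: the parent configures rules in one place, and the app simply asks before unlocking anything for the child.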
