Last year, YouTube was lambasted for a wealth of videos created specifically for children that often contained sexual references, violence and exploitative content. As it turned out, the site's automatic video recommendation engine was driving children down a warren of disturbing, uncanny and upsetting images. Now, the site faces another complaint about how it manages content directed at children, as a 20-strong coalition of consumer advocacy groups prepares to file a complaint with federal officials alleging that one of the world's biggest online video sites is violating children's privacy.

Despite not being a child-friendly service (you have to be 13 years or older to use it), YouTube is a visual playground with an abundance of cartoons, unboxing videos and sing-along songs. It's blindingly obvious that millions of children regularly use it. A recent study published in a peer-reviewed medical journal even examined the impact of two-year-olds watching videos on their parents' smartphones, to see if they could be used as a learning resource. It turns out that, as entertaining as they are, videos don't actually teach your children much, if anything at all.

Under the Children’s Online Privacy Protection Act of 1998, there are certain things you can and cannot do when your website is “directed to children.” Not only do you have to consider subject matter, visual content, character choices, music, and language, among other things, but you also have to think about all the other information online services gather about us: the digital breadcrumbs that detail the intimate information of our lives.

At the moment, YouTube’s privacy settings state that anyone who watches a video on its platform consents to Google collecting data tied to their device, location, browsing habits, phone number and more. But the Children’s Online Privacy Protection Act explicitly rules that if you want to collect this kind of data on under-13-year-olds, you need their parents’ permission first.

YouTube argues that there are two versions of the site: the one that houses millions of cat videos and makeup tutorials, and the YouTube Kids app. The latter is designed for children and doesn’t allow interest-based advertising or remarketing. But Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, who is also leading the coalition, told The New York Times that YouTube had been “actively packaging under-13 content for advertisers.”

Barbie pre-roll and mid-roll ads are in abundance on the site, and Google has been pretty upfront in talking about how to grow your channel’s child audience. One of the complaint’s key issues is that, with all this in mind, YouTube is fully aware that it has a prominent audience of pre-teens and children. It’s even encouraging content creators to maximize their engagement with that demographic. Yet it refuses to protect them, all because doing so would mean curtailing the data-collection trawl that happens every time you click accept on one of its services.
