Most parents understand that the internet can be a dangerous place for children, and there are plenty of high-tech ways to minimize their exposure to undesirable content. However, none of these methods can completely sidestep the need for parents to pay attention to what their children are doing. Even if you’ve helped your child safely navigate to videos of their favorite cartoon characters on YouTube, don’t think you can sit back and relax, content in the knowledge that their impressionable young minds won’t be corrupted – it’s actually quite the opposite.
YouTube has a lot of videos you wouldn’t want your kids to see, but even those that appear to portray familiar faces like Peppa Pig and the Frozen characters are not as safe as they might sound. Some very disturbing videos masquerading as acceptable children’s viewing material are even finding their way into the YouTube Kids app, which is supposed to be a safe environment in which kids can find videos that are appropriate for their ages.
The issue was the topic of a recent Medium post by James Bridle, who addressed the problem in-depth. YouTube has been inundated with a slew of low-quality videos with algorithmically created, nonsensical titles that combine an odd and disturbing assortment of topics. Some are simply bizarre, but others can be quite harmful.
For example, popular children’s characters are being used in violent and sexual situations, and kids are being shown these videos within the YouTube Kids app. There, children can stumble across videos showing the dogs from Paw Patrol committing suicide and Peppa Pig drinking bleach straight from the bottle. There are videos that show a Claymation version of Spiderman urinating on Frozen’s Elsa and characters from Nick Jr. shows visiting strip clubs. In another video, Mickey Mouse is hit by a car and dies in a pool of blood while a horrified Minnie looks on helplessly.
Some of these videos feature real kids being tortured or otherwise appearing to be in peril, such as families playing roughly with their children. In one, a little girl’s forehead is shaved and she appears to bleed. Others put forth more controversial ideas, such as those that depict male characters like Spiderman as pregnant or dressed in drag.
Some of these videos might make their way into the app by mistake, but others are put there quite deliberately by people who get their kicks out of disturbing kids and who have found ways to slip past the algorithm.
Many parents think that the app only shows videos an actual human being has deemed appropriate, but this is not the case. Google uses a combination of automated analysis and input from users to funnel videos from the main site to the app, and it’s far from a perfect science.
Google has responded to the recent complaints about these problems by instituting some policy changes. Now, when videos are flagged from within the main YouTube app, they are automatically blocked from the YouTube Kids app and age-restricted. This means they won’t be able to display ads, which will take away the financial incentive driving some of these content creators. It’s not foolproof, however, because any inappropriate content that users don’t find and flag can pass right through to the app and upset or otherwise influence countless young minds before anyone notices it.
The bottom line is that far too many parents are using tablets and computers as babysitters for their children – and many of them are too busy being glued to their own devices to pay attention to what their kids are doing. If you want to protect your children, you can’t leave them in the hands of Google, YouTube, and other app makers. Watch what they’re doing online, and encourage them to play outdoors and spend time away from screens for their mental and physical health.
The Campaign for a Commercial-Free Childhood’s Executive Director, Josh Golin, said: “Algorithms are not a substitute for human intervention, and when it comes to creating a safe environment for children, you need humans.”