YouTube prohibits minors from streaming unless accompanied by an adult

The platform continues the fight to protect its youngest users.


YouTube is changing the way minors use the platform in its ongoing battle against child exploitation, prohibiting them from streaming without adult supervision.


YouTube revealed in a blog post today that users under the age of 13 can no longer livestream on the platform without being accompanied by an adult. Additionally, the Google-owned platform has launched new classifiers, machine learning tools that identify content, for live streams to flag videos featuring minors. Channels found in violation of these new terms could have their ability to stream removed.

YouTube has long been a popular destination for family-themed content, with family vloggers and toy reviews populating much of the site. One channel, Ryan ToysReview, surpassed 19 million subscribers simply by sharing a child's thoughts on different toys, a testament to the massive audience for family-friendly videos on the platform.

To further facilitate this participation from its younger users, YouTube launched YouTube Kids in 2015, a separate app designed specifically as a safe place for minors to explore their interests.

But things haven’t been all sunshine and rainbows. In February, user MattsWhatItIs posted a 20-minute video highlighting the sexual exploitation of children on YouTube and the platform’s role in facilitating these types of videos through its recommendation algorithm.

Earlier today, the New York Times published an investigation into YouTube’s recommendation system, which directed users looking for family-friendly content toward a catalog of videos that sexualized children. Even more disturbing, users seeking sexually explicit content may be steered toward videos featuring minors by the platform’s suggestions.

Today’s new policy isn’t the first time the platform has taken steps to protect its youngest users. Over the past few years, YouTube has disabled comments across millions of videos featuring minors and implemented a classifier to help remove twice as many comments in violation of the platform’s policies.

Earlier this year, YouTube removed over 400 channels and deleted more than 800,000 videos that promoted content in violation of the site’s child safety policies—some videos were taken down before they even reached 10 views.

YouTube’s popular competitor Twitch has already taken similar precautions to protect younger users. The site’s Terms of Service prohibit users under the age of 13 from using the platform entirely. Those between the ages of 13 and 18 can only use Twitch under the supervision of a parent or legal guardian who’s agreed to be bound by the platform’s Terms of Service.

“YouTube is a company made up of parents and families, and we’ll always do everything we can to prevent any use of our platform that attempts to exploit or endanger minors,” today’s blog post reads. “Kids and families deserve the best protection we have to offer: We’re committed to investing in the teams and technology to make sure they get it.”
