YouTube finally seems to be considering real investment in measures to keep children safe on its platform, after facing heavy criticism in recent times for making disturbing content aimed at children easily accessible and for allowing the platform to be used by pedophiles.
Back in 2017, a number of global brands pulled the plug on their YouTube ad campaigns after it was revealed that large numbers of users were leaving sexually explicit comments on videos of children, whether those videos were posted by pedophiles or by the children themselves, who had no idea what kind of attention they would attract.
“We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content. We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally,” a Mars spokesperson said at the time. “Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
YouTube promised back then that it would toughen its approach to content that “attempts to pass as family-friendly, but is clearly not”.
However, not much seems to have changed since then.
More recently, a video that was shared by YouTuber Matt Watson brought the issue back into the spotlight.
In his video, Watson detailed how what he called a “soft-core pedophile ring” was communicating in the comments of monetized videos of children (young girls in particular), sharing contact information, links to child pornography, and timestamps at which other viewers could see the girls in compromising or otherwise sexually suggestive positions.
According to Watson, YouTube’s recommendation algorithm can send users down a rabbit hole of exploitative content after they click through just a few similar videos.
Many people have confirmed his claims after replicating the process, Wired included: the outlet was soon flooded with videos of young girls swimming, eating popsicles, and doing other everyday things children do without a second thought.
A few clicks away, Wired managed to find content that was even more graphic. What’s worse is that those videos, most of them monetized, have managed to gather millions of views.
The companies whose ads have appeared on those videos, such as Nestlé and reportedly even Disney, are now finally choosing to act: by canceling campaigns entirely, they are twisting YouTube’s arm a little and forcing it to make a change.
“In the real world, we wouldn’t bring kids to an adult area and then tell them to find the kids’ space or bring them to a kids’ space and say they can go to the adult area whenever they want.”
Hemu Nigam, founder of cybersecurity advisory firm SSP Blue
According to YouTube, the company has already removed over 400 accounts belonging to people who commented on the videos, as well as some of the videos themselves. In addition, it has reported the illegal comments to the National Center for Missing and Exploited Children.
“Any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” a YouTube spokesperson told Gizmodo in an email. “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
YouTube is currently considering moving all children’s content to its separate YouTube Kids app, where it could disable the recommendation function.
But this brings another problem along with it: children’s and family-oriented content is hugely popular on the platform and rakes in quite a bit of money, with creators earning anywhere between $1,000 and $5,000 for every million views.
YouTube Kids gets only a fraction of the traffic YouTube does, which means the parents of the children who generate that cash are not happy with the idea, since the move would bring a serious drop in their earnings.
The father of a YouTube star with 2 million subscribers, who chose to remain anonymous, told MarketWatch that he “looked at a few videos that had 90% of views come from YouTube Kids. Based on those numbers, revenue would decrease by 80%. So instead of $1.00 you’d receive $0.20.”
His videos mostly feature his daughter playing with toys. He declined to tell MarketWatch how much revenue he currently makes from the channel.
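The father’s figures can be sanity-checked with a quick back-of-the-envelope calculation. Assuming, hypothetically, that 90% of a video’s views would shift to YouTube Kids while only main-platform views keep paying the full ad rate, an 80% revenue drop implies the Kids app pays roughly one-ninth of the main platform’s rate (all numbers below are illustrative, taken from or derived from his quote, not from YouTube):

```python
# Back-of-the-envelope check of the quoted figures (all rates hypothetical).
# If 90% of views would come via YouTube Kids and revenue falls from
# $1.00 to $0.20, we can solve for the implied Kids-to-main rate ratio.

kids_share = 0.90          # fraction of views from YouTube Kids (per the father)
main_share = 1 - kids_share
old_revenue = 1.00         # dollars: today, all views pay the main-platform rate
new_revenue = 0.20         # dollars: after the 80% drop he projects

# Model: new_revenue = main_share * old_revenue + kids_share * f * old_revenue,
# where f is the YouTube Kids rate as a fraction of the main-platform rate.
f = (new_revenue - main_share * old_revenue) / (kids_share * old_revenue)

print(f"Implied YouTube Kids rate: {f:.1%} of the main-platform rate")
print(f"Projected revenue drop: {1 - new_revenue / old_revenue:.0%}")
```

Under these assumptions the implied Kids rate comes out to about 11% of the main rate, which matches the “tiny fraction” other creators describe below.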
For now, YouTube says that moving the content is just “an idea,” but people like Donna Rice Hughes, president of the child internet safety non-profit Enough is Enough and a former member of the FTC’s Child Online Protection Act Commission, argue that even moving the content to another platform wouldn’t be enough.
“I would like to see YouTube and social media platforms do real age verification. They say you have to be 13, but anyone can type in a birth date that says they’re 13,” she said. “These companies need to be putting kids’ safety first.”
Some parents, while they agree with the idea of making the platform safer for children, still keep revenue at the forefront of their minds.
The father of two boys, Gabe, 13, and Garrett, 10, who share a YouTube channel with around 1.7 million subscribers, said that while he applauds the move, he doesn’t believe the change would work.
“It’s not a viable business model for those of us creating good, family-friendly content,” he stated. “The revenue generated currently from views on the YouTube Kids app is very low, a tiny fraction of the main platform.”