
YouTube's Related Video Algorithm Helpful To Predators



YouTube’s Related video algorithm has been accused of some pretty heinous things in recent years (censorship, conspiracy-theory mongering, and far-right radicalization, to name a few), but none as grave as the latest charge: helping pedophiles locate and share content that borders on soft-core pornography.

Related videos on YouTube should surface cute animal compilations or queue up new music in an Autoplay list. What the Related video algorithm should not be doing is exposing preteens to predators.

YouTuber MattWhatItIs (real name Matt Watson) brought the Internet’s attention to this particular issue late Sunday night with a video essay that hit the front page of social news sharing site Reddit by dawn. In his video titled “Youtube is Facilitating the Sexual Exploitation of Children, and it's Being Monetized (2019),” Watson revealed how clips of prepubescent girls in states of undress have been viewed millions of times, with comment sections full of predators sharing timestamps and links to other obscene material. Those unsafe videos were also running ads at press time.  

In a Reddit comment calling out YouTube, Watson wrote:

Instead of wasting resources auto-flagging videos where content creators "use inappropriate language" and cover "controversial issues and sensitive events" they should be detecting exploitative videos, deleting the content, and enforcing their established age restrictions.

Replicating Watson’s research on the Related video algorithm was incredibly easy, including finding ads on content exploitative of children. My own searches for “splits,” “gymnastics” and “bathing suit” dropped me into this disgusting recommendation bubble within three clicks.

Screengrab from YouTube

Some of the accounts uploading this kind of content are well aware of the predators in their comment sections. One account I identified, operating under the seemingly harmless moniker “Gymnastics Training,” even engaged in banter sexualizing the minors and “Liked” comments left by pedophiles.

This is not the first time YouTube has been accused of endangering children, or of failing to protect children from pedophiles in the comment sections on its site.

In 2018, a Times of London investigation determined YouTube was failing to remove exploitative live streams in a timely manner. In 2017, YouTube came under fire from its own Trusted Flagger community (and from child welfare experts) after those volunteer moderators complained about the lack of action taken on the child endangerment reports they were submitting. These reports included child grooming as well as inappropriate comments.

In an unpublished 2017 interview with one of the trusted flaggers leading that awareness campaign, the volunteer told Forbes how frustrated he was by how slowly YouTube acted on submitted reports. After the group generated 526 child endangerment reports in under 60 days, according to the volunteer, “we received in total only 15 replies, 7 of these resulted in removals with a further 8 being reviewed to find no violation of the Community Guidelines – just 2.8% of reports were ever even reviewed,” he wrote.

At the time, the moderator wrote in an email to Forbes:

They need to do more to stop the thousands of child predators using the platform, by changing their approach from being ‘reactive’ to proactive –including reporting grooming to law enforcement, employing experts to actively search for abusive users, perhaps also utilising machine learning rather than relying solely on public reports.

When reached for comment during that controversy in 2017, a YouTube spokesperson said they would be “building additional comment review functionality and adding more internal staff to manage those reviews.”

Screengrab from YouTube

It is unclear whether those tools have been implemented, two years later. One YouTube comment about a man putting his tongue inside the young girl in the “Gymnastics Training” video mentioned earlier was still up more than 12 hours after I reported it. Further, videos that clearly violate community guidelines simply because the uploader is presenting as a child under 13 (YouTube’s terms require users to be at least 13) are still up and available.

This includes videos made by children that were highlighted in a 2013 Daily Dot investigative piece about pedophiles contacting minors via YouTube comments. Again, videos from 2013 that attract predatory behavior are still up in 2019. Yes, those videos have had their comment sections disabled, but they still appear in Autoplay lists and are still getting hundreds of thousands, even millions, of views. The bigger question is why these videos are still up in the first place, given that the children who made them have been shown to be under 13.

Instead of disabling comments, the Google-owned company should remove the content created by a child AND notify the parents about the videos their child has been uploading to the site. Google knows everything about you; how hard would it be to let the adult behind that YouTube account know their child is creating content that attracts pedophiles? Disabling comments on a video a child has made does not deter predators from using the platform. Neatly cataloging videos predators would enjoy is not a deterrent either, nor a helpful way to deal with this complex societal issue.

YouTube was not available for an updated comment at press time. To the corporation’s credit, YouTube does work with local law enforcement and with organizations dedicated to combating child exploitation, such as the National Center for Missing & Exploited Children (NCMEC). However, at this point, it is clearly not enough.