YouTube's Video Recommendation Algorithm Caters to Pedophiles, and They Refuse to Fix It

This Oct. 21, 2015 photo shows signage with a logo at the YouTube Space LA offices in Los Angeles. (AP Photo/Danny Moloshok)

YouTube makes money by showing advertisements alongside its video content, and the site benefits from its video recommendation algorithm, which suggests related videos for viewers to watch after the one they originally chose. But this algorithm has a dark side that caters to pedophiles, according to a report today by the New York Times, and YouTube is so far refusing to fix the problem.

The article, by Max Fisher and Amanda Taub, tells the story of a mother whose 10-year-old daughter uploaded an “innocent” video of herself playing with a friend in a backyard pool; the mother was shocked to discover the video had racked up over 400,000 views in only a few days.

The culprit is the way YouTube’s algorithm detects content within videos and uses it to suggest what viewers should watch next.

Content that might originally be innocent — a child in a family video wearing pajamas or a bathing suit, or even a young child changing clothes or partially nude — is then curated with a collection of videos showing children in similarly revealing ways:

YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found.

YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.

The result was a catalog of videos that experts say sexualizes children.

Also troubling is how the algorithm leads viewers from adult erotic videos to recommendations of videos of children:

Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations.

So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.

On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.

A few months ago, Wired reported how pedophiles were using YouTube comments sections to alert each other to titillating content involving children; the videos were racking up millions of views and displaying advertising sponsored by major corporate brands. As in this more recent New York Times story, the videos individually appeared innocently intended, but a video of a young girl doing gymnastics, for example, might have comments noting the timestamp where she does a split or where a tight-fitting leotard reveals details of her anatomy.

It’s a horrifying idea to contemplate that sexual predators are using technology to find ways to eroticize children, not to mention the risk to the children featured in videos that go viral in these pedophile networks. Many of the videos that earned large view counts also had comments from predators reaching out to the children in them, attempting to engage with them and requesting, for example, that they wear specific outfits or do certain poses or gymnastic moves in future videos.

YouTube has taken some remedial steps. The Wired story resulted in YouTube disabling the comments sections on many videos featuring children, and the company has removed videos that were targeted by those with prurient interests, but it’s impossible to keep up with the new content uploaded every day.

After being contacted by the New York Times journalists, YouTube removed additional videos and did make adjustments to its recommendation algorithm, but admitted that it was a “routine tweak,” and the company is not disabling the recommendation system across the board for videos featuring children, “because recommendations are the biggest traffic driver, [and] removing them would hurt ‘creators’ who rely on those clicks.”

In other words, YouTube is not willing to protect children if it might hurt its bottom line. Shifting the blame to video creators skips over the fact that YouTube itself profits from anything that drives traffic.

There is no efficient way for humans to individually curate the vast quantity of content uploaded to YouTube, so the company has to rely on user reports and algorithms to police and organize it. But parents need to be aware that YouTube will not sacrifice traffic and profit to keep pedophiles from targeting their children.

Parents must consider the immense risk to their children if they post publicly available content on YouTube and other social media sites. Algorithms and artificial intelligence programs are becoming more advanced than most people realize, and content that seemed harmless and innocent, whether a birthday party at the beach, your little cousin’s first ballet recital, or a bedtime story read to your daughter, becomes terrifying when a child is targeted by someone who finds it erotic to watch children in bathing suits, leotards, or nightgowns.

Read my RedState article archive here.

Follow Sarah Rumpf on Twitter: @rumpfshaker.
