
Instagram Is Unsafe for Young Men As Meta Can't Fight One of the Largest Predators on the Platform


A few nights ago, I was doomscrolling on Instagram at around 9 p.m. when, as one does, I landed on an ad. If you're an Instagram user, you get served with ads all the time. Usually, they involve things you'd see on television: ads for insurance companies, grocery stores, clothing, knick-knacks, you name it. 

The ad that suddenly popped up on my feed featured an attractive, voluptuous blond woman, and the first words you hear from the ad are "Take your top off." My jaw dropped as she did. It was an AI woman that looked very real, standing topless in an Instagram ad. Nothing covering her. She then proceeded to take the rest of her clothes off, standing completely nude. 

Before someone comes in and tells me that ads are served based on likelihood of interest, and that I must have shown interest in nude AI women, I can tell you that my Instagram interests are pretty tame. I mainly love watching reels about cooking steak, woodworking, lawn care, and a lot of Christian content. My only guess is that the algorithm has identified me as male, and apparently, that's enough for these X-rated ads to find me. 

Moreover, this isn't the first time I've been served ads like this that promise uncensored chat and pictures from AI models. I usually report them, and I reported the ad above as well. As loyal readers will know, I consider AI companions a plague that is going to play a part in ruining our society, so much so that I've written around a dozen articles on the subject and even created a YouTube video about it. 


READ: The Oncoming AI Companion Plague Needs to Be Taken Far More Seriously


I've made it clear to Instagram that I'm not interested, and for a while, the algorithm keeps these ads from reaching me, but eventually they return. They're relentless. 

I used to think there must be some sort of deal or agreement between these AI porn companies and Meta, but after a bit of research, I'm learning that Meta is trying its absolute best to rein these ads in and stop them from displaying. It's having a really hard time doing so, because these AI porn companies spend a lot of time and effort getting around the system to target men, hook them, and keep the cash flowing. 

A report from Indicator showed that thousands of these ads have hit Meta's platforms, and more keep appearing despite continued reporting and Meta's best efforts to stop them: 

Indicator identified more than 2,000 ads for AI girlfriend apps/sites and over 1,000 for nudifier services that broke the platform’s rules. Most were placed since Meta’s June announcement that it had implemented new ways to detect ads for nudifiers and other nonconsensual image generation tools. 

The findings follow reporting from Indicator and the American Sunlight Project earlier this month that found over 4,000 nudifier ads had appeared on Meta’s products since the company's June announcement.

Daniel Roberts, a spokesperson for Meta, said the company removed the latest ads flagged by Indicator and disabled the accounts behind them. He said the company continues to invest in technology to detect violative ads and accounts.

The issue became so bad that Meta tried going straight to the source of one of these companies. According to CBS News, Meta sued the Hong Kong company Joy Timeline HK Limited, which was pushing "Crush AI" on all of Meta's platforms. According to the lawsuit, the company was continuously trying to "circumvent" the ad review process through various means to push not just AI companion apps, but non-consensual nudifying apps, to users.

These apps allow users to upload a photo of a real-life woman, oftentimes a celebrity or someone they know personally, and digitally remove her clothing to see her naked. 

Joy Timeline HK Limited gets around the safety measures implemented by Meta through deceptive means, according to CBS: 

Meta said advertisers of nudify apps use various means to avoid detection on its platforms, including by using inoffensive imagery to try to circumvent tech used to identify such ads on its sites. As a result, it has developed better technology to detect ads from nudify apps that are presented as benign, Meta said Thursday.

"We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect with these ads," Meta said. 

But it's a game of cat and mouse, and Joy Timeline isn't the only company doing this. There are myriad AI porn sites and companies. As I've previously reported, these companies make money hand over fist selling digital T&A, chat, and video to millions, and the tech only improves every year. 

This means these companies have deep pockets and can spend a lot of time and money trying to get their sites advertised. It doesn't matter to them that they get caught and their ad is kicked off. They only need to sneak by the censors once to get you to notice them, and a handful of customers snagged before the ad is taken down is still hundreds of dollars in their pocket. 

Our society is currently undergoing a male loneliness epidemic, with friendships and romantic relationships at dangerously low levels. This is especially true for young men in the Gen Z and Alpha generations. These AI apps feed on that sense of isolation, and as such, are successfully roping in men, especially young men. 

I want to reiterate something, because I often see messages of dismissal from readers. This isn't just the lonely nerd in his mother's basement being led to these apps. These are men of every variety: blue-collar workers, married men, fathers, jocks, geeks, and stand-up citizens of every kind. The lie is that because these are virtual women, no one is being harmed, and it's all innocent fun. 

It's not. It's addictive in the same way regular porn is addictive, only this one is on-demand and interactive. The dopamine surges you get from this kind of app feel very good to the brain (especially the male brain), to the point where you can become addicted very easily. It's a trap, and these companies know they only need to give you a taste to get you hooked. This is why they disregard Meta's rules for ads. They don't care if they're caught or banned, because they only need to attract you once. 

Besides, they have hundreds of accounts, and they can create hundreds more. 

So my advice is, if you have any control over your son's phone, keep him off Instagram. They're hunting for him. 

And if you're a grown man, keep your wits about you. They're hunting you, too. 
