BREAKING: Instagram Caught Running Massive Child Pornography Ring On Platform

A new investigative report from the Wall Street Journal has revealed that Instagram is, essentially, running a massive child pornography and sex trafficking ring by helping to connect and promote networks of accounts that commission and purchase underage sex content.

The report, based on investigations by the outlet and researchers at Stanford University and the University of Massachusetts Amherst, found that Instagram’s algorithm actively promotes pedophile accounts, guiding pedophiles to content sellers through recommendation systems “that excel in linking those who share niche interests.”

The social media platform has allowed people to search for explicit hashtags such as “pedowhore” and “preteensex,” connecting those searchers to accounts that advertise the sale of child sex materials, accounts that often claim to be run by children and use handles such as “little slut for you.”

Those accounts offering to sell pedophile sex materials post “menus” of content, with some accounts inviting buyers to commission specific sexual acts.

Some of these menus include pricing for videos of children hurting themselves and “imagery of the minor performing sexual acts with animals,” Stanford Internet Observatory researchers found.

Content menus also include in-person “meet ups” with children.

Federal law and Meta content rules both prohibit the promotion of underage sex content. Meta told the outlet that it has set up an internal task force to address the issue, and acknowledged that there are problems within its enforcement operations.

“Child exploitation is a horrific crime,” the company said, adding, “We’re continuously investigating ways to actively defend against this behavior.”

Meta said that it has taken down 27 pedophile networks in the past two years and is planning more removals. Since being contacted by the Wall Street Journal, Meta said it has blocked thousands of hashtags, some with millions of posts, that sexualize children, and has restricted its algorithm from recommending to users search terms associated with sex abuse.

Test accounts set up by researchers at Stanford’s Internet Observatory and UMass’ Rescue Lab that viewed a single account in the pedophile network were instantly served “suggested for you” recommendations of child sex content buyers and sellers, as well as accounts linking to off-platform content trading sites.

“Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children,” the Wall Street Journal wrote.

Stanford researchers discovered 405 sellers of “self-generated” child sex content, accounts that claimed to be run by children, some as young as 12. Of those, 112 accounts had a combined 22,000 unique followers.

Other accounts in Instagram’s pedophile community aggregate pro-pedophilia memes, or discuss their access to children.

Current and former Meta employees who worked on Instagram child-safety initiatives told the outlet that the estimated number of accounts that primarily follow such content is in the hundreds of thousands, if not millions.

In 2022, the National Center for Missing & Exploited Children received 31.9 million reports of child pornography, up 47 percent from two years earlier. Most of the reports came from internet companies.

Meta, whose apps include Instagram, Facebook, and WhatsApp, accounted for 85 percent of the child pornography reports filed to the center, including around 5 million from Instagram alone.

2 Comments
Jason · 10 months ago

They took them down, but did they turn them in?

Brya · 10 months ago

I’m sure Zuck has no knowledge of it… just kidding