
Instagram algorithm boosts ‘vast pedophile network,’ bombshell report claims

Instagram’s algorithms connect and promote a “vast network” of pedophiles interested in “underage sex content,” a bombshell new report found.

The Meta-owned social media company’s algorithms promote illicit content to accounts “openly devoted to the commission and purchase” of the material, according to investigations by the Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst.

“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests,” the WSJ and academic researchers found.

“Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest,” the report says.

The researchers reportedly found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as “little slut for you,” the report claims.

According to the WSJ, Instagram accounts offering to sell illicit sex material generally do not publish it openly, instead posting “menus” of content.

Researchers at the Stanford Internet Observatory found that certain accounts invite buyers to commission specific acts, WSJ reported. Some menus include prices for videos of children harming themselves and “imagery of the minor performing sexual acts with animals.” At the right price, children are available for in-person “meet ups,” the researchers found.

The promotion of underage-sex content violates Meta’s own rules, in addition to being illegal under federal law.

In response to questions from the WSJ, Meta “acknowledged problems within its enforcement operations and said it has set up an internal task force to address the issues raised.” “Child exploitation is a horrific crime,” the company told the Journal, adding that it is “continuously investigating ways to actively defend against this behavior.”

Meta said it has taken down 27 pedophile networks in the past two years and is planning more removals, according to the report.

Since receiving queries from the newspaper, the platform said it has blocked “thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse.”

The platform is also working to prevent its systems from recommending that potentially pedophilic adults connect with one another or interact with one another’s content, the report states.
