
Instagram’s child porn problem


Instagram not only hosts child porn accounts, but its algorithms promote them, according to a bombshell new investigation that revealed a “vast paedophile network”.

Paedophiles are notorious for using the internet for their own nefarious purposes, often using obscure chat forums to share their smut. “Instagram doesn’t merely host these activities,” the Wall Street Journal (WSJ) reported. “Its algorithms promote them. Instagram connects paedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests,” the Journal found in its investigation with researchers from Stanford University and the University of Massachusetts Amherst.

They found that sexualized accounts on Instagram are “brazen about their interest”. People searching for terms such as “#pedowhore” and “#preteensex” are then shown accounts that sell child porn, often claiming to be run by the children themselves and using “overtly sexual handles incorporating words such as ‘little slut for you’.”

The content itself, the paper reports, isn’t openly published; instead, accounts offer a “menu” of content that can be bought. Other accounts “invite buyers to commission specific acts… [including] prices for videos of children harming themselves and ‘imagery of the minor performing sexual acts with animals’,” researchers at the Stanford Internet Observatory found.

It beggars belief. Instagram has no excuse for allowing this.


Read More: Governments exploring ways to make social media less toxic


“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” said Alex Stamos, Facebook’s chief security officer until 2018 and now head of the Stanford Internet Observatory. “I hope the company reinvests in human investigators,” he told the Journal.

Researchers set up test accounts, through which they viewed one of the paedophile network accounts. It was “immediately hit with ‘suggested for you’ recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites,” the paper reported. “Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children.”

There you have it: the ultimate logical conclusion of the hazards of the algorithms fuelling social media, which here promote child porn to paedophiles. Stamos is right. How did Instagram miss this?

The company said it removed 490,000 accounts for violating its child safety policies in January alone, according to the WSJ.

“Instagram is an on-ramp to places on the internet where there’s more explicit child sexual abuse,” Brian Levine, the director of the UMass Rescue Lab, told the paper. Levine wrote a 2022 report on internet child exploitation for the US Justice Department’s research arm, the National Institute of Justice.


Read More: Is 13 too young to have a TikTok or Instagram account?


With over 1.3 billion users, Instagram has an enormous teenage audience. “The most important platform for these networks of buyers and sellers seems to be Instagram,” wrote Stanford’s researchers.

It’s worth reminding Facebook of whistleblower Frances Haugen’s warning in October 2021 that the social giant “prioritises growth over safety”. This is the most destructive example of that yet.
