Three researchers were able to identify 405 sellers of self-generated child-sex material on Instagram; 112 of those accounts collectively had 22,000 unique followers.

Meta’s Instagram is recommending content involving the sale, purchase and commission of child pornography to pedophiles, according to an investigation by The Wall Street Journal.

Instagram’s “algorithms promote” such illegal content and, in some cases, even guide buyers toward in-person meet-ups with children, the investigative report says.

Findings
Three researchers from Stanford and the University of Massachusetts Amherst identified 405 sellers of self-generated child-sex material on the social media platform; 112 of those accounts collectively had 22,000 unique followers.

Hashtags with graphic terms such as #pedowhore, #preteensex, #pedobait and #mnsfw (minors not safe for work) are attached to millions of videos on Instagram.

Meta’s response
The social media giant has set up an internal task force and says it is continuously working to block child sexual abuse material (CSAM) networks on its platforms. It also said it removed 27 pedophile networks and blocked thousands of related hashtags between 2020 and 2022.

In January 2023 alone, more than 490,000 accounts were blocked for violating Meta’s child-safety policies. By the end of 2024, more than 34 million pieces of child sexual exploitation content had been removed from Facebook and Instagram, according to Meta.