Instagram’s recommendation algorithms have enabled a widespread network of pedophiles seeking illegal underage sexual content and activity, according to an investigation by the Wall Street Journal in collaboration with researchers from Stanford and the University of Massachusetts Amherst. The Journal reports that Instagram, owned by Meta, actively guides pedophiles to content sellers through its recommendation systems. In response, Meta has set up an internal task force to investigate and address these claims, and has moved to dismantle abusive networks.
Key Points:
- The Wall Street Journal’s investigation found that Instagram’s recommendation algorithms promote accounts and activity related to child pornography, effectively connecting pedophiles to content sellers.
- Meta, Instagram’s parent company, acknowledges the issue and has set up an internal task force to investigate and address these claims. The company also admits that, due to a since-fixed software error, some reports of child sexual abuse went unaddressed.
- According to Meta, between 2020 and 2022, its policy enforcement teams dismantled 27 abusive networks and in January 2023 disabled more than 490,000 accounts violating child-safety policies.
- The investigation revealed that Instagram enabled users to search for explicit hashtags associated with underage sex, connecting them to accounts advertising child-sex material for sale. Researchers’ test accounts were immediately flooded with ‘suggested for you’ recommendations for accounts selling child-sex content.
- Instagram appears to facilitate networks of pedophiles seeking child-abuse content more extensively than other platforms such as Snapchat, TikTok, and Twitter. The Stanford Internet Observatory team identified far fewer accounts offering to sell child-sex-abuse content on Twitter, which also removed such accounts more quickly.