Image: Mockup of a social media app user interface in dark-screen mode (licensed via Adobe Stock)

The Dark Side of Instagram

An investigative report shows that Instagram's algorithm promotes pedophile networks

It's an unfortunate fact that sex trafficking and pedophile rings have benefited from the invention of the internet. Even worse, Meta's Instagram is amplifying the problem, not because Meta wants to, of course, but because its algorithm promotes the activity. The Wall Street Journal, in partnership with a team of researchers from Stanford University, published an investigative report on Instagram's promotion of pedophile rings, with Jeff Horwitz and Katherine Blunt writing,

Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found.

– Jeff Horwitz and Katherine Blunt, "Instagram Connects Vast Pedophile Network," The Wall Street Journal

They added that Meta is firmly opposed to this criminal activity and that the company has taken down 27 pedophile networks over the past two years. But there is still much more work to be done, and as noted above, social media sites like Instagram have made it all the easier to connect with a "like-minded" online group, even when the group in question shares horrific common interests.

Instagram is worse than other social media sites like Twitter, the report went on to show:

The Stanford team found 128 accounts offering to sell child-sex-abuse material on Twitter, less than a third the number they found on Instagram, which has a far larger overall user base than Twitter. Twitter didn’t recommend such accounts to the same degree as Instagram, and it took them down far more quickly, the team found.

Among other platforms popular with young people, Snapchat is used mainly for its direct messaging, so it doesn’t help create networks. And TikTok’s platform is one where “this type of content does not appear to proliferate,” the Stanford report said.

Twitter didn’t respond to requests for comment. TikTok and Snapchat declined to comment. 

David Thiel, chief technologist at the Stanford Internet Observatory, said, “Instagram’s problem comes down to content-discovery features, the ways topics are recommended and how much the platform relies on search and links between accounts.” Thiel, who previously worked at Meta on security and safety issues, added, “You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t.” 

The platform has struggled to oversee a basic technology: keywords. Hashtags are a central part of content discovery on Instagram, allowing users to tag and find posts of interest to a particular community—from broad topics such as #fashion or #nba to narrower ones such as #embroidery or #spelunking.

The same search features that can lead to something innocent can also be a pathway to darkness, and Meta has not done a sufficient job of regulating and overseeing them. Its weak infrastructure fails to deal with many dangerous accounts.

While Meta says it’s working on building the needed guardrails, critics wonder whether it will be enough to curb the evil we’re seeing online.


Mind Matters News

Breaking and noteworthy news from the exciting world of natural and artificial intelligence at MindMatters.ai.
