Mind Matters Natural and Artificial Intelligence News and Analysis

Tag: Child sexual abuse (online)

Mockup of social media app user interface in dark screen mode

The Dark Side of Instagram

An investigative report shows that Instagram algorithm promotes pedophilia networks

It’s an unfortunate fact that sex trafficking and pedophilia rings have benefited from the invention of the internet. Even worse, Meta’s Instagram is amplifying the problem, not because Meta wants to, of course, but because its algorithm promotes the activity. The Wall Street Journal, in partnership with a team from Stanford University, published an investigative report on Instagram’s promotion of pedophilia rings. Jeff Horwitz and Katherine Blunt write:

“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have an interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those” Read More ›

Closeup of unrecognizable little girl using smartphone, focus on hands scrolling through internet, copy space

Drawing a Line: When Tech To Keep People Safe Seems Dangerous

A dispute at the Washington Post about tech aimed at detecting child sex abuse highlights some of the issues

Princeton computer scientists Jonathan Mayer and Anunay Kulshrestha thread that needle:

“Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple’s own employees have been expressing alarm. The company insists reservations about the system are rooted in ‘misunderstandings.’ We disagree. We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous. We’re not concerned because we misunderstand how Apple’s system works. The problem is, we understand exactly how it works.”

Opinion by the Editorial Board: Apple’s new child safety tool comes with privacy trade-offs — just like all the others. Read More ›