Criminals Fake a Kidnapping Using AI
It hasn’t taken long for criminals to capitalize on the AI boom.
It hasn’t taken long for criminals to capitalize on the AI boom. Unfortunately, the more potential a technology has as a tool, the more it can be leveraged for harm. In the case of generative AI, a mother recently said that criminals used AI to mimic her daughter’s voice, fake a kidnapping, and demand ransom money. After answering the phone, she heard her daughter’s sobbing voice, followed by a man demanding a ransom payment. Luckily, the hoax didn’t get far. The mother was soon able to verify her daughter’s safety, but the scam was convincing nonetheless and understandably terrifying while it lasted. Victor Tangermann reported on the incident at Futurism, writing,
The fake kidnapping highlights a troubling new emergence of criminals making use of powerful AI cloning tools to mimic not only the speech but even the individual mannerisms of their victims.
To protect yourself, experts have some fairly straightforward advice.
“You’ve got to keep that stuff locked down,” FBI special agent Dan Mayo told WKYT, explaining that anybody with a big online presence could see it used against them.
- Victor Tangermann, Mom Says Creeps Used AI to Fake Daughter’s Kidnapping (futurism.com)
AI scams will only become more common and more convincing, both as the technology improves and as scammers and criminals find more devious ways to misuse it. It’s wise to take precautions and remain vigilant, especially in the age of social media, when so many people have online personas and can thus be mimicked more easily.