The voice on the phone sounded frighteningly real. An American mother listened in disbelief as she heard her 15-year-old daughter sobbing and saying that kidnappers were demanding a ransom. But there was no kidnapping: the voice was an AI clone. The call was one of a new breed of scams that have taken the ‘caller-ID spoofing’ phenomenon to terrifying heights, rattling US authorities and the wider world.
The technology behind these spoofed calls is generative artificial intelligence, which enables fraudsters to mimic a person’s voice even without access to their private phone recordings. Cybercriminals can use it to dupe people into sending money via wire transfer or signing over ownership of assets. It is also used to manipulate the public with fake news stories and to spread rumors, often based on false or distorted facts.
Despite the technology’s immense benefits, experts have long warned that unscrupulous individuals and organizations could use AI to demolish the boundaries between reality and fiction, handing cybercriminals a cheap and effective technology to propagate lies. The alleged kidnapping case in Arizona has highlighted this danger and underscored how AI could be abused.
According to experts, scammers need only a three-second audio sample of a person’s voice to clone it. They can harvest the audio from the victim’s public social media accounts, from recordings shared by family members or friends who have the person’s number, from interviews recorded by schools, and the like. They then use an AI cloning system to create the fake call and convince the victim that a loved one is in trouble.
In the case of the Arizona girl, the mother, Jennifer DeStefano, was “100 percent” convinced that it was her daughter in distress, even though the call came from a number she did not recognize. She told a local TV station: “It was completely her voice; it was how she would cry.” The AI-powered ruse ended within minutes when DeStefano established contact with her daughter, but it highlights how easily the technology can be exploited.
Almost anyone can fall for such a scam, with the Federal Trade Commission reporting that more than half of Americans have been duped into transferring money to fraudsters. One in 12 Americans says they have lost cash this way, and 6 percent say the amount they paid exceeded $15,000, according to cybersecurity firm McAfee.
The latest AI cloning scams put a twist on a familiar ploy: a person calls from an unknown number claiming to be a friend or family member who is in danger and needs help or money immediately. Consumers have been warned for years that such calls are likely to be scams. But now, with the rise of AI cloning, these calls sound far more convincing.
The best defense against such scams is to keep social media profiles locked down and to ignore urgent requests for money from unfamiliar sources. And if you are selling property, be wary of offers that require an upfront payment of taxes or fees to “ensure the deal goes through.”