An old scam is now more dangerous, as artificial intelligence offers scammers new ways to deceive their victims.
What scam are we talking about? The “grandchild scam” is a well-known trick: a fraudster calls an elderly person and pretends to be their grandchild. The fraudster concocts an emergency – for example, claiming that the grandchild is in jail and needs money from the grandparents.
But the grandchild scam is becoming even more dangerous, as fraudsters now use artificial intelligence to imitate a grandchild’s voice.
This is what makes the scam so dangerous: with artificial intelligence, scammers can analyze and imitate voice recordings. The AI picks up the nuances of a person’s speech and mimics them.
The fraudster then types in a text, which the AI renders in the previously analyzed voice. Victims can no longer tell from the voice alone that the speaker is not their relative.
Fraudsters imitate grandchildren and children using AI
Examples of fraud using AI: The Washington Post reports on an incident involving a Canadian couple, Ruth (73) and Greg Card (75). The seniors say that a man on the phone, who sounded exactly like their grandson Brandon, claimed he was in jail and needed money for bail.
A similar incident reportedly happened to a man named Benjamin Perkin, whose parents were defrauded of over $15,000.
His parents allegedly received a call from a supposed lawyer who said their son was in jail and needed money for legal fees. The lawyer then put the supposed son on the phone, and a voice that sounded exactly like Benjamin Perkin’s confirmed that he needed the money.
His parents were convinced they were speaking with their son because the voice sounded like his – yet he was neither on the phone nor in jail.
Where the perpetrators obtained the recording of Perkin’s voice is unclear, according to the Washington Post, but he talks about his hobby, snowboarding, in videos on YouTube. Other platforms such as TikTok, Instagram, or the streaming service Twitch also give scammers opportunities to access recordings of a person’s voice.
Twitch streamer Asmongold has already seen how authentic such AI-based voice imitations can be. During a live broadcast, he watched a clip in which an AI imitated his voice – and was surprised by how much the recording sounded like him.
AI perfectly copies Twitch streamer Asmongold, startling the original: ‘Oh my God, this is so good’