Scammers impersonate your relatives on the phone to ask you for money

Scammers have found a new way to exploit unsuspecting victims by using artificial intelligence (AI) to mimic the voices of loved ones and ask them for money over the phone.


Eddie Cumberbatch, a 19-year-old TikToker from Chicago, received a strange call from his father, asking whether he had been in a car accident. Eddie quickly realized that someone had used an AI-generated rendering of his voice to convince his family that he urgently needed money. While Eddie’s family managed to avoid falling for the scam, many others weren’t so lucky.

Impersonation scams, in which scammers pose as someone the victim trusts, are already a common form of fraud. The use of AI technology, however, has made these scams more sophisticated and harder to detect. The Federal Trade Commission (FTC) reported that impersonation scams, including those involving AI-generated voices, cost victims $2.6 billion in 2022. Just a few weeks ago, a crook stole 560,000 euros from a victim using this technology.

Scammers can clone the voice of your loved ones in seconds

AI now allows crooks to clone a voice from just a few seconds of audio, and recordings of someone’s voice are easy to find on social media, making realistic imitations simple to create.

McAfee’s investigation revealed that 70% of adults were unable to distinguish a cloned voice from a real one, which gives scammers considerable room to operate. Moreover, people are unknowingly making their real voice available online: 53% of adults share their voice data every week.


These scams can be devastating: convinced that a loved one is in distress, victims readily send money to help. McAfee found that 45% of respondents said they would respond to a voicemail or voice note that sounded like a friend or family member. Victims have already lost large sums to these scams, with some losing over $5,000.


Recreating the voice of a loved one has become child’s play

While AI has been used in scams for some time, it has become more accessible and affordable, making it easier for cybercriminals to exploit it. Platforms such as Murf, Resemble and ElevenLabs allow scammers to create realistic voices using text-to-speech technology.

By uploading a voice sample, scammers can generate a voice that closely matches that of the target. This technology drastically reduces the time and effort required to carry out these scams, making them more lucrative for fraudsters.

Unfortunately, this type of scammer is hard to catch: they operate from all over the world and leave little information for law enforcement agencies to act on. Victims often have little recourse, and the majority of cases go unsolved.

To protect yourself against these scams, it is more important than ever to remain vigilant and verify any urgent request for money. Stepping back and asking specific questions that only the real person could answer can help confirm the caller’s identity. The FTC also advises putting the call on hold and contacting the family member directly to verify their story.

Fortunately, AI isn’t used only by scammers. Researchers have also created an AI capable of wasting crooks’ time by keeping them on the line in conversation.
