
Cybercriminals use AI-generated voices

Artificial intelligence (AI) has become one of the great revolutions of our time. It has opened up a whole world of possibilities across different sectors, but unfortunately it is also being used by cybercriminals to improve the reach and success rate of their attacks.

According to recent information shared by The Washington Post, cybercriminals are using AI to imitate the voices of family members and loved ones. The results are so convincing that they have managed to fool many people into believing they were talking to their children or grandchildren. In one of the most serious confirmed cases, a couple was led to believe that their son had killed an American diplomat, and the scammers defrauded them of a total of $15,449 that was supposedly going to cover the "legal costs" of his defense.

The US Federal Trade Commission has also confirmed that this is an increasingly serious problem: last year it registered 36,000 reports from people who had been scammed by someone posing as a relative. More than 5,100 of those cases occurred over the phone, that is, by simulating the voice of a loved one, a figure that highlights how effective this type of scam has become.

If you are wondering how to avoid this type of scam, the answer is quite simple: make sure that the person you are talking to really is who they claim to be. It is understandable that when we are presented with an alarming situation we feel overwhelmed and our first reaction is to do everything possible to help our loved one, but we should keep a cool head for a few minutes and ask personal questions that only our son, grandson, or whoever the loved one in question is could answer.

Those few minutes can make the difference between avoiding a scam and losing a lot of money, so be careful and always use common sense; that is the golden rule.
