Deepfake: Scammers Impersonate Experts to Steal Sensitive Company Data

The FBI issued a warning earlier this week saying it had received a growing number of complaints about the use of deepfake videos in interviews for tech jobs that involve access to sensitive company systems and information.

Deepfake Emmanuel Macron

More and more people are using deepfake technology to impersonate someone else in remote job interviews, the FBI said on Tuesday. In its press release, the FBI says it has received an increasing number of complaints about people superimposing another person's video, images, or audio onto themselves during interviews for remote positions.

For those who don’t know, “deepfakes” involve using programs powered by artificial intelligence to create realistic images of a person. For example, the technology can be used to replace a person’s face with that of a celebrity, or clone another person’s voice.


Remote job interviews are the new target for scammers

According to the FBI, scammers use both forged documents and personally identifiable information stolen from victims to trick employers into hiring them for remote jobs. These telecommuting positions are often in information technology and computer programming, as well as database and software roles.

The problem is that some of these positions grant access to all kinds of data, from personal customer information to financial records and confidential business information.

Today it is becoming increasingly difficult for recruiters to know who they are dealing with during a video interview, because some deepfake technologies are very convincing. However, the FBI notes that it is possible to detect the scam by looking for certain clues. "In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking," the FBI said. "At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually." Fortunately, companies like Facebook are now able to detect a deepfake and even track down its creator.
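The sync clue the FBI describes can also be checked mechanically: if the lips and the voice belong to the same person, a lip-motion signal and the audio loudness envelope should line up at (near) zero lag. The sketch below is a hypothetical illustration of that idea using synthetic data; the `sync_offset` function and the toy signals are assumptions for demonstration, not part of the FBI's guidance or of any real detection tool.

```python
import numpy as np

def sync_offset(mouth_openness, audio_energy):
    """Estimate the lag (in frames) between a lip-motion signal and the
    audio loudness envelope via cross-correlation. Both inputs are 1-D
    arrays sampled at the same frame rate; 0 means they are in sync."""
    a = mouth_openness - mouth_openness.mean()
    b = audio_energy - audio_energy.mean()
    corr = np.correlate(a, b, mode="full")
    # In "full" mode, zero lag sits at index len(b) - 1.
    return int(np.argmax(corr)) - (len(b) - 1)

# Synthetic demo: simulate an out-of-sync track by shifting the lip
# signal five frames; a genuine recording would yield an offset near 0.
rng = np.random.default_rng(0)
lips = rng.random(200)       # stand-in for per-frame mouth openness
audio = np.roll(lips, 5)     # stand-in for a desynchronized voice track
offset = sync_offset(lips, audio)
print(offset)                # nonzero: flags a possible audio/video mismatch
```

A real detector would extract mouth openness from facial landmarks in the video and loudness from the audio track; here a circular shift simply stands in for the kind of desynchronization the FBI describes.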

Source: FBI
