Recent years have seen a plethora of recruiting concerns in the tech sector, including post-interview ghosting and bait-and-switch tactics that harm both parties to a job offer. The FBI is now cautioning tech companies to be on the lookout for a surprising threat: deepfake interviews.

According to the agency’s newest PSA, bad actors are using deepfakes to pass as other people in order to swindle their way into remote positions. The perpetrator begins by gathering enough personal information about their victim to successfully apply for jobs under that person’s name. Then, through theft or simple online research, they amass a few high-quality images of the victim. When it’s time for the interview, the deceiver uses the images (and occasionally voice spoofing) to build a deepfake that convincingly resembles the target on video.

According to the FBI, impersonating job candidates is common in IT and programming roles, as well as in any position that “has access to customer [personally identifiable information], financial data, business IT systems, and/or proprietary information.” Such information might be used to steal money directly from a corporation, manipulate the stock market, release rival goods or services, or sell vast volumes of private data. While it’s unlikely that wrongdoers would wish to work in their unjustly acquired position for long, some may want to earn US dollars from abroad or collect benefits tied to a position they couldn’t otherwise obtain. Some even wonder whether the impersonations could be part of a larger operation threatening national security.

Technology behind Deepfakes

Whether impersonations by job candidates are ever discovered during an interview is currently unknown. Even though some deepfakes are incredibly realistic, most are one-sided clips rather than part of the verbal two-way street that characterizes a job interview. In a perfect world, a deepfake interviewee would be detectable even to the untrained eye. However, there is also something to be said about the rare stressed-out recruiter who, in their haste to fill a position or ten, can miss a worrying visual lag or mistake it for a bad internet connection. In this manner, the “ideal” criminal opportunity may be produced with technological prowess and a little bit of luck.

Although the FBI hasn’t provided particular advice for circumspect recruiters, it has issued a general warning about mismatched audio and video. According to the PSA, “In these interviews, the actions and lip movement of the individual seen interviewed on camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”
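The desynchronization the FBI describes can be made concrete with a toy sketch: if a video feed is genuine, loudness in the audio track should rise and fall roughly in step with how wide the speaker’s mouth is open. The sketch below is purely illustrative, assuming two hypothetical per-frame signals (an audio loudness envelope and a mouth-openness score from a face tracker) and an arbitrary 0.5 correlation threshold; real deepfake detectors rely on trained audio-visual models, not a raw correlation.

```python
# Illustrative sketch only: flag audio/visual desynchronization by correlating
# a per-frame audio loudness envelope with a per-frame mouth-openness score.
# Both input signals and the 0.5 threshold are assumptions for demonstration.

def pearson(a, b):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    sd_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return cov / (sd_a * sd_b) if sd_a and sd_b else 0.0

def looks_desynced(audio_env, mouth_open, threshold=0.5):
    """True if lip motion tracks speech loudness poorly."""
    return pearson(audio_env, mouth_open) < threshold

# Synthetic example: in a genuine feed the mouth opens when speech is loud...
genuine_audio = [0.1, 0.9, 0.8, 0.2, 0.1, 0.7, 0.9, 0.2]
genuine_mouth = [0.0, 0.8, 0.9, 0.1, 0.0, 0.6, 0.8, 0.1]
# ...while a lagging or spoofed feed shows mouth motion out of step.
spoofed_mouth = [0.8, 0.1, 0.0, 0.9, 0.7, 0.1, 0.0, 0.8]

print(looks_desynced(genuine_audio, genuine_mouth))  # False: well correlated
print(looks_desynced(genuine_audio, spoofed_mouth))  # True: out of sync
```

In practice a recruiter doesn’t need code at all: asking the candidate to turn their head, wave a hand in front of their face, or cough on cue exercises exactly the misalignments the PSA describes.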
