Remote work has created another problem – colleagues who may be deepfakes

By Jurica Dujmovic

Job applicants are impersonating other people to land remote positions

Imagine that you are an inexperienced technician who wants to apply for a job at a company.

Let’s say you have a roommate or a friend who already has a job and is far more at ease with the hellish ordeal of job interviews, the mere thought of which makes you shiver.

What if you could just stick your face on his body and have him do the interview instead? As long as no one knew what your voice really sounds like, you would be set. Enter the magical – and often dark – world of deepfakes!

A deepfake is a type of synthetic media in which a person’s face in an existing image or video is replaced with someone else’s likeness. The underlying technology dates back to the 1990s, but it only became popular much later, thanks to online hobbyist communities.

Deepfakes range from funny, offbeat clips to sinister and dangerous videos depicting political statements that were never made or events that never happened. The situation is getting worse: deepfakes have been used for non-consensual pornography, hoaxes, bullying and more.

However, today I would like to focus on one particularly interesting application of the technology: pretending to be someone else in order to get a job.

Now you might be wondering why anyone would do that – it sounds like an overly complicated, risky and stressful way to land a real job. Unless… the point of getting that particular job is not gainful employment, but rather access to company infrastructure and the sensitive information it holds.

In this case, you wouldn’t be a camera-shy introvert, but rather a tech-savvy social engineer/hacker, pretending to be someone else — someone qualified.

According to an FBI alert released on June 28, the number of cases in which malicious actors apply for work-from-home positions using stolen personally identifiable information (PII) is on the rise.

This data was most likely acquired by hacking victims’ accounts or even companies’ HR databases.

The FBI warns that these actors use voice spoofing (mimicking another person’s voice with digital manipulation tools) along with publicly available face-swapping tools (DeepFaceLab, DeepFaceLive, FaceSwap and others) to fool unwary interviewers.

According to the FBI report, one way to recognize that something is wrong is to pay attention to whether the “actions and lip movements of the person seen interviewed on camera” coordinate with the “audio of the person speaking.”

However, this is not a completely foolproof way to detect deepfakes. These days, a growing number of apps enable seamless, real-time face swaps in video calls, resulting in higher-quality lip-syncing. That’s not to say deepfake videos are indistinguishable from reality, though.

Besides lip-sync imperfections, other tell-tale clues include facial discoloration, unnatural blinking patterns, blurring, strange digital background noise, and a difference in sharpness and video quality between the face and the rest of the video (i.e., the face looks sharper and cleaner than the background).
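
That last cue can even be checked roughly in software. Below is a minimal sketch – not a production detector – that compares the sharpness of a detected face region with the rest of a video frame, using OpenCV’s stock Haar face detector and the variance of the Laplacian as a crude focus measure. The file name and the ratio threshold are arbitrary, illustrative choices.

import cv2
import numpy as np

def face_vs_background_sharpness(frame_bgr):
    """Return (face_sharpness, background_sharpness) or None if no face is found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    # Variance of the Laplacian is a common, crude sharpness/focus measure.
    lap = cv2.Laplacian(gray, cv2.CV_64F)
    face_sharp = lap[y:y + h, x:x + w].var()
    mask = np.ones(gray.shape, dtype=bool)
    mask[y:y + h, x:x + w] = False          # everything outside the face box
    bg_sharp = lap[mask].var()
    return face_sharp, bg_sharp

# Illustrative use: flag a frame where the face is far sharper than the
# background. "interview_frame.png" and the ratio of 3 are placeholders.
frame = cv2.imread("interview_frame.png")
result = face_vs_background_sharpness(frame) if frame is not None else None
if result and result[1] > 0 and result[0] / result[1] > 3:
    print("Face region is much sharper than the background - worth a closer look.")

Dedicated deepfake detectors weigh many such signals at once; a single sharpness ratio is only a hint, not proof.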

Identity theft is nothing new, and technology is simply giving hackers new tools that make the process easier. As always, the crucial step is training and educating staff to resist the social-engineering part of the attack, namely not giving new recruits access to vital company infrastructure until they have been fully vetted.

Although meeting a candidate in person is always the best way to verify their identity, it is not always possible. In those cases, a few simple requests can trip up a deepfake algorithm: ask the interviewee to rotate their body (easy to do on a regular office chair), to turn their face at an awkward angle to the camera, or to hold an open palm in front of their face and move the hand at varying speeds, hiding and revealing parts of the face between the fingers. These methods can fool some models into glitching, producing artifacts or blurs that give away a deepfake video.

When it comes to audio, listen for unnatural phrasing, jerky delivery and odd pitch inflections. Sometimes entire sentences are pre-synthesized, which can result in canned responses. In that case, watch out for answers that seem out of context – if a respondent doesn’t actually answer a question, or answers it the same way multiple times, that could also be a red flag.
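
The “same answer multiple times” red flag is simple enough to illustrate in a few lines of code. The sketch below is purely illustrative and assumes the answers have already been transcribed by a speech-to-text step; the 0.9 similarity threshold and the sample sentences are made up for the example.

from difflib import SequenceMatcher

def too_similar(answer_a: str, answer_b: str, threshold: float = 0.9) -> bool:
    # SequenceMatcher.ratio() returns a 0..1 similarity score; the 0.9
    # cutoff is an arbitrary, illustrative choice.
    return SequenceMatcher(None, answer_a.lower(), answer_b.lower()).ratio() >= threshold

# Hypothetical transcripts of a candidate's answers.
answers = [
    "I have five years of experience with cloud infrastructure.",
    "Could you tell me more about the team?",
    "I have five years of experience with cloud infrastructure.",
]

# Flag any pair of answers that repeat almost verbatim.
for i in range(len(answers)):
    for j in range(i + 1, len(answers)):
        if too_similar(answers[i], answers[j]):
            print(f"Answers {i} and {j} are nearly identical - possible canned response.")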

You can be the target of identity theft even if you are not an HR representative for your company.

I’ve written articles on how to protect against hacker attacks, so I won’t go into detail here. Instead, I urge you to maintain standard protective measures and practices, such as using strong, unique passwords for all your accounts and enabling two-factor authentication (2FA) wherever possible.

In a job search scenario, keep an eye on your communications and reach out to your potential employer through different channels, if possible. A hacker won’t be able to cover them all, and receiving conflicting information from you and the hacker will raise HR’s suspicions of a possible impostor.

Being a little more proactive at this stage of your job search may be all it takes to dispel the illusion, impress your future employer, and land that job instead of your evil digital doppelganger.

Finally, if you think you’ve been the victim of identity theft, contact local law enforcement and file a report.

-Jurica Dujmovic

 

(END) Dow Jones Newswire

08-06-22 1503ET

Copyright (c) 2022 Dow Jones & Company, Inc.
