I remember that when I finished my MBA, probably about
twenty years ago, interviewing for a job was very different. At an on-campus event, if the recruiter
felt there was a good match, you would be called back for an interview the
next day, and believe it or not, it was all face to face. Back then, phone interviews were rarely
used, unless for some reason or another the candidate could not come
in.
In fact, if I remember correctly, all of my interviews were face
to face. When an offer was made, you
would get it directly by mail, adding an extra zest to the hard work you put in to
get that job. But fast forward twenty
years, and everything in the recruiting industry is now done over Zoom or
Microsoft Teams. It seems that the
phone has become the last option.
While all of these technological advancements can
be a good thing, they have also greatly hindered the recruiting
industry. For example, if you get an email
from a recruiter about a possible job role, it is hard to tell whether it is even
real. Or, for that matter, given the
explosion of robocalls, how do you know whether the person on the other end of the call
is even real? Is this person calling
from some call center in India?
Now, the threats to the recruiting industry have taken a
turn for the worse. There has been an explosion in the use of what are known
as “Deepfakes”. I have written about
this before, but essentially, a Deepfake is a fake video reproduction
of a real-life person.
Probably a good example of this is politics. During the recent Presidential Elections,
many fake videos were made of all of the leading candidates.
These were aired all over the TV networks, and even on YouTube. They would ask for
donations, but any money sent would, of course, go to an offshore account. To create an authentic-looking Deepfake,
AI is very often used. To make things even crazier,
Cyberattackers are also using stolen identities to further impersonate the victim.
This can be done very easily, as most job hunters
link a copy of their resume to their LinkedIn profile, which can be downloaded
quite easily. Or, job candidates don’t
sign out of their accounts (such as those on CareerBuilder, Dice, Simply
Hired, Glassdoor, etc.), which gives the Cyberattacker an easy backdoor to
penetrate as well.
Also, it is not hard to make a video of somebody you do not
even know. In this instance,
Cyberattackers can merely visit the social media profiles of their intended
targets, grab a video of the real person that has been posted, and replicate
them from those sources.
In fact, the FBI just recently put out a warning about these
Deepfake videos, and interestingly enough, the industry falling
prey to this kind of scam is IT.
Also, this scam has so far been used only for remote work positions,
not for in-office roles. Although the
FBI cannot specify a motive for why the Cyberattacker is launching this new
threat vector, the thought is that getting an offer through the
use of Deepfakes gives them quicker access to confidential information and
data, especially the PII datasets of both customers and employees alike.
Normally, it takes the Cyberattacker on average at least a
few months to find a covert way in, by finding unknown
vulnerabilities and weaknesses.
Just recently, the FBI also put out yet another warning about threat
actors from North Korea posing as freelance contractors and using Deepfakes as a
way to get interviewed for various IT jobs.
My Thoughts On This:
Apart from the political front, the use of Deepfakes has
also found a home in Social Engineering attacks. Here are some of the more notorious examples
of this:
*Back in 2019, the voice of the CEO of a German company was Deepfaked
and used to convince the victim to wire $243,000 to aid in a business emergency
(which of course was not real).
*In the fall of 2021, an employee of a company based in the United
Arab Emirates (UAE) was deceived by a Deepfaked voice and convinced to transfer $35 million to a fictitious
offshore bank account.
(Sources: https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402;
https://www.darkreading.com/attacks-breaches/deepfake-audio-scores-35-million-in-corporate-heist)
But keep in mind that as Deepfakes make quick headway,
so does the technology being used to catch them. For example, background checks are
becoming more sophisticated. If a
recruiter has any doubt about a candidate being a Deepfake but lacks
proof at the time of the interview, they can today easily order a deep and
comprehensive background check on the candidate.
Any mismatches can be detected quickly and reported back.
Also, Deepfakes are not yet a perfectly crafted technology. For example, there is often a
lag between the video and the audio whenever the Deepfake answers a
question from a recruiter. It is very important to watch for this specific
nuance.
If you, the recruiter, notice this, don’t just dismiss it as a bandwidth
or network connectivity issue. You could
be dealing with a Deepfake, especially if this time lapse persists throughout the
entire interview.
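To give a flavor of how that audio/video lag could be measured, here is a minimal sketch. It only illustrates the underlying idea (estimating the offset between two activity signals, such as lip-movement intensity and audio loudness, via cross-correlation); the function name and the synthetic signals are my own assumptions, not any real detection tool.

```python
import numpy as np

def estimate_av_lag(video_track, audio_track):
    """Estimate the offset (in samples) between two aligned activity
    signals, e.g. lip-movement intensity vs. audio loudness.
    A positive result means the audio trails the video."""
    # Normalize both signals so the correlation peak is meaningful.
    v = (video_track - np.mean(video_track)) / (np.std(video_track) + 1e-9)
    a = (audio_track - np.mean(audio_track)) / (np.std(audio_track) + 1e-9)
    # Full cross-correlation; the peak index encodes the lag.
    corr = np.correlate(a, v, mode="full")
    return int(np.argmax(corr)) - (len(v) - 1)

# Toy demonstration: a random "mouth movement" signal, and the same
# signal delayed by 5 samples standing in for the audio channel.
rng = np.random.default_rng(0)
video = rng.random(100)
audio = np.roll(video, 5)  # audio lags video by 5 samples
print(estimate_av_lag(video, audio))  # prints 5
```

In a real interview setting the signals would of course have to be extracted from the call itself, and a consistent nonzero lag across the whole conversation would be the red flag, matching the advice above.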
Also keep in mind that Deepfakes are used only for short-term
purposes. As stated, the Cyberattacker
wants to use this method only as a quicker way to penetrate your lines
of defense and, from there, stay in.
Also, research is being done at the University of California,
Riverside, to better identify Deepfakes early on in the process. More information about this
can be seen here, at this link:
https://openaccess.thecvf.com/content/WACV2022/papers/Mazaheri_Detection_and_Localization_of_Facial_Expression_Manipulations_WACV_2022_paper.pdf
Research is also being done at George Mason University.
Finally, as a recruiter, if you have any doubts about a job
candidate whom you just interviewed, it is your right to conduct a deeper
background check. And of course, any discrepancies
that appear to point to a Deepfake should be immediately reported to the FBI.