Deepfake Remote Workers - are you on guard?

As if businesses did not have enough to think about when hiring and retaining staff, enter the world of the ‘Deepfake Remote Worker’. Fascinating and slightly scary examples are picked out in this April CNBC report:

 

Fake job seekers use AI to interview for remote jobs, tech CEOs say

 

Whilst AI is now used extensively in Applicant Tracking Systems, companies also need to be on guard against AI being used to exploit and subvert the hiring process.

 

Consider the scenario:

  • You are looking to hire for a software engineering role to support development of your core business offering.
  • Because you want the best candidate for the job and are looking to offer flexible working arrangements, HR has approved hiring a remote worker into this role.
  • A great candidate applies with a perfect CV; all interviews are held over video, and every question is answered without hesitation or issue.
  • You offer them the job – HR progresses with screening – everything checks out and they are hired, onboarded and given system access appropriate for the role.
  • A few months later – you have been compromised and your company data is being sold on the Dark Web…
  • Forensic investigation points to your ‘best candidate ever’ – but they have disappeared…

 

This scenario is becoming more prevalent and marks a shift in the threat landscape. During the hiring process, threat actors are using AI to create the perfect candidate CV to get through Applicant Tracking Systems, using deepfake technology to present the candidate at interview, and inventing credible photo IDs and plausible employment histories, including fake social media accounts. Gartner predicts that by 2028, 1 in 4 job candidates globally will be fake.

 

Hiring remote staff? The issue used to be that one person interviewed but another, lacking the required skills, showed up on the first day. Now take that one step further: the new staff member does not exist at all.

 

This is a real and present danger. It is reported that thousands of workers from North Korea have already infiltrated Fortune 500 companies (ref: https://economictimes.indiatimes.com/news/international/us/can-you-believe-this-north-korean-hackers-pose-as-u-s-developers-in-fortune-500-firms-funnel-millions-to-kim-jong-uns-nuclear-weapons-programs/articleshow/120101644.cms?from=mdr ).

 

If we accept that it is unlikely all organisations will move back to an on-premise or hybrid model for all of their staff, and that fully remote work is here to stay, then the root cause of the issue is the hiring process and its ability to ensure that a real person is being hired and onboarded.

The roles most targeted by Deepfake Remote Workers are engineering roles, due to the level of access these typically have, which may include access to production pipelines. Once inside, malicious code or backdoors can be implanted into your products and even cascade to your customer base. Data could be stolen and sold, or ransomware deployed for financial gain, with all the associated immediate financial loss and longer-lasting brand impact.

 

So what can organisations do to protect against this? Here are some actions to consider:

 

  • Make sure the HR team is fully aware of this new risk, and ensure all hiring managers are briefed on the risk posed by fake job candidates and what to look out for.
  • Throughout the hiring process, consider how best to verify that the candidate is a real person – if possible (and budget permits), meet the candidate in person at least once. New video-authentication technology is available that can help during interviews.
  • Rethink the appropriateness of certain roles being remote and/or offered to foreign nationals – if the talent benefit is still there, mitigate the risk by revisiting the access granted to those roles so that the ability to impact production systems is reduced.
  • Upgrade your background check and screening programmes so they can better detect AI-generated content.
  • Add remote roles with privileged systems access to the High-Risk roles list – bolster logging and monitoring on these employees and conduct frequent rescreening.
  • Make sure your DLP controls are working to detect any suspicious data exfiltration.
  • Also, ensure that any third parties you use with access to your systems are equally aware of this threat and are taking suitable precautionary measures.
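On the DLP and monitoring points above, one simple signal worth watching is outbound data volume per user against a historical baseline. Here is a minimal sketch in Python – the user names, baseline figures and 10x threshold are all hypothetical, purely to illustrate the idea; real DLP tooling is far more sophisticated:

```python
# Hypothetical per-user daily upload baselines in MB (illustrative only)
baselines = {"alice": 50, "bob": 120}

def flag_exfiltration(user, uploaded_mb, multiplier=10):
    """Flag a user whose daily outbound volume exceeds
    `multiplier` times their historical baseline."""
    baseline = baselines.get(user)
    if baseline is None:
        return True  # unknown account with outbound traffic: flag for review
    return uploaded_mb > baseline * multiplier

# alice normally uploads ~50 MB/day, so 800 MB trips the alert
print(flag_exfiltration("alice", 800))  # True
print(flag_exfiltration("bob", 300))    # False
```

Even a crude threshold like this, fed into an alerting pipeline, gives security teams a chance to spot a compromised or fake hire before months of quiet exfiltration.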

Now you know – are you worried? How confident are you that your organisation has not been infiltrated by a Deepfake Employee?

 

 

NuroShift LTD (Company number 16283002 - United Kingdom) - VAT Number 487 6511 51 © Copyright. All rights reserved.

Registered Office - 19-20 Bourne Court Southend Road, Woodford Green, Essex, England, IG8 8HD
