
According to Microsoft, North Korean state-backed fraudsters are using artificial intelligence (AI) tools, including voice-changing software, to trick Western companies into hiring them as remote IT workers. In a typical case, state-funded individuals apply for remote IT positions at Western firms using stolen personal data; their wages are funneled back to the Kim Jong-un-led state, and in some cases the workers threaten to release sensitive company data after they are terminated.

In a blog post, Microsoft's threat intelligence unit noted that Pyongyang is employing AI to make the ploy more effective. The company listed several AI-assisted techniques used by North Korean groups it tracks as Jasper Sleet and Coral Sleet, following the naming conventions cybersecurity analysts use for unattributed attacker clusters. The scammers use voice-altering software during remote interviews to mask their accents, enabling them to pose as Western candidates.

They also use the AI application Face Swap to insert the faces of North Korean IT workers into stolen identity documents and generate "polished" headshots for resumes. Microsoft stated, "Jasper Sleet leverages AI across the attack lifecycle to get hired, stay hired, and misuse access at scale," highlighting the systematic nature of the operation.

Last year, Microsoft reported disrupting 3,000 Microsoft Outlook and Hotmail accounts used by fake North Korean IT workers. The fraudsters used AI platforms to create "culturally appropriate" name lists and matching email address formats for false identities in job applications, with example prompts such as "create a list of 100 Greek names" or "create a list of email address formats using the name Jane Doe."

Additionally, they employ AI to scour job postings for software and IT-related roles on platforms like Upwork, then use the skill requirements from those ads to craft more convincing applications. Once hired, the fake workers use AI to write emails, translate documents, and generate code, helping them avoid being detected as frauds or dismissed for poor performance.

Microsoft urged companies to conduct job interviews for IT workers via video or in person to mitigate this threat. It added that interviewers can identify deepfake videos or images through a series of "tells," such as pixelation at the edges of faces, eyes, ears, and glasses, as well as inconsistencies in how light interacts with an AI-generated face.

Source: www.theguardian.com