CrowdStrike report details scale of North Korea's use of AI in remote work schemes — 320 known cases in the last year, funding nation's weapons programs
The Democratic People's Republic of Korea is using generative AI tools to land its agents jobs at tech companies to fund its weapons programs.

CrowdStrike's latest Threat Report includes new information about China's increased targeting of North American telecommunications companies, Russia's continued efforts to support its invasion of Ukraine with cyberespionage, and other trends the security firm witnessed from July 2024 to June 2025. (Presumably excluding the period during which a faulty update to its software brought down global infrastructure.) But of particular interest is the sheer scale of North Korea's AI-supported tech worker schemes.
The company said that in the last 12 months, it has "investigated over 320 incidents where [North Korean] operatives obtained fraudulent employment as remote software developers" and that the hackers have "been able to sustain this pace by interweaving GenAI-powered tools that automate and optimize workflows at every stage of the hiring and employment process." Resumes? Fake. Social accounts? Fake. The person shown during a video call, the headshots, the messages they send? Fake, fake, fake.
"Once hired, [these] workers use GenAI code assistants [and] translation tools to assist with daily tasks and correspondence related to their legitimate job functions," CrowdStrike said. "Though an average employee may use GenAI in a similar manner, these tools—especially those enabling English-language communication—are especially crucial [to this group]. These operatives are not fluent in English, likely work three or four jobs simultaneously, and require GenAI to complete their work and manage and respond to multiple streams of communication."
We knew this had been happening—the Justice Department announced in July that it had made a flurry of arrests, sanctions, and investigations related to North Korea's fake tech workers. I noted at the time that U.S. officials started issuing warnings about these schemes in 2022 and that Google reported a similar uptick in activity related to these efforts in March, so CrowdStrike isn't pulling back the mask for the first time, as it were. But this new Threat Report drives home just how big the problem is.
It's kind of like watching an episode of "Scooby Doo" where the gang first reveals that some normal-seeming dude is a criminal. But so is that dude, and this other dude at that other company, and... wait, actually those are the same person using a combination of laptop farms and chatbots to seem like different people, and whoops it turns out Velma's an imposter too, and that's why that HBO show was so bad. Oh, and unlike a cartoon villain, North Korea will continue to get away with this.
CrowdStrike's recommendations for identifying these imposter hackers include, among other things, the adoption of "enhanced identity verification processes during the hiring phase that include rigorous background investigations and corroboration of online professional profiles" and the implementation of "real-time deepfake challenges during interview or employment assessment sessions." But those approaches incur additional costs — and North Korea will find ways to circumvent them.
The masks are being pulled back. It doesn't seem to be making a difference. So what now?

Nathaniel Mott is a freelance news and features writer for Tom's Hardware US, covering breaking news, security, and the silliest aspects of the tech industry.
Notton:
"What now?"
Maybe don't go for the cheapest labor?
Hire locally?
If the job can be faked with AI, did it really require someone you'll never meet in person?
When gazing into exploits for profit, don't be surprised when the exploit stares right back at you.
-Fran- (replying to Notton):
Shareholders will sue you, as a CEO, if you don't maximise profits, so. It's a problem of their own (the USA's) making.
Regards.
jp7189 (replying to Notton):
You're missing what's happening here. These are legitimately talented people who actually do the work, and do it well (while also stealing data). Part of the scheme is to appear to be working from a different country by using a combination of AI tools, remote PCs, and a complicated web of social media accounts that have history and appear legitimate.