
Key Takeaways: North Korean IT workers infiltrated over 320 companies in the past year, representing a 220% surge in operations. Operatives are using AI tools to create deepfake videos during interviews.
North Korean state-sponsored hackers have dramatically escalated their infiltration of global companies, weaponizing artificial intelligence to create convincing fake resumes, deepfake video interviews, and even forged military identification documents in an unprecedented cyber espionage campaign.
According to new cybersecurity reports and FBI warnings, North Korean IT workers infiltrated more than 320 companies in the past 12 months, a figure CrowdStrike describes as a staggering 220% year-over-year increase in successful infiltrations.
The operation, tracked by security firms as "Famous Chollima," has evolved far beyond traditional social engineering. Exclusive datasets, including browser histories and ChatGPT searches from more than a dozen North Korean computers, obtained by researchers through open-source analysis and shared with CNN, reveal how operatives use AI to craft job-seeking personas.
The most alarming development involves the use of real-time deepfake technology during video job interviews. Famous Chollima operatives very likely use real-time deepfake technology to mask their true identities in video interviews, according to CrowdStrike's 2025 Threat Hunting Report.
A single researcher with no image-manipulation experience, limited deepfake knowledge, and a five-year-old computer created a synthetic identity for job interviews in just 70 minutes, according to Palo Alto Networks' Unit 42 researchers, demonstrating how alarmingly accessible this technology has become.
The sophisticated AI arsenal includes tools for creating convincing professional personas. The North Koreans have used generative AI to help them forge thousands of synthetic identities, alter photos, and build tech tools to research jobs and track and manage their applications.
Most concerning is a recent development in which attackers used an AI tool to craft a fake draft of a South Korean military identification card, producing a realistic-looking image meant to make a phishing attempt seem more credible, according to research by Genians, a South Korean cybersecurity firm.
The scale of infiltration has prompted urgent government warnings. US Attorney for the District of Columbia Jeanine Ferris Pirro sent a direct message to corporate America: "This is a code red. Your tech sectors are being infiltrated by North Korea. And when big companies are lax and they're not doing their due diligence, they are putting America's security at risk," she said.
About 95% of the résumés Harrison Leggio gets in response to job postings for his crypto startup g8keep are from North Korean engineers pretending to be American, the founder estimates, illustrating the pervasive nature of the threat.
The FBI has taken decisive action. Between June 10 and June 17, 2025, the FBI executed searches of 21 premises across 14 states hosting known and suspected laptop farms.
These actions, coordinated by the FBI Denver Field Office, related to investigations of North Korean remote IT worker schemes being conducted by the U.S. Attorneys' Offices of the District of Colorado, Eastern District of Missouri, and Northern District of Texas. In total, the FBI seized approximately 137 laptops.
The operation generates substantial revenue for North Korea's weapons programs.
As described in a May 2022 tri-seal public service advisory released by the FBI and the State and Treasury Departments, such IT workers have been known to individually earn up to $300,000 annually, collectively generating hundreds of millions of dollars each year on behalf of designated entities such as the North Korean Ministry of Defense and others directly involved in the DPRK's weapons of mass destruction programs.
Beyond financial gains, the infiltrations pose serious data security risks. After being discovered on company networks, North Korean IT workers have extorted victims by holding stolen proprietary data and code hostage until the companies meet ransom demands, according to the FBI.
IT workers employed under this scheme also gained access to sensitive employer data and source code, including International Traffic in Arms Regulations (ITAR) data from a California-based defense contractor that develops artificial intelligence-powered equipment and technologies.
Cybersecurity experts paint a sobering picture of the threat's scope. "There are hundreds of Fortune 500 organizations that have hired these North Korean IT workers," Mandiant Consulting CTO Charles Carmakal said Tuesday during a media briefing at the RSAC 2025 Conference.
"Literally every Fortune 500 company has at least dozens, if not hundreds, of applications for North Korean IT workers," Carmakal said. "Nearly every CISO that I've spoken to about the North Korean IT worker problem has admitted they've hired at least one North Korean IT worker, if not a dozen or a few dozen."
The threat has reached even major technology companies. North Korean technical workers have been detected in Google's talent pipeline as job applicants, but none have been hired by the company to date, said Iain Mulholland, senior director of security engineering at Google Cloud.
The sophisticated nature of the AI-enhanced operations makes detection increasingly difficult. "FBI investigation has uncovered a years-long plot to install North Korean IT workers as remote employees to generate revenue for the DPRK regime and evade sanctions," said Assistant Director Bryan Vorndran of the FBI's Cyber Division.
However, some companies have developed detection strategies. "My favorite interview question, because we've interviewed quite a few of these folks, is something to the effect of 'How fat is Kim Jong Un?' They terminate the call instantly, because it's not worth it to say something negative about that," Adam Meyers, CrowdStrike's senior vice president, told a panel session at the RSA Conference in San Francisco on Monday.
Organizations are implementing enhanced verification procedures, including requiring candidates to perform specific movements during video calls that challenge deepfake software capabilities and implementing comprehensive identity verification workflows.
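The layered screening these organizations describe can be modeled as a simple checklist in which a candidate is cleared only if every verification step passes. The sketch below is purely illustrative: the check names, `CandidateScreening` class, and pass/fail logic are assumptions for demonstration, not any vendor's actual product.

```python
from dataclasses import dataclass, field

# Hypothetical verification layers, loosely based on the measures described
# above (document checks, live-movement deepfake challenges, and so on).
REQUIRED_CHECKS = (
    "document_authentication",   # government ID verified against issuing records
    "live_movement_challenge",   # candidate performs prompted movements on camera
    "network_consistency",       # interview IP/geolocation matches claimed location
    "reference_verification",    # references reached via independently sourced contacts
)

@dataclass
class CandidateScreening:
    """Tracks which verification layers a candidate has passed."""
    passed: set = field(default_factory=set)

    def record(self, check: str, ok: bool) -> None:
        if check not in REQUIRED_CHECKS:
            raise ValueError(f"unknown check: {check}")
        if ok:
            self.passed.add(check)

    def cleared(self) -> bool:
        # Every layer must pass; a single failed or skipped check blocks hiring.
        return self.passed == set(REQUIRED_CHECKS)
```

The design choice worth noting is the fail-closed default: a check that is skipped counts the same as one that failed, which mirrors the "multi-layered" posture security teams recommend against AI-assisted applicants.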
The North Korean IT worker scheme represents a convergence of state-sponsored espionage, advanced AI manipulation, and traditional social engineering that challenges conventional cybersecurity defenses.
As AI technology becomes more sophisticated and accessible, experts warn that these infiltration tactics will only become more convincing and harder to detect.
The ongoing operation underscores the critical need for enhanced due diligence in remote hiring processes and the importance of multi-layered security approaches that can adapt to AI-enhanced threats.
Security experts recommend implementing strict identity verification procedures, including document authentication, video call protocols that test for deepfake technology, and ongoing monitoring of remote employees.
Companies are also advised to enhance their incident response capabilities and establish information-sharing partnerships with cybersecurity firms and government agencies.
The threat has prompted calls for industry-wide collaboration to develop new standards and technologies specifically designed to counter AI-enhanced social engineering attacks.
The operation highlights the broader geopolitical implications of AI weaponization, as North Korea leverages sophisticated technology to circumvent international sanctions and fund prohibited weapons programs.
The success of these infiltration campaigns demonstrates how authoritarian regimes can exploit remote work trends and AI advancements to project power across international boundaries.
For more news and insights, visit AI Pulse on our website.
