
Image by WangXina, from Freepik
New Malware Spreads Via Fake AI Video Tool
A new malware campaign is targeting creators and small businesses by disguising itself as an AI-powered video tool.
In a rush? Here are the quick facts:
- Malware disguised as a free AI video tool targets creators and small businesses.
- Victims are tricked into downloading “Video Dream MachineAI.mp4.exe” malware.
- The malware steals browser credentials and crypto wallet data, and grants attackers remote access.
The scam, first reported by Morphisec, tricks users into uploading images to what appears to be a free AI platform, only to infect their computers with a dangerous new infostealer known as Noodlophile Stealer.
These fake AI sites are promoted through Facebook groups and viral posts, some reaching over 62,000 views. Victims believe they’re receiving AI-generated videos based on their uploads.
Instead, they download malware that steals browser credentials, cryptocurrency wallet data, and can even give hackers remote access to their systems using a tool called XWorm.
According to Morphisec's researchers, the campaign stands out for its use of AI hype to reach a newer, more trusting audience. Unlike older scams tied to pirated software, this one exploits the boom in AI tools.
The malware is hidden in a file called Video Dream MachineAI.mp4.exe, which poses as a video but is actually a disguised copy of CapCut, a legitimate video editor. Because Windows hides known file extensions by default, victims often see only the harmless-looking ".mp4" part of the name. Once launched, the file silently installs a chain of hidden programs that collect data and establish a backdoor.
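As an illustration of the double-extension trick described above, the short sketch below flags filenames whose final extension is executable while an earlier suffix imitates a harmless media file. The extension lists are illustrative assumptions, not taken from the Morphisec report:

```python
# Illustrative sketch: detect deceptive double extensions such as
# "Video Dream MachineAI.mp4.exe", where a media-style suffix hides
# the real executable extension. Extension sets are assumptions.
from pathlib import Path

MEDIA_EXTS = {".mp4", ".avi", ".mov", ".mkv", ".jpg", ".png", ".pdf"}
EXEC_EXTS = {".exe", ".scr", ".bat", ".cmd", ".com", ".msi"}

def looks_deceptive(filename: str) -> bool:
    """True if the last suffix is executable but the one before it
    imitates a harmless media or document file."""
    suffixes = [s.lower() for s in Path(filename).suffixes]
    if len(suffixes) < 2:
        return False
    return suffixes[-1] in EXEC_EXTS and suffixes[-2] in MEDIA_EXTS

print(looks_deceptive("Video Dream MachineAI.mp4.exe"))  # True
print(looks_deceptive("holiday.mp4"))                    # False
```

A check like this is one reason security tools recommend enabling "show file extensions" in Windows Explorer: the trailing ".exe" becomes visible to the user, not just to software.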
The final payload is controlled via a Telegram bot, which acts as a secret communication channel between the infected computer and the attacker. Investigators say the malware’s developer is likely Vietnamese, based on online traces and social media posts.
“Noodlophile” is now being sold in cybercrime markets as a malware-as-a-service product. The operation uses evasion techniques such as file renaming, command-line obfuscation, and password-protected archives to avoid detection.
This new threat highlights how cybercriminals are evolving, using the popularity of AI tools to lure victims into traps.