A ransomware group has added a new tactic to its extortion efforts: threatening to submit stolen artwork and user data to artificial intelligence (AI) companies for use in training large language models (LLMs). The group, known as LunaLock, claims to have breached Artists&Clients, a platform that connects independent artists with paying clients, and is demanding $50,000 in cryptocurrency to prevent the data's release.
A Ransomware Attack With a New Angle
Around August 30, a message appeared on the Artists&Clients website attributed to LunaLock. The group stated that it had stolen and encrypted data from the site, urging users to pressure the platform’s operators into paying a ransom. The notice warned that if the ransom was not paid, the attackers would publish the stolen material on a Tor site and submit the artworks to AI companies so the data could be included in training datasets.
The ransom note promised that payment in Bitcoin or Monero would result in data deletion and file decryption. To increase pressure, the hackers included a countdown timer, giving the site’s administrators a limited period to comply. The warning also raised the specter of legal consequences, suggesting that the release of personal data could expose the company and its users to fines under privacy laws such as the European Union’s General Data Protection Regulation (GDPR).
Ransomware attacks typically involve threats to leak or sell sensitive data, but the explicit mention of AI training is novel. By adding this element, LunaLock is targeting a particularly sensitive issue for digital artists, many of whom are already concerned about their work being used without consent to train AI systems. The prospect of stolen art being deliberately fed into these models is a new form of leverage.
Artists and Experts React to the AI Threat
Cybersecurity experts say this tactic represents a shift in the way attackers attempt to exert pressure. “This is the first time I see a threat actor use training AI models as part of their extortion tactic,” Tammy Harper, senior threat intelligence researcher at security firm Flare, told 404 Media. She noted that while it has long been assumed that stolen data could end up in AI systems, explicitly using that threat is unprecedented.
Harper added that the move is likely to resonate strongly with artists. For creative professionals already wary of how AI systems use publicly available content, the idea that ransomware groups might funnel stolen work into training datasets raises the stakes. “It’s a very sensitive subject for this type of victim,” Harper said, pointing out that LunaLock may be counting on artists and clients to pressure the site into paying.
How the attackers would follow through on the threat is less clear. In theory, stolen artworks could be posted openly online, where AI training systems that rely on large-scale web scraping might eventually capture them. Alternatively, attackers could attempt to upload the material directly to services that allow users to contribute content, though companies’ policies vary on whether such submissions are used to train models.
Website Offline and Response Unclear
As of this writing, Artists&Clients is offline, with attempts to visit the site returning a Cloudflare error. Screenshots of LunaLock's ransom message have been widely shared by users and cybersecurity researchers on social media. Google also indexed the ransom note, which briefly appeared in the website's description in search results.
The company behind Artists&Clients has not issued a public statement and did not respond to requests for comment from 404 Media. This silence leaves both users and observers uncertain about whether the ransom will be paid or what steps are being taken to secure the platform and its community of artists.
The situation highlights the evolving landscape of ransomware attacks, where hackers are adopting new threats tailored to their targets. By linking stolen creative work with AI training, LunaLock has tapped into one of the most contentious debates in the art and technology world. Whether the group follows through or not, the case underscores the risks that digital platforms face when handling both sensitive user data and intellectual property.