Blog Layout

By The Cybersecurity Lair™ • July 30, 2024

Latest News | Stolen GenAI Credentials Flood Dark Web, Threatening Corporate Security

Around 400 GenAI account credentials are listed daily on dark web platforms

Recent findings by eSentire’s Threat Response Unit (TRU) reveal that approximately 400 stolen GenAI account credentials are being listed daily on dark web markets. The credentials, which include access to platforms like GPT, Quillbot, Notion, HuggingFace, and Replit, are frequently harvested through infostealer malware infecting users' browsers. Despite the closure of the notorious LLM Paradise market, which sold stolen GPT-4 and Claude API keys for as low as $15, cybercriminals continue to exploit these credentials for malicious activities such as phishing, malware development, and creating harmful chatbots. The theft of GenAI credentials poses a significant threat to corporate data, potentially exposing sensitive information including customer data, financial records, intellectual property, and employee PII. eSentire’s report also highlights critical threats like LLM jacking, prompt injection attacks, and aggressive data collection practices, with OpenAI credentials being among the most frequently stolen.

Key Points:


  • Around 400 GenAI account credentials are listed daily on dark web platforms.
  • Credentials for GPT, Quillbot, Notion, HuggingFace, and Replit are targeted.
  • Credentials are often obtained via infostealer malware.
  • The LLM Paradise market, although closed, previously sold stolen GPT-4 and Claude API keys.
  • Stolen GenAI credentials are used for phishing, malware creation, and malicious chatbots.
  • The impact on corporate data can be severe, exposing sensitive and private information.
  • Critical threats include LLM jacking, credential abuse, prompt injection attacks, and aggressive data collection.
  • OpenAI credentials are particularly targeted, with significant daily thefts.


Takeaways and Prevention Strategies:


  • Robust Security Measures: Companies should implement advanced multi-factor authentication (MFA) and usage monitoring to protect their GenAI accounts.
  • Dark Web Monitoring: Regular monitoring of dark web marketplaces for stolen credentials can help mitigate risks.
  • Educate Users: Raise awareness about phishing and malware risks to reduce the likelihood of infostealer infections.
  • Regular Security Audits: Conduct frequent security audits to identify and address vulnerabilities in systems and credentials management.
  • Prompt Incident Response: Develop and maintain a swift response plan for credential breaches to limit damage.


The proliferation of stolen GenAI credentials on the dark web represents a critical and escalating threat to corporate security. With cybercriminals leveraging these stolen credentials to perpetrate phishing attacks, develop malware, and exploit sensitive data, the need for heightened vigilance and robust security measures has never been more urgent. The ease with which these credentials are obtained and the potential impact on business operations highlight the pressing need for organisations to adopt comprehensive security practices, including advanced multi-factor authentication, dark web monitoring, and user education. As the landscape of cyber threats evolves, companies must remain proactive and adaptable to safeguard their digital assets and maintain the integrity of their operations.


Source and further reading.


Mascellino, A. . (2024, July 30). Stolen GenAI accounts flood dark web with 400 daily listings.
Infosecurity Magazine. https://www.infosecurity-magazine.com/news/genai-dark-web-400-daily-listings


Poireault, K. . (2024, July 28). What the OWASP Top 10 for LLMs means for the future of AI security.
Infosecurity Magazine. https://www.infosecurity-magazine.com/news-features/owasp-top-10-llm-means-future-ai/


Mascellino, A. . (2024a, July 29). New “LLMJacking” attack exploits stolen cloud credentials.
Infosecurity Magazine. https://www.infosecurity-magazine.com/news/llmjacking-exploits-stolen-cloud/

Share by: