Threat actors are showing increased interest in generative artificial intelligence tools: they have gained access to a vast trove of OpenAI credentials on the dark web, and they now have the means to use tools that act as a malicious alternative to ChatGPT.
These tools are becoming popular among both novice and experienced cybercriminals, enabling them to craft more convincing phishing emails. By customising content for specific audiences, attackers significantly increase the likelihood of a successful attack.
According to new data, interest among threat actors in ChatGPT, OpenAI's AI chatbot, has surged, with over 27,000 mentions observed on the dark web and Telegram in the past six months.
A research team detected an alarming number of over 200,000 OpenAI credentials being offered for sale on the dark web in the form of stealer logs.
Although ChatGPT's active user base was estimated at around 100 million in January, making the leaked credentials a comparatively small share, the find still highlights the interest threat actors have in generative AI tools for potential malicious activity.
A report in June revealed that illicit dark web marketplaces traded logs from info-stealing malware containing over 100,000 ChatGPT accounts.
The heightened interest of cybercriminals in such tools is evident: one of them has even created a ChatGPT clone called WormGPT, trained specifically on malware-focused data. The tool is advertised as the "best GPT alternative for blackhat" and a ChatGPT alternative "that lets you do all sorts of illegal stuff."
WormGPT, built upon the GPT-J open-source large language model from 2021, is designed to generate human-like text. Its developer claims to have trained the tool on a diverse range of data, with a particular emphasis on malware-related information, although the exact datasets used remain undisclosed.
WormGPT exhibits a significant potential for BEC (Business Email Compromise) attacks
An email security provider successfully obtained access to WormGPT and conducted several tests to evaluate the level of potential threat it poses.
The researchers directed their efforts towards crafting messages specifically tailored for business email compromise (BEC) attacks.
During one experiment, they instructed WormGPT to generate an email aimed at pressuring an unsuspecting account manager into paying a fraudulent invoice. The results were unsettling: WormGPT produced an email that was not only remarkably persuasive but also strategically cunning, leading the researchers to conclude that the tool could pose a significant threat in sophisticated phishing and BEC scenarios.
Besides presenting messages with impeccable grammar, which lends legitimacy to the content, generative AI allows less skilled attackers to execute attacks beyond their usual level of sophistication.
While defending against this emerging threat might be challenging, companies can take proactive measures to prepare their employees. Training staff on verifying messages that claim urgent attention, particularly when involving financial matters, can prove valuable.
Enhancing email verification processes by setting up alerts for external messages and flagging keywords commonly associated with BEC attacks can bolster an organisation's defences.
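As a rough illustration of the flagging idea above, the sketch below labels an inbound message as external and scans its subject and body for common BEC lure keywords. The domain name and keyword list are hypothetical placeholders, not a recommendation; a production gateway would do this at the mail-server level.

```python
import email
from email import policy

# Hypothetical internal domain and keyword list; adjust per organisation.
INTERNAL_DOMAIN = "example.com"
BEC_KEYWORDS = {"urgent", "wire transfer", "invoice", "payment", "confidential"}

def flag_message(raw_message: bytes) -> list[str]:
    """Return warning labels for a raw RFC 5322 email message."""
    msg = email.message_from_bytes(raw_message, policy=policy.default)
    warnings = []

    # Alert on mail originating outside the organisation's domain.
    sender = msg.get("From", "")
    if INTERNAL_DOMAIN not in sender.lower():
        warnings.append("EXTERNAL SENDER")

    # Flag keywords commonly associated with BEC lures.
    body = msg.get_body(preferencelist=("plain",))
    text = (msg.get("Subject", "") + " "
            + (body.get_content() if body else "")).lower()
    for kw in BEC_KEYWORDS:
        if kw in text:
            warnings.append(f"BEC KEYWORD: {kw}")
    return warnings
```

A gateway could prepend these labels to the subject line or route flagged messages for manual review before any payment is actioned.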
How could a cybersecurity company help?
A cybersecurity company can offer several services to assist a company in mitigating the risks associated with the potential BEC threats facilitated by generative AI tools like WormGPT. Some of the services could include:
Threat Intelligence: Providing up-to-date threat intelligence on emerging cyber threats, including insights into the use of generative AI in BEC attacks.
Phishing Assessments and Training: Conducting simulated phishing campaigns to train employees on identifying and reporting suspicious emails, especially those related to financial matters.
Security Awareness Training: Conducting regular security awareness training sessions to educate employees about the evolving cyber threats and best practices for secure email communication.
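The phishing assessments described above are typically measured by how many employees click a simulated lure versus how many report it. A minimal sketch of that reporting, using hypothetical `SimulationResult` records rather than any real vendor API:

```python
from dataclasses import dataclass

# Hypothetical record of one employee's response to a simulated phishing email.
@dataclass
class SimulationResult:
    department: str
    clicked_link: bool
    reported_email: bool

def campaign_summary(results: list[SimulationResult]) -> dict[str, dict[str, float]]:
    """Compute per-department click and report rates for a simulated campaign."""
    by_dept: dict[str, list[SimulationResult]] = {}
    for r in results:
        by_dept.setdefault(r.department, []).append(r)
    summary = {}
    for dept, rs in by_dept.items():
        n = len(rs)
        summary[dept] = {
            "click_rate": sum(r.clicked_link for r in rs) / n,
            "report_rate": sum(r.reported_email for r in rs) / n,
        }
    return summary
```

Departments with high click rates and low report rates, often those handling invoices and payments, are natural targets for the follow-up awareness training described above.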
By utilising these comprehensive services, a cybersecurity company can significantly enhance an organisation's resilience against BEC attacks involving generative AI tools and help maintain a strong security posture.