An unknown developer has created their own analogue of the ChatGPT AI chatbot that, unlike the original, operates on the other side of the law: the WormGPT chatbot is designed to help cybercriminals.
WormGPT first appeared in March, and in June the developer began selling access to the platform on a popular hacker forum. Unlike the reputable ChatGPT or Google Bard, the hacker chatbot has no restrictions preventing it from answering questions about illegal activity. To back up this claim, the developer posted screenshots showing that WormGPT can be asked to write malware in Python and will offer advice on organizing a cyberattack. The chatbot is built on GPT-J, a relatively dated open-source language model from 2021, which was trained on malware-development materials to produce WormGPT.
Experts at the cybersecurity company SlashNext tried WormGPT, and in response to their request the chatbot produced a convincing phishing email aimed at compromising corporate email accounts, complete with clever strategic steps for carrying out the attack.
The developer priced access to WormGPT at €60 per month or €550 per year. The product may be far from perfect: one buyer complained about the platform's poor performance. Either way, it is a clear sign that generative artificial intelligence can become a weapon in the hands of cybercriminals, and these technologies will only improve over time.