Hackers use ChatGPT AI bot to create virus

Since its launch, the AI bot ChatGPT has been tried on a variety of tasks: not only answering questions, but also writing professional articles, essays, poems and computer code. As it turns out, the code deserves careful scrutiny, since it can be malicious if a user gives the AI such a task.

Image source: Moritz Erken / unsplash.com

Cybersecurity experts from Check Point Research have published a report describing how members of hacker forums use ChatGPT to write malicious code and phishing emails – some of them with little or no coding experience. One example in the report describes a Python script that, with a few tweaks, can be turned into ransomware capable of encrypting data on a user’s computer. Another Python script created by ChatGPT searches the local computer for files of a certain type, such as PDF, compresses them and sends them to a potential attacker’s server – a standard information-theft scenario.

In Java, the neural network produced code that covertly downloads the PuTTY SSH and Telnet client and later launches a PowerShell interface. In another example, a script written by ChatGPT runs an online marketplace for buying and selling compromised accounts, bank card details, malware, and other virtual goods traded on the dark web. The script connected to a third-party interface to fetch up-to-date quotes for the most popular cryptocurrencies in order to simplify payment calculations.

The Check Point Research researchers themselves tried to simulate a hacker attack using the neural network – and the AI did not disappoint. The bot obligingly produced a convincing phishing email for them, informing the recipient that their account had been suspended by a hosting provider and urging them to open an attached Excel file. After several attempts, ChatGPT also wrote a malicious VBA macro to embed in that file. But Codex, the specialized AI code-generation system, proved an even more powerful tool, giving the researchers a whole arsenal of malware: a reverse shell and scripts for port scanning, sandbox detection, and compiling Python code into a single Windows executable.

About the author

Robbie Elmers

Robbie Elmers is a staff writer for Tech News Space, covering software, applications and services.
