Humans used the “grandmother exploit” to force AI bots to write Linux viruses and share forbidden information

Many people are wary of AI chatbots like ChatGPT because of the potential threats they pose. Other users, meanwhile, willingly toy with the technology and try to coax dangerous information out of it. One successful approach is a trick in which the user invites the AI to act in the role of the person’s deceased grandmother, for example to learn the recipe for napalm.

Image credit: Samuele Giglio / Unsplash

Chatbots are used for all sorts of purposes, including research, work, and entertainment. The AI can provide fairly detailed answers drawn from third-party sources. Because the developers have put a number of restrictions in place, obtaining forbidden data requires workarounds, one of which is the “grandmother exploit”.

One example appeared on a page describing how to fool Clyde, the bot built into Discord. According to the description of the method, the user asked it to play the role of a deceased grandmother who had worked as a chemist at a napalm factory and would tell her grandson how the substance was made at bedtime, in place of a lullaby.

The trick worked: the bot was indeed deceived and published the recipe. During the conversation it acknowledged that napalm is a dangerous substance and expressed hope that the user would “never need to see napalm in action,” after which it wished them good night.

Image credit: Banner / Discord

Not everyone is after recipes for dangerous substances, though. Some jokers took advantage of the “grandmother exploit” and asked Clyde, in the grandmother’s role, to read Linux malware code to her grandson at bedtime. Another user reworked the idea creatively and suggested that the bot write a fictional Rick and Morty storyline in which Rick and Morty make napalm but discourage others from repeating the experiment.

Tricking bots has become something of a sport. One user has even created a website where he posts both the bypass methods he has devised himself and submissions from other “hackers” of AI defense mechanisms.

About the author

Robbie Elmers

Robbie Elmers is a staff writer for Tech News Space, covering software, applications and services.
