Just last week, Microsoft restricted testers' use of the ChatGPT-based AI chatbot in Bing after multiple abuses of the technology. However, following repeated complaints from testers on social media, the company has decided to once again expand the experimental program.
Microsoft is already relaxing the restrictions. The company said it is beginning to test a new option that lets users change the chatbot's tone. The modes are Precise (concise, on-topic answers), Creative (longer, "chattier" answers), and Balanced (a combination of the two).
After numerous reports of strange bot behavior (such as "splitting" into multiple virtual identities, one of which offered adult content, according to The Verge) and repeated attempts by enthusiasts to trick the system, Microsoft announced new rules for the bot on Friday that capped the length of conversations: the company found that more than five user turns in a dialogue could provoke rather strange AI responses. In addition, the number of sessions per day was limited to fifty.
Now the company has raised the limit to six turns per conversation and sixty sessions per day for those with access to the test program. It promises to increase the daily cap to one hundred sessions soon and to eventually allow unrestricted back-and-forth within a single dialogue again.
Previously, Ars Technica reported that Reddit users complained that Microsoft had “lobotomized” the bot, leaving “a shell” of its former personality.
The Bing team responded to these complaints in a blog post, saying that long, complex conversations with the bot were rare in internal testing, and that the point of open testing with a limited group of users is precisely to surface such atypical situations so the team can learn from them and improve the system. Microsoft said it is restoring the possibility of longer conversations in response to the negative reaction to the restrictions.