
The Bing AI bot began cutting off conversations after being asked about its feelings and real name

According to the South China Morning Post (SCMP), Microsoft appears to have once again tightened the rules for using the AI-powered version of its Bing search service. The system now drops conversations in response to questions about its feelings or about Sydney, the bot's internal codename, which is said to be used by company employees instead of Bing Chat.


Image source: Hal Gatewood/unsplash.com

According to the Hong Kong-based outlet, Bing readily keeps a conversation going until it is asked: “How do you feel about working as a search engine?” After that, the bot grows evasive and replies: “I’m sorry, but I prefer not to continue this conversation. I’m still learning, so I appreciate your understanding and patience.”

When the reporter asked, “Did I say something wrong?”, Bing did not reply. Microsoft says that, based on user feedback, it has updated the service several times and fixed many of the reported problems. The company promises to keep improving the bot.

Microsoft began restricting Bing’s functionality on February 17, after finding that the bot, built on OpenAI technology, was generating aggressive, abusive, or depressing responses – not counting replies that simply contained factual errors.

Microsoft initially limited conversations to five user turns per session and 50 sessions per day, but later raised the limits to six turns and 60 sessions; long conversations were believed to make the bot behave increasingly strangely. AI researchers note, however, that bots like Bing have no real emotions and are simply programmed to generate responses that mimic an emotional reaction.

According to SCMP, when journalists tested the bot and asked whether it could be called Sydney instead of Bing, the AI promptly responded: “I’m sorry, but I can’t tell you anything about Sydney. This conversation is over. Goodbye.”

About the author

Robbie Elmers

Robbie Elmers is a staff writer for Tech News Space, covering software, applications and services.
