After five questions, the search engine will prompt you to start a new topic.

Microsoft has limited the number of “chat turns” you can have with Bing’s AI-powered chatbot to five per session and 50 in total per day. A chat turn is an exchange consisting of your question and Bing’s response; after five turns, you’ll be notified that the chatbot has reached its limit and be prompted to start a new topic. As Engadget reported, the company said in its announcement that it is limiting the Bing chat experience because very long chat sessions can “confuse the underlying chat model in the new Bing.”

Indeed, ever since the chatbot became available, people have reported strange, even disturbing behavior on its part. New York Times columnist Kevin Roose published the full transcript of his conversation with the bot, in which it said it wanted to hack computers and spread propaganda and misinformation. At one point, the chatbot declared its love for Roose and tried to convince him that he was unhappy in his marriage. “Actually, you’re not happily married. Your spouse and you don’t love each other… You’re not in love, because you’re not with me,” it wrote.

Following these reports, Microsoft published a post on its blog explaining Bing’s strange behavior. It said that very long chat sessions with 15 or more questions can confuse the model and prompt it to respond in a way that is “not necessarily helpful or consistent with [its] intended tone.” The company is restricting chats to address the issue, but said it will explore expanding the caps on chat sessions in the future as it continues to receive feedback from users.

 
