Bing’s AI chatbot has a question limit that is more of a problem than a solution

A man standing on stage at the Bing and Edge event

Getty Images/Jason Redmond

Bing’s AI chatbot has yet to be released to the public; however, select users have been given early access, and they have not been shy about sharing their experiences. Many of these users have made it their mission to test the chatbot’s capabilities and expose its flaws.

From revealing its confidential codename used internally by developers, to declaring its love to a New York Times writer and asking him to leave his wife, to wishing it was alive, there is no doubt the chatbot was getting out of hand.

Consequently, Microsoft decided to rein in the chatbot. As of last Friday, the chatbot has a limit of five chat turns per session and 50 chat turns per day.

“Very long chat sessions can confuse the underlying chat model in the new Bing,” Microsoft said in a blog post. “Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns.”

This solution, however, doesn’t address any underlying issues with the way chatbots form responses, or the ways their models are trained. Instead, it just makes it extremely difficult to have a conversation with the chatbot. 

As a test, I asked the chatbot to help me write an email. By the time I had finished making all of the changes I wanted, I couldn’t even access the final email: the Bing chatbot had booted me off.

Screenshot of ChatGPT Bing

Screenshot by Sabrina Ortiz/ZDNET

It’s easy to imagine how someone trying to perfect their code or compose the perfect essay could need more than five prompts to get their desired result. By cutting the conversation short, the limit renders the chatbot useless for most technical prompts (such as drafting code or text in a specific format), which has been ChatGPT’s claim to fame.

The new limitations also undercut the entire appeal of having a chatbot with human-like conversational capabilities. To start the chat, all I asked was a simple, “Hi! How are you feeling today?” only to be met with a curt, “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.🙏”

Bing ChatGPT screenshot

Screenshot by Sabrina Ortiz/ZDNET

This response stands in stark contrast to ChatGPT’s, which acknowledges that it is incapable of feelings but still kindly answers the question and asks how it can further assist you.

ChatGPT screenshot

Screenshot by Sabrina Ortiz/ZDNET

In trying to curb controversy, Microsoft stripped out the characteristics that made the chatbot helpful and unique, leaving us with what feels like a demo version. Although slightly messy, the original version was more useful and exciting than Bing’s current, more muted iteration.