Bing to limit AI chat to five replies to prevent long, weird conversations

In an effort to keep Bing AI from getting weird, Microsoft is looking to cap its conversations at five replies.

Microsoft's Bing AI has been making news this week, though not for the reasons the tech giant might have hoped. It appears that the longer conversations ran, the more likely the AI bot was to malfunction in some spectacular ways. In an effort to curb instances of the AI bot attempting to gaslight users, Microsoft is looking to cap Bing chats at five turns per session and a total of 50 turns per day.

"Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages," reads the Microsoft Bing Blog (via The Verge). "After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won’t get confused."

For those who haven't followed this story and may be wondering how an AI bot could get confused, users reported increasingly unhinged behavior from the Bing AI bot as conversations ran long. A New York Times article chronicled a full two-hour exchange with the Bing AI bot, in which the bot's behavior gradually deteriorated. The Verge posted other instances of Bing AI's odd behavior, including one where the bot tried to gaslight a user into believing the year was 2022. There was even a bizarre exchange about harm and retaliation.

Bing AI example conversation (Source: Microsoft)

This week's Bing AI story has certainly been fascinating. Those interested in trying it out for themselves can read our guide on how to sign up for Bing AI today. We'll continue to follow this developing story and report back as more details emerge.

Senior Editor

Ozzie has been playing video games since picking up his first NES controller at age 5. He has been into games ever since, only briefly stepping away during his college years. But he was pulled back in after spending years in QA circles for both THQ and Activision, mostly spending time helping to push forward the Guitar Hero series at its peak. Ozzie has become a big fan of platformers, puzzle games, shooters, and RPGs, just to name a few genres, but he’s also a huge sucker for anything with a good, compelling narrative behind it. Because what are video games if you can't enjoy a good story with a fresh Cherry Coke?

From The Chatty
  • reply
    February 17, 2023 4:55 PM

    Ozzie Mejia posted a new article, Bing to limit AI chat to five replies to prevent long, weird conversations

    • reply
      February 17, 2023 5:45 PM

      Having played around a lot with generative AI I've found it's really easy to trap them into giving odd results. Image generators give you weird meatpiles and deep-fried, high-contrast oddities if you push them. The same seems to be true for large language models.

    • Zek
      reply
      February 17, 2023 6:18 PM

      Seems like a quick fix to prevent people from priming the bot to say nasty things and then taking a screenshot of the result.

      • reply
        February 17, 2023 11:36 PM

        No, it was going crazy without prompting. Luke discussed his experiences at length on the WAN show tonight.

    • reply
      February 17, 2023 11:23 PM

      WACK.
