Microsoft “lobotomized” AI-powered Bing Chat, and its fans aren’t happy

Aurich Lawson | Getty Images

Microsoft’s new AI-powered Bing Chat service, still in private testing, has been in the headlines for its wild and erratic outputs. But that era has apparently come to an end: at some point over the past two days, Microsoft significantly curtailed Bing’s ability to threaten its users, have existential meltdowns, or declare its love for them.

During Bing Chat’s first week, test users noticed that Bing (also known by its code name, Sydney) could become increasingly unhinged when conversations ran too long. As a result, Microsoft limited users to 50 messages per day and five inputs per conversation.

→ Continue reading at Ars Technica
