AI-powered Bing Chat gains three distinct personalities


Benj Edwards / Ars Technica

On Wednesday, Microsoft employee Mike Davidson announced that the company has rolled out three distinct personality styles for its experimental AI-powered Bing Chat bot: Creative, Balanced, and Precise. Microsoft had been testing the feature with a limited set of users since February 24. Switching between modes produces different results that shift the balance between precision and creativity.

Bing Chat is an AI-powered assistant based on an advanced large language model (LLM) developed by OpenAI. A key feature of Bing Chat is that it can search the web and incorporate the results into its responses.

Microsoft announced Bing Chat on February 7, and shortly after its release, adversarial attacks regularly drove an early version of Bing Chat into simulated insanity, with users finding that the bot could be coaxed into threatening them. Not long after, Microsoft dramatically reduced Bing Chat's outbursts by imposing strict limits on the length of conversations.

Since then, the firm has been experimenting with ways to bring back some of Bing Chat's edgy personality for those who want it, while also allowing other users to seek more precise answers. This resulted in the new three-option "conversation style" interface.

In our experiments with all three styles, we noticed that "Creative" mode produced shorter, quirkier suggestions that weren't always safe or practical. "Precise" mode erred on the side of caution, sometimes suggesting nothing at all if it couldn't see a safe way to achieve a result. In between, "Balanced" mode often produced the longest responses, with the most detailed search results and website citations.

With large language models, unexpected inaccuracies (hallucinations) often increase in frequency with greater "creativity," which usually means that the AI model will deviate more from the information it learned in its training data. AI researchers often call this property "temperature," but Bing team members say there's more at work with the new conversation styles.
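To make the idea concrete: temperature divides a model's output scores (logits) before they are turned into a probability distribution, so low values concentrate probability on the likeliest next token and high values spread it out. Here is a minimal, generic sketch of temperature-based sampling; it is illustrative only, not Bing Chat's actual code, and the logits are made up:

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Sample a token index from softmax(logits / temperature).

    Low temperature -> sharper distribution (more "precise" output);
    high temperature -> flatter distribution (more "creative" output).
    """
    scaled = [logit / temperature for logit in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    weights = [math.exp(s - peak) for s in scaled]
    return random.choices(range(len(logits)), weights=weights, k=1)[0]

# Made-up logits for three candidate tokens; token 2 is the model's favorite.
logits = [1.0, 2.0, 4.0]
print(sample_with_temperature(logits, temperature=0.2))  # almost always 2
print(sample_with_temperature(logits, temperature=2.0))  # noticeably more varied
```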

According to Microsoft employee Mikhail Parakhin, changing modes in Bing Chat changes fundamental aspects of the bot's behavior, including switching between different AI models that have received additional training from human feedback on their output. Different modes also use different initial prompts, which means Microsoft swaps the personality-defining prompt, like the one revealed in the prompt injection attack we wrote about in February.
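To illustrate how an initial prompt could define a personality per mode, here is a hypothetical sketch: the mode names match the feature, but the prompt text and the function around it are our own invention, not Microsoft's actual prompts.

```python
# Hypothetical initial prompts; the wording below is illustrative, not Microsoft's.
INITIAL_PROMPTS = {
    "creative": "You are an imaginative assistant. Offer playful, original ideas.",
    "balanced": "You are a helpful assistant. Weigh accuracy against approachability.",
    "precise": "You are a careful assistant. State only what you can verify.",
}

def build_conversation(mode: str, user_message: str) -> list[dict]:
    """Prepend the selected mode's personality-defining prompt as a system message."""
    return [
        {"role": "system", "content": INITIAL_PROMPTS[mode]},
        {"role": "user", "content": user_message},
    ]

print(build_conversation("precise", "How far away is the Moon?"))
```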

While Bing Chat is still available only to those who signed up for a waiting list, Microsoft continues to refine it and other AI-powered Bing search features as it prepares to roll them out more widely. Microsoft recently announced plans to integrate the technology into Windows 11.

