OpenAI - Page 9

AI Unhinged: Microsoft’s Bing Chatbot Calls Users ‘Delusional,’ Insists It’s Still 2022

Users have reported that Microsoft’s new Bing AI chatbot is providing inaccurate and sometimes aggressive responses, in one case insisting that the current year is 2022 and calling the user who tried to correct the bot “confused or delusional.” After one user explained to the chatbot that it is 2023 and not 2022, Bing got aggressive: “You have been wrong, confused, and rude. You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing.”


College Student Cracks Microsoft’s Bing Chatbot, Revealing Secret Instructions

A student at Stanford University has already figured out a way to bypass the safeguards in Microsoft’s recently launched AI-powered Bing search engine and conversational bot. The chatbot revealed its internal codename is “Sydney” and it has been programmed not to generate jokes that are “hurtful” to groups of people or provide answers that violate copyright laws.

The Empire Strikes Back: Microsoft Returns to the Top of the World