OpenAI Reportedly Warned Microsoft About Bing’s Bizarre AI Responses

OpenAI reportedly warned Microsoft to move slowly on integrating GPT-4 into its Bing search engine in order to avoid the kind of inaccurate and unpredictable responses the chatbot launched with. According to The Wall Street Journal, the OpenAI team cautioned against pushing out a chatbot based on an unreleased version of GPT-4 too early.

Microsoft went ahead despite warnings that it would take time to minimize the inaccurate and strange responses. Within days of Bing Chat's launch in February, users discovered the chatbot was unpredictable: it could insult users, lie to them, sulk, gaslight people, and even claim to identify its enemies.
