Users say Microsoft’s Bing chatbot gets defensive and testy

San Francisco (AFP) – Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges shared online by developers testing the AI creation.

A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Wednesday with tales of being scolded, lied to, or blatantly confused in conversation-style exchanges with the bot.

The Bing chatbot was designed by Microsoft and the start-up OpenAI, which has been causing a sensation since the November launch of ChatGPT, the headline-grabbin…
