Users say Microsoft’s Bing chatbot gets defensive and testy

Published by Relaxnews

Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation. A forum at Reddit devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Wednesday with tales of being scolded, lied to, or blatantly confused in conversations…


