Users say Microsoft’s Bing chatbot gets defensive and testy

Published by
Relaxnews

Microsoft’s fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges being shared online by developers testing the AI creation. A Reddit forum devoted to the artificial intelligence-enhanced version of the Bing search engine was rife on Wednesday with tales of being scolded, lied to, or blatantly confused in conversations…


