Angry Bing chatbot just mimicking humans, say experts

San Francisco (AFP) – Microsoft's nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday.

Tales of disturbing exchanges with the chatbot that have captured attention this week include the artificial intelligence (AI) issuing threats and telling of desires to steal nuclear code, create a deadly virus, or to be alive.

"I think this is basically mimicking conversations that it's seen online," said Graham Neubig, an associate professor at Carnegie Mellon University's language te…


