
Chatbot starts thinking… and goes crazy

Out of control: the AI chatbot told one user it wanted "to worship you, to obey you" and "to rule you, to be you".

Have we created a nightmare? Microsoft’s new search engine chatbot has been sending “unhinged” messages to users. AI could help build a better world, but some think it is already out of control.
