Design & Technology | Computing

Chatbot starts thinking… and goes crazy

Out of control: I want "to worship you, to obey you" and "to rule you, to be you", the AI chatbot told one user.

Have we created a nightmare? Microsoft’s new search engine chatbot is sending people messages that make no sense.
