Reading Level 5
Science | Design & Technology | Citizenship | RE

Artificial morality: how to raise good robots

Do robots need morals? A leading law-maker has said that artificial intelligence must meet ethical standards. But even leading researchers do not fully understand the machines they build.

In the film I, Robot, a car crashes into a river. A robot has to decide whether to save a grown man or the 12-year-old girl sitting next to him. It calculates that the man has a higher chance of survival and leaves the girl to drown.

Today, such moral dilemmas are not merely science fiction. Artificial intelligence, the use of programs that can learn and make decisions, is widespread. Lord Evans, the chairman of the independent Committee on Standards in Public Life, has warned that "the public need reassurance about the way AI will be used". Before it is widely deployed by the government, he argues, we need to be sure that the technology is accountable, open, and free from bias.

Since an algorithm is only ever as good as its data, there are many instances of software reflecting some of society's worst biases. From failing to recognise people of colour as humans, to labelling a woman with frizzy hair as a furry animal, the technology we use does not always feel ethical.

Isaac Asimov, whose short stories inspired the film I, Robot, outlined a clear first rule of robotics: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." But harm is vague, and there are many ways that AI can cause problems without injuring someone. There are also situations where someone is always going to get hurt.
Every day, programmers building self-driving cars grapple with ethical problems that have plagued philosophers for centuries. Should a car heading towards an obstacle risk the life of its driver, or should it veer to one side and risk killing the passengers of another vehicle?

Machines are also now learning to make their own decisions. Few researchers at the cutting edge of AI technology fully understand what they have built. Though they can program the goals their robots pursue, no one understands how these machines actually think - it is a black box.

It is unclear how we might ever be able to train a non-human intelligence to understand our own morality. This is a difficult task when our own morality is itself a debated topic. We do not know whether there are objective moral rules or whether right and wrong depend on individual circumstances.

With all of these issues in mind, do we think that robots need morals?

Exterminate, exterminate

No, robots are simply tools. A calculator does not need morals because it only ever does what we ask of it. Even an automated vehicle follows a set of predetermined commands. We do not need to teach machines morality when we can simply code instructions into them. Just because some of these instructions might make moral judgments does not mean that the machine itself is moral.

Then again, as robots become more and more complex, it will become harder and harder to understand the choices that they make. Unless we imbue them with a sense of right and wrong, and teach them how to learn from our own morality, who knows what chaos they might bring? Anything that makes moral decisions should understand those decisions and be able to justify them.

Keywords

AI - A computer programme that has been designed to think.
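The idea that a machine only follows predetermined commands can be made concrete with a tiny sketch. The names and survival probabilities below are invented for illustration; the point is that a "decision" like the robot's in I, Robot can be a single rule written by a programmer, with the moral judgment baked in by a human rather than made by the machine:

```python
# A purely illustrative sketch: the "choice" is one line of ordinary code.
# The survival probabilities are made up for this example.

def choose_who_to_save(survival_chances):
    """Return whoever has the highest estimated chance of survival."""
    return max(survival_chances, key=survival_chances.get)

# The robot calculates the man is more likely to survive than the girl.
chances = {"man": 0.45, "girl": 0.11}
print(choose_who_to_save(chances))  # prints: man
```

Whether maximising survival odds was the right rule to code is exactly the kind of question the programmer, not the machine, has to answer.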


The Day is an independent, online, subscription-based news publication for schools, focusing on the big global issues beneath the headlines. Our dedicated newsroom writes news, features, polls, quizzes, translations… activities to bring the wider world into the classroom. Through the news we help children and teachers develop the thinking, speaking and writing skills to build a better world. Our stories are a proven cross-curricular resource published at five different reading levels for ages 5 to 19. The Day has a loyal and growing membership in over 70 countries and its effectiveness is supported by case studies and teacher endorsements.
