On the road to greater social inequality

Relax: Autonomous cars look set to be allowed in the UK by the end of the year.

Could self-driving cars be racist? Experts worry that the machines of tomorrow are being inadvertently programmed to reflect human prejudices, and could make existing divisions even worse.

A young black woman walks into a state-of-the-art building at the Massachusetts Institute of Technology. Approaching a doorway, she looks into a camera, expecting to be allowed access – but nothing happens. “During my first semester at MIT I got computer vision software that was supposed to track my face,” she explains. “But it didn’t work until I put on a white mask.”

This is a scene from Coded Bias, a documentary released this week. The woman, Joy Buolamwini, wonders what the problem is: “Is it the lighting conditions? Is it the angle at which I’m looking at the camera? Or is there something more?”

The answer to the last question is yes. The film follows Buolamwini’s research, and the experience of ordinary people across the world, and comes to an alarming conclusion: that machine-learning algorithms can be racist and sexist.

“When you think of AI, it’s forward-looking,” she says. “But AI is based on data, and data is a reflection of our history.” The facial recognition technology that she was studying had been trained using data from people who were predominantly light-skinned and male.
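The mechanism Buolamwini describes can be illustrated with a deliberately simple sketch. Everything below is invented for illustration – the single "appearance score" feature, the group means, the sample sizes and the nearest-centroid "detector" are stand-ins, not her actual experiment or any real vision model. The point is only that when one group dominates the training data, the learned decision boundary sits close to that group, and members of the under-represented group fall on the wrong side of it.

```python
import random

random.seed(0)

# Toy illustration (not a real vision model): each "image" is a single
# invented appearance score. Group A faces cluster near 2.0, group B
# faces near 0.5, and non-face backgrounds near 0.0.
def sample(mean, n):
    return [random.gauss(mean, 0.3) for _ in range(n)]

# Skewed training set: 900 group-A faces but only 10 group-B faces.
train_faces = sample(2.0, 900) + sample(0.5, 10)
train_backgrounds = sample(0.0, 910)

# A nearest-centroid "detector" stands in for the learned model:
# it calls something a face if it is closer to the average training
# face than to the average background.
face_centroid = sum(train_faces) / len(train_faces)
background_centroid = sum(train_backgrounds) / len(train_backgrounds)

def detects_face(x):
    return abs(x - face_centroid) < abs(x - background_centroid)

def detection_rate(faces):
    return sum(detects_face(x) for x in faces) / len(faces)

rate_a = detection_rate(sample(2.0, 1000))
rate_b = detection_rate(sample(0.5, 1000))
print(f"group A faces detected: {rate_a:.0%}")
print(f"group B faces detected: {rate_b:.0%}")
```

Because the face centroid is pulled almost entirely towards group A, nearly all group-A faces are detected while most group-B faces are classified as background – the detector is not "told" to discriminate; the skewed data does it.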

Not only that, Buolamwini points out, but “Our ideas about technology that we think are normal are actually ideas that come from a very small and homogeneous group of people.”

Research into US tech companies supports this: the great majority of their employees are male and white. Although Asians occupy 40-50% of middle-ranking jobs at Cisco and eBay, they only have 10-20% of the management positions. Just 3% of employees at the five largest firms are African-American, and the figures for Hispanic people are almost as bad.

Meanwhile, women account for only 18% of computer science graduates at American universities. In 1984 the proportion was twice as high.

Given that algorithms are now used for everything from job recruitment to police surveillance, the potential for increasing social inequality is enormous. “Racism,” says Buolamwini, “is becoming mechanised.”

Leading Democrats in the US are so alarmed by the situation that they have been lobbying President Biden to make sure that minorities are properly represented on the government bodies overseeing the tech industry.

In the UK, the Law Commission has raised concerns about self-driving cars. It worries that they could “struggle to recognise dark-skinned faces in the dark” because facial recognition software may be less accurate at detecting “non-white and non-male faces”. They could also be bad at recognising wheelchairs and mobility scooters.

The problem, the commission says, is that cars are mainly designed by able-bodied young men. In the same way, the first airbags were calibrated to protect adult males, but used so much force that they were a danger to smaller, frailer people. Between 1990 and 2008, 291 people were killed by them.

According to the AA, “The last thing we need is the next generation of Mondeo Man being a racist, misogynist self-driving automobile.”

Could self-driving cars be racist?

Mean machines?

Some say yes. However sophisticated the technology, it ultimately depends on humans and the data they supply. If programmers fail to anticipate a problem – which they may well do if they only think about people like themselves – the car will not be able to deal with it. Self-driving vehicles must be designed to recognise and protect everybody.

Others argue that it is ridiculous to talk about machines as if they were living entities. A car cannot be racist or sexist any more than it can feel pain. And facial-recognition technology is hardly relevant in this context: if an autonomous vehicle is going to avoid running people over it needs to recognise the shape of their bodies, not their faces.

You Decide

  1. What limits should be placed on self-driving cars?
  2. Should tech companies be regulated in a similar way to food companies?

Activities

  1. As a team, analyse the make-up of your class in terms of what makes you happy. Conduct a survey of favourite colours, food, places etc and compile a chart showing the results.
  2. Divide into pairs. Write a song, using either an existing tune or one you compose yourselves, about a self-driving car that goes wrong and causes chaos.

Some People Say...

“With artificial intelligence we’re summoning the demon.”

Elon Musk (1971 – ), South African-born entrepreneur

What do you think?

Q & A

What do we know?
It is generally agreed that imperfect technology and the lack of legislation to control it make the situation worse. As things stand, tech companies can introduce algorithms with far-reaching consequences without the public agreeing to them, being able to check on them, or even realising that they exist. The recent case of UK sub-postmasters being wrongly prosecuted for theft and false accounting because of faults in the Post Office’s computer system shows how badly things can go wrong.
What do we not know?
One main area of debate is around how to avoid these problems. An obvious solution is to achieve a better social balance in tech companies: Google has promised to raise the proportion of under-represented groups in its workforce by 30% by 2025. Other suggestions include setting up independent bodies to check data and algorithms, and government regulators to approve those used in critical areas such as medicine. AI systems should only be used when they are of definite help to society.

Word Watch

Massachusetts Institute of Technology
One of America’s leading universities, specialising in science, maths, engineering and technology. It has produced 40 Nobel Prize winners.
Semester
A term which lasts for half a year. The name comes from a Latin word meaning six months.
Cisco
A company based in San Jose, California. Its two founders pioneered the idea of the local area network (LAN) as a way of connecting computers.
Law Commission
A body which reviews the laws of England and Wales and recommends reforms. It is currently looking into laws to govern self-driving cars.
AA
Automobile Association. Founded in 1905 to represent the interests of drivers, it was responsible for Britain’s first road signs.
Mondeo Man
In the run-up to the 1997 general election which brought Tony Blair to power, Blair characterised the kind of voter Labour needed to win over as an aspirational worker who owned a Ford Mondeo.
Misogynist
Someone who hates women. A misanthropist is someone who hates people in general.
