ABC News
Science
Ariel Bogle

There's an accident ahead. Who do you think should be spared?

Would you let a car determine who dies?

Self-driving cars may soon have to make troubling, very human choices. Their software could carry a programmed answer to the question: if an accident is inevitable, who or what is more valuable?

A human or a pet? Passengers versus pedestrians? The young or the elderly?

Researchers from the Massachusetts Institute of Technology built an online game called the Moral Machine to test how people around the world would answer those questions.

Players were shown unavoidable accidents with two possible outcomes, depending on whether the car swerved or stayed on course, and were asked to choose the outcome they preferred. The research gathered 40 million decisions from people in 233 countries and territories.

The results, published today in the journal Nature, are just one step towards a social consensus on how we expect driverless cars to behave, given it will be humans who write the code.

While there were intriguing trends from country to country, globally Moral Machine players showed a preference for sparing babies, little girls, little boys and pregnant women.

Of course, humans don't always make clear, thought-out decisions when faced with a road accident, and it's not clear if driverless cars will do any better.


Want more charts?

This is part of a new daily series featuring charts which tell a story. If you know of some data that fits the bill, we'd love to hear about it.
