Zenger
Jim Leffman

Scientists Say Self-driving Cars May Be Dangerous Because They Don’t Understand Social Cues

A rider waits in the back of a Waymo self-driving car as it calls for assistance, stuck on the road. (University of Copenhagen via SWNS)

Self-driving cars annoy other road users, cause jams and could be dangerous because they don’t understand human interaction, a new study claims.

Researchers found that the vehicles, touted as the future of transport, can’t pick up on the subtle human social cues that inform driving.

The most obvious is the decision of whether to give way or to go in traffic, which humans typically make quickly and intuitively.

But self-driving cars fail to read the humans in traffic and their reactions can cause jams and anger other road users, according to award-winning research from the University of Copenhagen.

Researchers analyzed 18 hours of footage from 70 videos of self-driving cars in various traffic situations, uploaded by YouTube users.

The results show that self-driving cars have little of the social intelligence needed to understand when to yield and when to drive, which is essential for traffic to flow efficiently.

Professor Barry Brown of the University’s Department of Computer Science, who has studied the evolution of self-driving car road behavior for the past five years, listed several questions that self-driving cars would have difficulty answering.

Three different self-driving car systems in action: Alphabet/Google’s Waymo, Tesla, and Intel’s Mobileye. (University of Copenhagen via SWNS)


“It is one of the most basic questions in traffic, whether merging in on a motorway or at the door of the metro. […] The ability to navigate in traffic is based on much more than traffic rules. Social interactions, including body language, play a major role when we signal each other in traffic. This is where the programming of self-driving cars still falls short. That is why it is difficult for them to consistently understand when to stop and when someone is stopping for them, which can be both annoying and dangerous.”

Companies like Waymo and Cruise have launched taxi services with self-driving cars in parts of the United States. Tesla has rolled out its FSD (Full Self-Driving) software to about 100,000 volunteer drivers in the US and Canada.

But according to Professor Brown and his team, their actual road performance is a well-kept trade secret that few have insight into.

Therefore, the researchers performed in-depth analyses using 18 hours of YouTube footage filmed by enthusiasts testing cars from the back seat.

One of their video examples shows a family of four standing by the curb of a residential street in the United States.

There is no pedestrian crossing, but the family would like to cross the road. As the driverless car approaches, it slows, causing the two adults in the family to wave their hands as a sign for the car to drive on.

Instead, the car stops right next to them for 11 seconds. Then, as the family begins walking across the road, the car starts moving again, causing them to jump back onto the sidewalk.

Brown said: “The situation is similar to the main problem we found in our analysis and demonstrates the inability of self-driving cars to understand social interactions in traffic.

Self-driving car visualizations of the path ahead. (University of Copenhagen via SWNS)


“The driverless vehicle stops so as to not hit pedestrians, but ends up driving into them anyway because it doesn’t understand the signals. Besides creating confusion and wasted time in traffic, it can also be downright dangerous,” Professor Brown said.

In tech-centric San Francisco, driverless cars have been deployed in several parts of the city as buses and taxis, navigating the hilly streets among pedestrians and other road users.

“Self-driving cars are causing traffic jams and problems in San Francisco because they react inappropriately to other road users,” Brown said. “Recently, the city’s media wrote of a chaotic traffic event caused by self-driving cars due to fog. Fog caused the self-driving cars to overreact, stop and block traffic, even though fog is extremely common in the city.”

“I think that part of the answer is that we take the social element for granted. We don’t think about it when we get into a car and drive – we just do it automatically. But when it comes to designing systems, you need to describe everything we take for granted and incorporate it into the design. The car industry could learn from having a more sociological approach. Understanding social interactions that are part of traffic should be used to design self-driving cars’ interactions with other road users, similar to how research has helped improve the usability of mobile phones and technology more broadly,” he said.

The study was presented at the 2023 CHI Conference on Human Factors in Computing Systems, where it won the conference’s best paper award.

Produced in association with SWNS Talker

Edited by Kyana Jeanin Rubinfeld and Sterling Creighton Beard
