Self-driving cars lack social intelligence in traffic



by University of Copenhagen

Credit: Unsplash/CC0 Public Domain

Shall I yield or go? Whether you’re entering a metro station or merging onto a motorway, this is one of the most fundamental questions in traffic. Humans usually make the decision quickly and instinctively, because it rests on social interactions that we are trained in from the moment we learn to walk.

However, even as they are deployed in many parts of the world, self-driving cars still have difficulty negotiating these social interactions in traffic. Recent research from the Department of Computer Science at the University of Copenhagen demonstrates this. The researchers analyzed a wide range of YouTube videos showing self-driving cars in diverse traffic situations. The findings suggest that self-driving cars struggle in particular with knowing when to ‘yield’: when to give way and when to keep driving.

“Maneuvering in traffic requires much more than just following the rules of the road. Social interactions, including body language, play an important role when we signal to one another in traffic. This is where the programming of self-driving cars still falls short. As a result, they can find it difficult to reliably tell when to stop and when someone is stopping for them, which can be both dangerous and unpleasant,” says Professor Barry Brown, who has spent the last five years researching how autonomous vehicles behave on the road.

Error – this is an autonomous vehicle!

Companies such as Waymo and Cruise have introduced self-driving taxi services in certain areas of the US. Tesla’s FSD (full self-driving) model has been rolled out to roughly 100,000 volunteer drivers in the US and Canada. And the media is brimming with stories about how well self-driving cars perform.

However, Professor Brown and his colleagues say that their real performance on the road is a closely guarded trade secret that very few people have insight into. The researchers therefore conducted an in-depth analysis of 18 hours of YouTube footage shot by enthusiasts testing the cars from the back seat.

In one of the video examples, a family of four stands by the curb of a residential street in the US. The family wants to cross the road, but there is no pedestrian crossing. As the driverless car approaches and slows down, the two adults wave their hands to signal it to drive on.

Instead, the car stops in front of them for 11 seconds. Then, just as the family begins to cross the street, the car suddenly picks up speed again, forcing them to jump back onto the sidewalk. The person in the back seat then rolls down the window and calls out, “Sorry, self-driving car!”

“The situation illustrates how self-driving cars are unable to comprehend social interactions in traffic, and it is similar to the main problem we found in our analysis. The self-driving car stops so as not to hit the pedestrians, but because it does not recognize their signals it ends up driving into them anyway. Besides causing confusion and wasting time in traffic, it can also be downright dangerous,” says Professor Brown.

A convoy through hazy Frisco

The performance of self-driving cars can be observed up close in tech-centric San Francisco, where autonomous vehicles are allowed to roam the city’s hilly streets alongside people and other road users such as buses and taxis. According to the researcher, this has generated a great deal of opposition among the city’s residents:

Because they respond inappropriately to other road users, self-driving cars in San Francisco are causing traffic jams and other problems. The city’s media recently reported on a chaotic traffic incident caused by self-driving cars reacting to fog. Even though fog is a common occurrence in the city, Professor Brown says it prompted the self-driving cars to overreact, stop, and block traffic.

The self-driving car industry has invested more than DKK 40 billion in the technology over a decade-long development effort. Yet the result has been vehicles that still make numerous driving errors, holding up other motorists and disrupting the smooth flow of traffic.

Asked why he believes it is so difficult to teach self-driving cars to understand social interactions in traffic, Professor Brown responds: “I think part of the answer is that we take the social element for granted. When we get in a car and drive, we just do it instinctively, without giving it any thought. But when you design systems, you have to spell out what we take for granted and incorporate it into the design.”

“The automotive industry could benefit from taking a more sociological perspective. Just as research has improved the usability of mobile phones and technology in general, the design of how self-driving cars interact with other road users should take into account the social interactions that are part of traffic.”

The paper is published in the Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems.

