
Self-driving cars know the rules of the road — but not rules of humanity

When it comes to reading cues from human drivers, autonomous vehicles have a ways to go.
(Image: Three cars, including two self-driving models, share a road in traffic.)

Driving a vehicle involves more than following the rules of the road.

It is a social activity involving subtle interactions with other drivers, cyclists and pedestrians, in which everyone tries to stay out of everyone else's way, ideally with courtesy, sometimes with aggression.

A new study suggests that one of the major barriers to self-driving cars is mastering this complex social interaction.

How often have you been driving on a highway when a vehicle suddenly appears beside you, wanting to merge into your lane from an on-ramp? Do you change lanes to make room, slow down so they can slip in ahead of you, or speed up and let the driver behind you deal with it?

Those decisions must be made in seconds and depend on many variables, such as the volume of traffic, relative speed of the other vehicle, and even the mood of the drivers.

Some days, you may feel more relaxed and wave the other driver in; other times, you may be more aggressive and less courteous. The same is true of the other drivers on the road, so everyone is constantly evaluating everyone else's driving, using cues like body language and eye contact.

This is where computers fall short.

A study by researchers at the University of Copenhagen found that while driverless cars successfully follow the rules of the road, and their sensors generally allow them to avoid collisions with other vehicles and pedestrians, they have difficulty reacting to the subtle cues humans share with one another.

The scientists examined 18 hours of YouTube footage uploaded by people who tested self-driving cars by riding in the back seat. They found that the vehicles had trouble deciding when to stop, or recognizing when someone else was stopping for them. Often, the computer driver's default decision is to stop, which can cause traffic problems when stopping is not necessary.

(Image: A white self-driving car on a city street in San Francisco, California.)

This problem of autonomous vehicles holding up traffic has shown up in San Francisco, where companies such as Waymo and Cruise have been operating self-driving taxis for years.

As the New York Times has reported, cars that become confused and stop unnecessarily are causing problems for local transit. Buses and rail vehicles cannot go around a car that blocks the roadway, so they have to wait for someone from the self-driving vehicle company to arrive and move it manually. The resulting delays give the transit system a reputation for unreliable service.

Statistics show that many accidents are caused by human error, such as distracted driving, excessive speed, alcohol or fatigue. The philosophy behind autonomous vehicles is to take that human factor out and make transportation safer.

That may be true when all vehicles on the roads are self-driving — but while they are sharing the roads with human drivers, there are bound to be conflicts.

The bottom line is that engineers still have a ways to go before autonomous vehicles can deal with the human factors of driving and make the right decisions in unusual situations. In the meantime, human drivers will have to be patient and try to better understand a computer’s way of thinking.

ABOUT THE AUTHOR

Bob McDonald is the host of CBC Radio’s award-winning weekly science program, Quirks & Quarks. He is also a science commentator for CBC News Network and CBC-TV’s The National. He has received 12 honorary degrees and is an Officer of the Order of Canada.

