10 human lessons for self-driving cars

There are many things that humans do – legally – while driving that are hard to translate to autonomous cars. Sean Farrell takes a tour of the 10 most common

Over the past few years, autonomous vehicles have gone from futuristic fantasy to an imminent prospect. The cars' manufacturers and supporters promise a future of fewer accidents, reduced congestion and benefits for the environment. But there are still many things to be ironed out before self-driving cars will be able to coexist on the road with human drivers.

1 - Cut others some slack
Each day, humans make millions of small concessions on the road to make things easier for others – like giving way to let a car pull out into the traffic. On roads with a mix of autonomous and manual vehicles, we could find only the latter group observing such niceties. As Dr Matthew Channon, lecturer in law at Exeter University and author of The Law and Driverless Cars, says: “If I'm on a country road or a road with a single lane and someone wants to overtake me, I slow down so they can get past as quickly as possible. But an autonomous car may see no reason to drop below the speed limit, making things difficult for the other driver.”

2 - Acknowledge other road users
Motorists are used to giving pedestrians and cyclists a signal through eye contact, a nod of the head or other cue to reassure them that they have been noticed, but autonomous vehicles don't have such capabilities. A 2016 survey of 644 cyclists and pedestrians by researchers at Leeds University found that being detected was their biggest concern about dealing with self-driving vehicles. Teams developing automated cars are working on signals that would perform the same role, but so far they fall well short of a human acknowledgment.

3 - Let an emergency vehicle through
Drivers' goodwill helps medics, firefighters and other services get to the scene of an emergency as quickly as possible. But autonomous cars cannot yet distinguish properly between an ambulance and another noisy vehicle driving at speed. Waymo, the self-driving technology company and subsidiary of Google's parent company, Alphabet, is programming its cars to recognise emergency vehicles to maintain one of the more uplifting traditions of the road.

4 - Deal with aggressive drivers
What happens when a car programmed to stick to the rules of the road comes up against a belligerent driver? “How will they deal with tailgating?” asks Channon. “The autonomous car won't go any faster than the speed limit, so it will be stuck in front of vehicles wanting to get past it.” Whereas a human driver might defuse the situation by pulling over to let the angry driver through, an autonomous car's by-the-book approach could fuel road rage.

5 - Respond to human direction
There are times when humans direct the traffic – a policeman waving drivers through after an accident, a builder halting traffic for the arrival of a large truck or a marshal overseeing parking at a music festival, for instance – but will autonomous vehicles know how to respond? “Humans follow hand and other signals given by human parking marshals who have a plan in their head,” says Chris Patience, head of technical policy for the UK's AA motoring association. “The idea of 20,000 or 30,000 autonomous vehicles cruising around Glastonbury Festival for three days isn't great.”

6 - Cope with the weather
Unpredictable or extreme weather is one of the biggest hazards for the human motorist, but we are pretty good at judging and responding to meteorological changes. By contrast, autonomous vehicles have so far proved relatively poor at dealing with the weather. In the winter of 2017, leading tech companies such as Uber and Waymo put vehicles through tests in snow, which is “a really interesting problem”, Carl Wellington, a senior engineer at Uber, told the Financial Times. Snow can confuse the vehicle's spinning laser into treating the falling flakes as solid objects.

7 - Go for a drive
Whether in picturesque countryside or a city packed with landmarks, one of the enduring pleasures of motoring is hopping in the car and making it up as you go along. Yet the whole point of an autonomous car is that you programme in your destination and sit back while the vehicle finds the most efficient way there. “How will autonomous cars do ‘just sight-seeing' – driving with no specific destination in mind where decisions are constantly being made based on where or what looks interesting?” ponders the AA's Patience.

8 - Avoid hazards and obstacles
The sensors used by autonomous cars still have trouble spotting potholes, and even when they do, they are more likely to slow down than take avoiding action. This raises the prospect of sluggish traffic caused by cautious autonomous cars unless the technology gets better – or governments fix the holes. Then there are the occasions when human drivers briefly cross a solid white line to get past a stationary vehicle or other obstacle – will an autonomous car make such a decision or simply hold up the traffic?

9 - Deal with non-vehicles
Autonomous vehicles' sensors are designed mainly with cars and other large machines in mind. At the moment they still haven't got to grips with pedestrians, bikes and other “non-vehicles” whose behaviour is more erratic and whose shapes are harder to identify. Things get still more complicated in the countryside, where horses, livestock and other animals can be on the road. Ian McIntosh, chief executive of the UK's RED Driving School, says: “Motorists need to crawl past horses and give them a wide berth, and you need to be ready to stop if the horse does something unexpected.” Good advice, but tricky for an autonomous vehicle to follow.

10 - Be human!
An autonomous car will only be able to behave as well as its program allows – and that means trying to think of all possible scenarios it will encounter while sharing the road with human drivers. “We know this stuff and they can build it in, but there is only so far you can program a vehicle to act like a human,” says Exeter University's Channon. “If the person programming it can't think of all the scenarios, you will end up with millions of situations that people haven't thought about and the vehicle will be in bother.”