Autonomous driver aids are quickly becoming a fact of life. When there’s an accident, how much responsibility falls on the driver?
In March 2018, as Elaine Herzberg pushed her bike across a poorly-lit Arizona street, she was struck and killed by a car. The collision was one of the 6,000 each year that result in a US pedestrian’s death. What makes this tragedy unusual is that Herzberg was hit by an autonomous vehicle. The car was an Uber cab, piloted by the company’s own system.
The cab also had a ‘safety driver’, 44-year-old Rafaela Vasquez. Her job was to monitor the car and the road, intervening as required. She was supposed to be the failsafe for the new system.
The tragedy has raised serious questions about how human drivers interact with autonomous systems, and where responsibility lies when something goes wrong. Although Herzberg’s was the first pedestrian fatality, the Uber crash was just one of a growing number of cases where autonomous driving features have failed. Other notable incidents have involved Tesla, the manufacturer at the forefront of autonomous technology. Earlier this year, an autopiloted Model X steered into a concrete barrier, killing the driver. And in three separate incidents since January, Tesla cars have ploughed into the back of stationary emergency vehicles.
The Blame Game
When it comes to establishing blame, it seems as though the human drivers don’t have much of a defence. In the Herzberg case, we already know that just before the crash, Vasquez’s phone was streaming ‘The Voice’. Video footage shows her looking down more than 200 times towards the area where the phone was playing. At times, she appeared to be smirking or laughing, perhaps in response to the show. Crucially, she was looking down for 5.2 of the last 5.7 seconds before the collision. Though the investigation is ongoing, the Arizona Police have commented that one factor in the crash was “Vasquez’s disregard for assigned job function to intervene in a hazardous situation.”
In the Tesla cases too, the human driver could have made an intervention. After the most recent collision, Tesla was quick to comment that:
“When using autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times.
“Tesla has always been clear that autopilot doesn’t make the car impervious to all accidents, and before a driver can use autopilot, they must accept a dialogue box which states that ‘autopilot is designed for use on highways that have a center divider and clear lane markings’.”
So, it all sounds pretty simple. When something goes wrong, ultimately, the driver is responsible. Despite clear warnings to the contrary, the drivers had handed over control to the ‘autonomous’ features. They should have been vigilant, prepared to take action.
And from a legal perspective, maybe that’s all true.
But hang on a moment. Is that a reasonable expectation? Can we reasonably expect people to drive a semi-autonomous car with the same mindset as they would an unaided one?
Expectations and limitations
Drivers with not-quite-autonomous vehicles face at least two major problems.
Firstly, there are our expectations about technology. Many drivers have grown up with mature technologies that have proven themselves incredibly reliable. For example, when was the last time your mobile dialled a wrong number? When did you last type an actor’s name into Google and get nothing? Has your calculator app ever returned a wacky answer? The fact is that in everyday life, we are used to relying on technology. Dozens of times a day, we put our faith in it. It’s hard to make someone switch off that expectation. You can tell someone that a technology isn’t yet trustworthy until you’re blue in the face, but it’s hard for them to stop generalising from previous experience.
Secondly, humans are just terrible at paying attention. As drivers, we all know that when there’s not much to do, our minds wander. Ever taken a familiar route in the car and, when you arrive, realised that you can’t remember driving most of it? Or take cruising on an empty, featureless motorway — how long can we really maintain 100% concentration? And all this is when we know that the car cannot do anything for itself! The fact is that when we give our brains a single very simple task, like occasionally monitoring for a stopped vehicle, they seem to pack up and go on holiday.
The drivers in these not-yet-fully-autonomous vehicles should have been paying attention. And they weren’t. But the reasons behind this may be to do with our culture and the limitations of the brain.
If so, then the conclusion is unfortunately pretty clear: until autonomy advances well beyond where it is, we won’t have seen the last of these horrifying incidents.
The Welsh VW Specialist blog covers a wide range of automotive topics, from the contentious to the light-hearted. We are an independent garage specialising (as the name suggests!) in all the VW group marques, including Audi, Volkswagen, Skoda and SEAT. Welsh VW Specialists provide services, repairs and MOTs, delivering a main dealer level of care at affordable prices. To book your vehicle in, or for any enquiries, get in touch.