AVs - are we solving the wrong problem?
The death of a pedestrian in a collision with an autonomous vehicle (AV) raises a number of questions. Is this technology the wrong answer?
Car hits pedestrian crossing the road
On 18 March 2018 at around 10pm, Elaine Herzberg was crossing a road in Tempe, Arizona, wheeling her bicycle. She was struck and killed by a Volvo SUV travelling at about 40 mph. The vehicle was operated by Uber and was in autonomous mode at the time, with a driver behind the wheel. It apparently did not slow down before the impact. So why did that fatal crash happen? Three reasons:
Firstly, as evidenced by the video (caution: distressing scenes) released by Uber, the driver was switching her attention between the road and something on the centre console. She was therefore not in a position to take control in an emergency, as was expected of her.
Secondly, the software controlling the vehicle failed to recognise that an object - it really doesn't matter at this point that the object was a human with a bicycle - was about to intersect the vehicle's path. The driver (though can we really call her a driver when she was effectively a passenger, and indeed a victim of a traumatic incident herself?) would have expected the vehicle's Lidar system to be fully capable of detecting and recognising the danger, even at night. And yet it did not. Uber immediately said it was "pausing its self-driving car operations" pending investigation.
Lastly, the victim was wheeling her bicycle across a road, between intersections, at night, with no visible lights or reflective clothing. And apparently without looking until the last split second.
So who was ultimately responsible for Elaine's death? Uber, the 'driver' Rafaela Vasquez, or Elaine herself? In this case, Uber have apparently accepted primary liability, having hurriedly "reached a settlement" with the family. Presumably a very large amount of money was handed over in exchange for an agreement not to pursue any further claims.
In my view, Elaine Herzberg would probably have been killed even if the vehicle had not been autonomous. At 40 mph, it would take the average car and driver nearly 140 feet (42 metres) to stop: roughly 0.67 seconds after perceiving the danger, the average driver hits the brake pedal, and it takes a further 2.6 seconds or so to come to a stop.
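The stopping-distance arithmetic above can be sketched in a few lines of Python. The perception-reaction time (0.67 s) and braking time (2.6 s) are the figures quoted above; the constant-deceleration assumption is mine, and it gives a best-case (shortest) distance:

```python
# Back-of-envelope stopping distance at 40 mph.
# Assumed model: full speed during the reaction phase, then
# constant deceleration to a halt during the braking phase.

MPH_TO_MS = 0.44704  # metres per second in one mile per hour

speed_ms = 40 * MPH_TO_MS   # ~17.9 m/s
reaction_time = 0.67        # seconds before the brake pedal is hit
braking_time = 2.6          # seconds of braking to a standstill

# Distance covered at full speed while the driver reacts
reaction_distance = speed_ms * reaction_time

# Under constant deceleration, average speed while braking is half
# the initial speed, so distance = (speed / 2) * braking_time
braking_distance = (speed_ms / 2) * braking_time

total_m = reaction_distance + braking_distance
total_ft = total_m / 0.3048
print(f"total stopping distance: {total_m:.0f} m ({total_ft:.0f} ft)")
```

Run as written, this simple model lands around 35 m (about 116 ft); the "nearly 140 feet" figure will include real-world factors - road surface, tyre condition, a less-than-instant build-up of braking force - that a constant-deceleration model ignores.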
Look at the video again. At 0:06 Elaine is not even visible. At 0:07 she emerges from the darkness into the area covered by the car's headlights. At 0:09 - impact. Most people, myself included, would barely have got their foot onto the brake pedal before the impact. Yes, the car might have slowed down enough to reduce the force of the collision so that the incident wasn't fatal, but it would still have been a nasty one.
Now, here's where I get controversial.
While the Volvo SUV failed to recognise Elaine and her bicycle as a threat, Elaine should not have been there in the first place. Wheeling a bicycle across a four-lane road at night without reflective clothing, and apparently without noticing a car was coming, is negligent in the extreme. Fatally so, in this case.
I submit that it was Elaine's fault that she was struck by a car, Uber's fault that she was killed by it.
Not paying attention
Elaine wasn't paying attention when she crossed the road. The person behind the wheel was not paying attention either. But Elaine is not alone in this. It is something I see all too often on the roads around me. People walk across roads all the time, eyes focussed on the phones in their hands, headphones on. Why we always blame the drivers of the cars involved in the resulting collisions is beyond me. Should we not also hold pedestrians equally accountable for their actions?
Another fatal crash involving a Tesla and an articulated lorry (tractor-trailer) was, according to investigators, not the fault of Tesla's Autopilot system because "situations involving crossing traffic are beyond the performance capabilities of the system".
Say what?? Autonomous cars are incapable of detecting and preventing collisions with crossing traffic? Then how the heck did they get approval for road testing?
Allegedly, 90% of road traffic accidents are the result of human error. Okay, I can believe that. What I am frankly struggling with is the idea that the only right answer to this problem is to hand over control to a system of computers that, at this point, is very good at preventing rear-end collisions but cannot detect someone walking across the road in front of it, or a car pulling out from the next lane. In this example (first 23 seconds), the driver assumed control because he WAS paying attention.
The automotive and technology industries have a vested interest in AVs. And they are persuading you, me and governments to just accept that they are the future. Tesla claims that "If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident." I have no reason to dispute that statistic, but does that mean Autopilot is the only answer to the problem?
This is possibly the most dangerous time of all for AVs - they are really rather clever, but only to a point. And when we reach the point where the system stops being clever - like not recognising that a woman and her bicycle might require you to slow down or swerve - it becomes even more dangerous than simply relying on our own abilities.
Autonomous cars CAN and DO prevent collisions a lot of the time. I acknowledge that. And in future they will only get better at it. But until all AVs reach Level 5 - full automation, with no human intervention required - there will always be mistrust and potential confusion over whether the human being should be in control. Jim McBride, autonomous vehicles expert at Ford, said "the biggest demarcation is between Levels 3 and 4." He is focused on getting Ford straight to Level 4, since Level 3, which involves transferring control from car to human, can often pose difficulties. "We're not going to ask the driver to instantaneously intervene - that's not a fair proposition," McBride said.
Not a fair proposition - exactly! But that's where we are now.
Not the only answer
What I am advocating, though, is that we stop thinking of AVs as the ONLY answer to the problem of road traffic accidents. We need to get a lot better at educating drivers and pedestrians too, because, for now at least, we cannot totally rely on our autonomous cars to get it right all the time. And, frankly, if I have to be ready to take control at a moment's notice, with my hands on the wheel, then what's the point? I would prefer to be in control.
So why is no-one looking at the big picture? Why are governments around the world not considering ALL the causes of fatal and serious accidents and looking to address them, instead of handing responsibility to corporations like Tesla, Uber, Nvidia, Google, and others? Corporations with a vested interest in making a profit. I am willing to bet that some bean counter somewhere has already weighed the cost of settling a given number of lawsuits a year against the potential profits to be made, and decided 'it's worth it'.
No, it bloody well isn't!
My challenge to each and every one of you reading this is to resolve right now to get better every day. I have been driving for over 25 years and I am still learning. I am trying to be #SaferEveryDay.
Will you do the same?