Amid the hype about self-driving cars, an unsettling reality should be keeping the industry up at night. It is the kind of fact that reaffirms the old cliché: ignorance is bliss.
The National Highway Traffic Safety Administration has documented 11 new fatal crashes involving vehicles with partially automated driver-assistance systems, most in Teslas.
As the industry pushes AVs onto US roads with the promise of safe, low-cost travel, it must first prove that the technology works. And while it’s hard to predict exactly how often such vehicles will crash, one thing is certain: pedestrian deaths will make front-page news when they happen.
Pedestrians are especially vulnerable in AV crashes because the car often doesn’t detect them until the last second, when it is already too close to avoid a collision.
The 2018 death of Elaine Herzberg in Tempe, Arizona, shows how that can happen. An Uber test vehicle with a safety driver on board failed to recognize Herzberg as she crossed the road in front of its Volvo SUV. Its software reportedly shifted between classifying her as a vehicle and other object types, and the automated system didn’t leave time for an evasive maneuver before she was struck. Rafaela Vasquez, the car’s human safety driver, was later charged with negligent homicide.
Cyclists face a similar risk: although self-driving cars are designed to reduce crashes, they are still expected to hit cyclists in some situations. AV manufacturers must teach their systems to recognize and communicate with cyclists to mitigate that risk. Bhuiyan points to research from Argo AI that demonstrates how to train a car to understand cyclists’ hand signals and gestures.
Similarly, Google spinoff Waymo has developed technology that allows its cars to recognize bicycles and communicate with riders in various ways, including LED displays and audible cues. The firm plans to share this work with other AV manufacturers to help them adopt the technology.
Waymo isn’t the only AV company whose vehicles have encountered cyclists. The fatal Uber ATG crash in Tempe also involved a bicycle: Herzberg was pushing one across the road, which reportedly contributed to the software’s confusion about what she was and how she would move, and the system didn’t react fast enough to avoid her. The crash highlighted the need to develop better collision detection methods for AVs.
The good news is that most accidents involving robot cars are caused by the humans driving other vehicles. Those drivers can be distracted, impeded by weather or road conditions, or slow to recognize hazards. If robots can avoid these mistakes, the number of crashes should decline significantly.
Motorcyclists can also be harmed by drivers who misjudge their speed or fail to see them when turning or changing lanes. Robot cars that can “see” motorcyclists should be able to avoid these problems, though it may take some time for the technology to develop.
One of the latest cases involved a Tesla that struck a motorcyclist from behind, ejecting him from his bike and killing him. The National Highway Traffic Safety Administration is investigating the incident to determine whether the vehicle’s Autopilot feature was enabled during the crash. This type of real-world accident can shake public confidence in autonomous cars. Companies rushing to market these technologies must take the time to ensure that their systems are ready for the roads before they introduce them.
AVs are still on the road in limited numbers and are involved in only a small percentage of vehicle accidents. When they do crash, however, the causes often involve human error. Specifically, drivers treat a full self-driving mode or a partial autopilot feature as if it could operate the vehicle unsupervised, which leads to crashes that an attentive driver could have prevented.
For example, in March 2018, a Tesla operating on Autopilot crashed into a highway lane divider after the driver kept his hands off the wheel despite visual and audible warnings. The driver was killed in the crash.
Other AV accidents are caused by conditions the systems couldn’t anticipate or predict, such as tunnels that interfere with GPS signals and construction projects that force lane changes. In these cases, it can be difficult to determine whether the AV was engaged, what the human driver was doing during the accident, and how the system could have avoided the crash.