A fatal Uber crash has raised questions regarding liability in the unsettled area of law surrounding self-driving vehicles. Although there have been several other accidents involving Tesla cars in autopilot mode, this may be the first fatal pedestrian accident involving a self-driving vehicle. Some experts are pointing to an inadequate light detection and ranging (LIDAR) system that failed to detect the woman.
A 49-year-old woman was struck and killed by an Uber self-driving SUV in Tempe, Arizona, as she tried to cross the street with her bicycle at night. The Tempe Police Department released the dashboard video from the crash, which shows footage of both the outside and inside of the vehicle. The street was dark, and it appears that the woman became visible only after it was too late for the emergency driver to stop the vehicle. Footage from inside the SUV shows the driver looking down before the crash and then becoming startled when she notices the pedestrian.
Reports indicate that the self-driving system did not “see” the pedestrian either, as it too failed to brake and avoid the collision. An engineering professor at the University of Michigan who works on Ford Motor Co. autonomous vehicles believes the algorithm may have failed to distinguish the woman from vehicles or objects on the road. Because autonomous vehicles rely on radar, which functions optimally in dark conditions, the darkness was unlikely to have played a role in the crash. Rather, the accident is more likely attributable to classification software that did not recognize that a pedestrian was in the road, or to an inadequate number of LIDAR sensors that created a blind spot.
Self-driving technology is a fairly new development in the automotive industry. As such, the law pertaining to it is sparse and unclear, making it difficult to predict who will be held responsible. Municipal, state and national governments may now be faced with establishing clearer laws and policies regarding self-driving cars and liability.
Other countries are taking steps to establish such laws and policies. The government in Germany has rolled out ethics guidelines for autonomous cars, and the U.K. is conducting a three-year review of self-driving technology before allowing it on the roads. Part of the U.K. study involves an examination of cybersecurity and the potential for technological terrorism. According to a recent survey of 1,000 licensed drivers, 45 percent of U.S. drivers view protection against unauthorized access as a main concern regarding self-driving technology, second only to having a permanent option for drivers to assume control of the vehicle.
For now, Uber has halted testing of its self-driving vehicles, and other car companies such as Toyota have followed suit. The city of Boston has also requested that self-driving technology companies nuTonomy and Optimus Ride temporarily stop testing in the city. It is likely that lawsuits will be filed in the Uber pedestrian accident case, but it remains to be seen who would ultimately be held liable.
If you were injured in a car accident, contact an experienced Wilmington car accident lawyer at Rhoades & Morrow. We represent clients throughout Delaware from our offices in Wilmington, Bear and Milford. Call us at 302-427-9500 or contact us online for a free consultation.