On Monday, a fatal accident involving a pedestrian and an Uber self-driving car occurred in Arizona, in what is believed to be the first pedestrian death caused by an autonomous vehicle. The crash occurred at an intersection as a 49-year-old woman walked her bike across the street. According to police, the self-driving car was traveling faster than the speed limit in the lead-up to the crash and never attempted to stop as it approached the crosswalk.
In response to the fatal accident, Uber pulled all of its self-driving cars off the road. Given the tragic circumstances of the crash, that was the least the company could do. The question going forward is: how will liability be determined?
This has always been one of the leading criticisms of self-driving cars: when one gets into an accident, how do you determine liability? Without a driver at fault, does liability fall on the auto manufacturer? What about the programmer or engineer who designed the systems used on the vehicle? Depending on the answers to these questions, how would shared liability work? And what responsibility do ride-sharing apps and other companies that utilize self-driving technology bear?
These questions will only be asked more often, and more loudly, in light of this crash. It will be interesting to see what the road looks like once self-driving cars become more widely available.
Source: CNN, "Uber self-driving car kills pedestrian in first fatal autonomous crash," Matt McFarland, March 19, 2018