Today, the National Transportation Safety Board will continue its investigation into a fatal crash involving a self-driving Uber vehicle that struck and killed a woman in Tempe, Arizona, over the weekend. An Uber test driver was behind the wheel at the time of the crash.

School of Information Studies associate professor Lee McKnight has no doubts that more driverless vehicles are headed to the road – but without the legal infrastructure and social norms in place to keep up with the technology, he says they are “accidents waiting to happen.”

McKnight says:

“The first autonomous shuttle bus brought into service in Las Vegas, in summer 2017, was in an accident within an hour. A delivery truck driver who did not ‘sense’ the presence of the shuttle as he backed up was blamed. Of course, a human shuttle bus driver would have heard the truck beeping and moved out of the way, instead of – like the supposedly smarter autonomous vehicle – stupidly just sitting there waiting to get hit. But yes, let’s blame the human truck driver, and not the human software programmers, or the hubris of the shuttle’s designers and operators in thinking their smart vehicle need not ‘sense’ when a vehicle is backing up toward it, or ‘hear’ the beeping noise humans know to take as a warning to be careful and get out of the way.

“It is too soon to say what exactly went fatally wrong in Arizona when an autonomous Uber (with a human inside) killed a pedestrian who was not exactly where humans should be on a highway. Still, no doubt autonomous vehicle enthusiasts will blame the pedestrian, and note that many pedestrians die in accidents across the country every day. While that is true, many more pedestrians walking not exactly where they should be are avoided by intelligent human drivers. Every day.

“In sunny Arizona, in Las Vegas, and soon in California, humans are the guinea pigs while autonomous vehicles learn – from their errors – to not kill us. In the future. Hopefully.

“More challenging terrain, like rainy Oregon or the snowy Rockies… well, for now, the owners of autonomous cars will not risk their vehicles there. Soon, however, on interstates and highways, we are supposed to trust autonomous trucks more than human drivers. No doubt the autonomous trucks will not need coffee to stay alert; but why should we believe their owners and programmers have taught the vehicles everything they need to know to avoid the careless things humans do? Of course, machine learning and artificial intelligence have great potential to improve vehicle safety.

“So no doubt autonomous vehicles are coming; but when neither legal norms nor social practices have evolved to deal with them, they are, by definition, accidents waiting to happen. And very large lawsuits waiting to be filed.”