Google is being sued by the family of a North Carolina man who died after Google Maps reportedly directed him to drive off a collapsed bridge.
The family alleges Google was negligent because it failed to update its app to reflect the bridge’s disrepair.
Google, for its part, said it sympathizes with the family and is reviewing the lawsuit.
This case is one of a number of lawsuits seeking to define where liability falls when automated systems cause mishaps or accidents.
In the Google case, is the Silicon Valley tech giant ultimately responsible? The software developer? Whoever wrote the algorithm? No one?
Or is this a case of buyer beware? Are consumers liable if the systems they’re using cause trouble?
If your Tesla hits something while in Autopilot mode, is the carmaker liable? Whoever wrote the code? You for engaging the self-driving system?
Cases such as these will probably work their way through the courts for years as society comes to terms with the growing role of robots in everyday life.
The body of the North Carolina man, Philip Paxson, was found by state troopers in his overturned, partially submerged truck. According to the lawsuit, he drove off an unguarded edge of the bridge and crashed about 20 feet below.
“Our girls ask how and why their daddy died, and I’m at a loss for words they can understand because, as an adult, I still can’t understand how those responsible for the GPS directions and the bridge could have acted with so little regard for human life,” his wife, Alicia, told the Associated Press.
She would no doubt be equally at a loss if she had to explain that a machine was to blame — and nobody knows how to sue a machine.