The widow of a man killed by a Tesla Model X operating on Autopilot is suing the carmaker in federal court in San Jose, California, alleging its “half-baked” self-driving technology caused the fatal collision.
Yoshihiro Umeda, 44, was the first pedestrian killed by a Tesla driving on Autopilot, a driver-assistance feature that the manufacturer says is intended as a safety backup for an attentive human driver.
The accident occurred near Tokyo on April 29, 2018. The Tesla “suddenly accelerated” toward highway speed after the vehicle in front of it changed lanes.
According to Newsweek, the complaint alleges the Tesla was driving with its Traffic Aware Cruise Control (TACC) feature engaged, “rapidly accelerating from about 15 km/h to approximately 38 km/h” before slamming into the stationary motorcycles and Mr. Umeda.
His widow, Tomomi Umeda, said the automaker will likely blame the crash on the driver of the Model X, who was reportedly “drowsy” at the time. She claims the crash could have been prevented had Tesla used better available technology, such as eye tracking, which would have detected the driver’s drowsiness and alerted him. Instead, Tesla’s driver monitoring system relies on steering wheel input to determine whether the driver is alert.
“Mr. Umeda’s tragic death would have been avoided but for the substantial defects in Tesla’s Autopilot system and suite of technologies,” court documents read. The alleged defects include the failure of Tesla’s Autopilot to alert the drowsy driver, who kept his hands on the steering wheel as he fell asleep, so the steering-based monitor never flagged him as inattentive.
The complaint also acknowledges that Tesla’s goal of improving auto and traffic safety is commendable, but says the company is using the world’s highways as test tracks and information-gathering sites to feed its AI technology.
“Tesla’s decision to release a half-baked product to the public that is currently still in a ‘beta-testing’ stage of development continues to put the general public, other motorists, and all of those who share the road with Tesla’s vehicles, including pedestrians and the drivers of Tesla’s vehicles themselves, at risk of becoming the next casualty,” Ms. Umeda alleges, according to Law360.
In the Umedas’ case, the lawsuit alleges, Tesla’s work-in-progress Autopilot technology recognized neither the stationary vehicles in its path nor the pedestrian. It argues that there will always be situations Tesla’s AI is not prepared to navigate, putting motorists and pedestrians at risk.
Ms. Umeda and her daughter, Miyu Umeda, accuse Tesla of defective design, failure to warn, negligence, and wrongful death. They seek compensatory and punitive damages.
While most automakers now offer driver-assistance features, fewer have entered the autonomous vehicle market, and many are only beginning to dip a toe into the technology. It remains to be seen whether autonomous vehicles, which take even more control out of the hands of individual drivers, will help or hinder safety as they become more common on the roads. Manufacturers of vehicles dubbed “autonomous” that are currently available to the public – Tesla among the best known – still caution drivers to take an active role in their operation.
Chris Glover, Managing Attorney for Beasley Allen’s Atlanta office, has handled many cases involving defective auto products that failed after being placed on the market and caused catastrophic injury and death. He also handles truck accidents and other litigation involving commercial vehicles, and is keeping a close eye on how autonomous vehicles are being tested for long-haul trucking.
Other source: Newsweek