Uber’s self-driving vehicle programs have been suspended in Arizona and other states after one of its autonomous SUVs struck and killed a pedestrian in Tempe March 18.
According to the Tempe Police Department, 49-year-old Elaine Herzberg was killed when a self-driving Uber Volvo XC90 SUV struck her as she walked a bicycle across the street. The crash – the first pedestrian fatality involving a self-driving vehicle – remains under investigation. Dashcam video shows the safety driver behind the wheel of the self-driving Uber car looking down in the moments before impact. Outward-facing video shows Ms. Herzberg being hit as she crosses the road. The accident happened at night.
Arizona Bars Uber’s Self-Driving Vehicles
Although Uber voluntarily suspended its self-driving vehicle programs in Arizona and other states after the collision, Arizona Governor Doug Ducey sent Uber CEO Dara Khosrowshahi a letter telling him that Uber’s self-driving vehicle programs would no longer be allowed in the state.
Gov. Ducey called the dashcam video “disturbing and alarming” and said the accident was an “unquestionable failure to comply” with Arizona’s expectations of prioritizing public safety when testing self-driving vehicles.
The Governor eagerly opened Arizona as a testing ground to Uber and other self-driving vehicle developers while other states moved more cautiously in crafting regulatory and legal frameworks for driverless vehicles. The deadly Tempe accident is a major setback for Uber, which in addition to its self-driving passenger vehicle program had been operating a fleet of self-driving commercial trucks statewide.
The impact of the Tempe accident on Uber Freight, Uber’s self-driving commercial truck program, is uncertain, but the program could unravel if the company isn’t allowed to resume operating in Arizona or doesn’t find another home to test and develop its robotrucks.
Uber’s self-driving trucks had been making commercial deliveries in Arizona in the months preceding the Tempe accident. The deliveries were the first under Uber’s rollout of Uber Freight – an on-demand trucking app that relies on self-driving trucks for long hauls and conventional tractor-trailers for shorter hauls.
With Uber Freight, the company has been redesigning the way much of the commercial trucking industry will operate in the future, using Arizona as its testing arena.
In a demonstration video, Uber shows how it has been establishing a hub-based network where commercial trucks meet and trailers are swapped, cutting driving distances and time for truck drivers.
The Uber video shows a conventional truck driver from Los Angeles driving to an Uber self-driving hub in Topock, Arizona, on the state’s western border. There, a long-haul Uber self-driving truck arrives from the Midwest, and the trailers are swapped. The short-haul conventional tractor-trailer driver returns to Los Angeles with the delivery from the Midwest, while the Uber self-driving truck takes the West Coast cargo to a destination on the East Coast.
Deliveries are arranged and payments are made in the Uber Freight app in much the same way as individual rides are booked and paid for in its ride-hailing app.
Uber Aims to Partner People and Tech
Uber’s self-driving truck system will rely on human drivers and conventional tractor-trailers to complete the short legs of the trip, such as from Los Angeles to Topock, while self-driving trucks complete the longer distances to another hub, where another short-haul driver completes the delivery.
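The leg-splitting described above can be sketched in a few lines of code. This is purely an illustration of the hub model, not Uber's actual routing system; the function name, hub names, and data layout are all hypothetical.

```python
# Illustrative sketch (not Uber's actual system): a hub-based delivery is
# split into three legs -- human drivers on the short ends, an autonomous
# truck on the long middle leg. All names here are hypothetical.

def plan_legs(origin, origin_hub, dest_hub, destination):
    """Split one delivery into short-haul / long-haul / short-haul legs."""
    return [
        {"from": origin, "to": origin_hub, "mode": "human short-haul"},
        {"from": origin_hub, "to": dest_hub, "mode": "autonomous long-haul"},
        {"from": dest_hub, "to": destination, "mode": "human short-haul"},
    ]

# Example: an LA driver hands off at Topock; the autonomous truck runs the
# Topock-to-Sanders corridor; a local driver finishes the delivery.
legs = plan_legs("Los Angeles", "Topock, AZ", "Sanders, AZ", "Phoenix")
for leg in legs:
    print(f'{leg["mode"]}: {leg["from"]} -> {leg["to"]}')
```

The point of the structure is that no human driver ever takes the long middle leg: each human shift stays short and local, while the autonomous truck shuttles between hubs.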
Like all companies involved in the race to develop self-driving technologies, Uber’s freight system is a work in progress. Before the Tempe accident, the company was using its self-driving trucks to run between Topock and Sanders, Arizona, near the New Mexico border – a haul of about 300 miles. Its ultimate goal, however, is to have a network of hubs spanning longer distances across multiple states.
As in its autonomous car program, Uber’s self-driving trucks have a commercially licensed truck driver behind the wheel who can override the truck’s autopilot if necessary. The company also predicts that its model will generate more work for tractor-trailer drivers making the short hauls.
The company also seeks to assuage fears among commercial truck drivers that self-driving trucks may make their jobs obsolete in the coming years:
We envision a future where truck drivers and self-driving trucks work together to move freight around the country. Self-driving trucks will manage long haul driving on some interstate highways, but having two hands on the wheel will still be the best way to get a load to its final destination. Truck drivers possess the critical skills that self-driving trucks may never match — like backing into a tight dock, navigating a busy industrial yard, or moving axles on a trailer.
But What About Safety?
Beyond concerns about employment for commercial truck drivers, there are valid questions about the safety of self-driving trucks on the road at this stage of development – questions the deadly Tempe accident sadly underscores.
Many critics say Arizona and other states are moving too quickly without sufficient regard for public safety. Rosemary Shahan, founder of Consumers for Auto Reliability and Safety, told the Wall Street Journal that Arizona and other states with relaxed rules for self-driving vehicles are “abandoning their responsibility to ensure that the (autonomous vehicles) are safe prior to allowing them to be deployed.”
Because self-driving trucks are computer automated, how susceptible are they to hacking? Cybersecurity for automated trucks is currently far less mature than collision-avoidance and navigation technologies, and the threats are more complex.
According to the MIT Technology Review, self-driving vehicles “will have to anticipate and defend against a full spectrum of malicious attackers wielding both traditional cyberattacks and a new generation of attacks based on so-called adversarial machine learning.”
“As consensus grows that autonomous vehicles are just a few years away from being deployed in cities as robotic taxis, and on highways to ease the mind-numbing boredom of long-haul trucking, this risk of attack has been largely missing from the breathless coverage,” the MIT Technology Review explains.
There are also concerns about the human-automation overlap in vehicles that run mostly on autopilot. As MIT notes, long-haul trucking can be mind-numbingly boring, but what technologies are in place to ensure that human drivers keep their hands on the wheel and their attention on the road when the truck is doing all the work? Is it possible truck drivers will grow so accustomed to their truck’s self-driving capabilities that they let their guard down and do other things?
Safety concerns also surround the practice of “platooning” trucks – a form of automated or driver-assisted technology that uses radar and vehicle-to-vehicle (V2V) communications to virtually connect a group of trucks.
Platooning relies on V2V communications to virtually connect a caravan of trucks in a single lane on the highway, maintaining a closer-than-typical following distance between the trucks. But how will other motorists, trying to pass the platoon or merge between its trucks, distinguish them from conventional trucks? Is platooning safe for truck drivers and other motorists when self-driving technology is still officially in the testing stage?
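The gap-keeping idea behind platooning can be sketched with a toy controller. This is not a real V2V protocol – the target gap, controller gain, and function are made-up illustrative values – but it shows the basic logic: each follower adjusts its speed toward the leader's broadcast speed while closing toward a fixed spacing.

```python
# Illustrative sketch (not a real V2V protocol): a follower truck sets its
# speed from the leader's broadcast speed plus a proportional correction
# on the spacing error. Target gap and gain are hypothetical values.

TARGET_GAP_M = 15.0   # desired spacing between trucks, in meters (assumed)
GAIN = 0.5            # proportional gain on the spacing error (assumed)

def follower_speed(leader_speed_mps, gap_m):
    """Speed command (m/s) for a follower, given the leader's broadcast
    speed and the measured gap to the truck ahead."""
    return leader_speed_mps + GAIN * (gap_m - TARGET_GAP_M)

# A follower 20 m behind a leader doing 25 m/s speeds up slightly to
# close the gap; at exactly 15 m it simply matches the leader's speed.
print(follower_speed(25.0, 20.0))  # 27.5
print(follower_speed(25.0, 15.0))  # 25.0
```

Even in this toy form, the safety questions in the text are visible: the controller depends entirely on the leader's broadcast being accurate and timely, which is exactly what a spoofed or jammed V2V link would undermine.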
Safety concerns are also sometimes tied to ethical choices – whether autonomous vehicles will ever be capable of making an ethical distinction between, say, swerving into another vehicle or striking a pedestrian in the crosswalk, or between hitting a cat or a dog, and so on. In Uber’s case, debating the ability of artificial intelligence to make sudden ethical distinctions may be premature, considering the company still has work to do on detecting and avoiding pedestrians and other obstacles in the first place.
The New York Times reported that Uber’s self-driving vehicle tests were struggling even before the crash, falling far below industry standards for miles traveled before a safety driver has to intervene and take over from the vehicle’s autopilot system.
So, while self-driving technologies have come a long way in the past 10 years, major concerns and questions persist about putting these vehicles on the road while they are still in development. Rushing a new product to market frequently yields poor results and often sacrifices consumer safety.
Sources: The New York Times; MIT Technology Review