Fatalities Involving Self-Driving Cars Raise Concerns About the Technology, Tesla, and Uber
Technology is meant to make our lives better. When it comes to automobiles, computerized systems are expected to reduce the number of accidents and fatalities. In fact, automotive technology has already saved untold lives. But sometimes things go wrong.
I was saddened to hear about two recent fatal motor vehicle crashes involving self-driving vehicle technology. This technology will hopefully save many lives, but it will not mean the end of defective products. As I have said many times, the manufacturers could put lawyers like me out of business. I hope they do.
They just need to choose safety over profit.
On March 23rd, ABC News issued a traffic alert via Twitter: “Officials say a #Tesla driving in the southbound lanes collided with the barrier at the 101/85 interchange in #MountainView [California] and caught fire.” The Tesla, a vehicle featuring the company’s latest Autopilot technology, caught fire when its battery broke in half. The driver suffered fatal injuries; firefighters eventually extinguished the blaze.
Meanwhile in Tempe, Arizona, only a few days earlier, a pedestrian was killed by a self-driving Uber vehicle. According to the New York Times, it was the first pedestrian fatality involving self-driving technology.
Tesla Ablaze: Driver Had Previously Complained About Autopilot
Walter Huang was described by relatives as a “straight up, caring guy.” The 38-year-old Apple engineer had no idea, on a fine morning last March, when he got into his Tesla to drive to work, that it would be his last day on Earth.
Shortly after his 2017 Tesla collided with a median barrier, it caught fire and was struck by two other cars. Huang was pulled from the wreck, but he died at Stanford Hospital of his injuries. The drivers of the other cars were unharmed.
Huang’s family has provided investigators with evidence that he had complained multiple times about the vehicle’s Autopilot feature, taking the car back to the dealership on more than one occasion and reporting that Autopilot veered toward the very barrier his car struck on the day of the accident.
Huang is survived by his wife and two children.
Last January, a Business Insider article discussed Tesla’s problems producing its pricey car batteries. Around the same time, a CNBC story reported that Tesla employees had complained about quality-control checks being carried out by inexperienced workers.
At the time, employees involved in manufacturing Tesla vehicles told reporters that batteries were “leaving the factory with a potentially serious defect,” a claim Tesla swiftly and categorically denied.
Tesla published a blog post stating that “Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015. … There are over 200 successful Autopilot trips per day on this exact stretch of road.” This reminds me of a defense in the Ford/Firestone litigation: the failure rate per tire was incredibly low.
And I have seen other manufacturers argue in so many cases that the product has a low failure rate. So what? Engineering principles require a failure mode and effects analysis, and any failure mode that can result in death or serious injury must be designed out of the product when it is possible and reasonable to do so. End of story.
Tesla has chosen to sell this technology, and the reason Tesla is selling this technology is to make money. They will make lots of money, no doubt. But if they choose to make even more money by violating engineering principles so that people die, then I intend to hold them accountable.
A Deadly Self-Driving Uber
Elaine Herzberg was fatally struck by a Volvo XC90 outfitted with a high-tech sensor system as she walked her bike across a street in Tempe. The car, operated by Uber, was under the control of a computerized driving system. A “human safety driver” was present, but there were no passengers. Video recorded by the vehicle itself shows that the safety driver, Rafaela Vasquez, was not watching the road at the time of the accident.
The car’s lidar (light detection and ranging) sensors, furnished by Uber, were designed to detect pedestrians in the vehicle’s path, but they failed to sense Herzberg’s presence. Following the crash, Uber suspended its self-driving vehicle program in several locations in the U.S. and Canada.
For Consumer Watchdog’s John M. Simpson, self-driving cars pose serious risks: “The robot cars cannot accurately predict human behavior, and the real problem comes in the interaction between humans and the robot vehicles.”
It is heartbreaking that people had to die before regulators really focused on protecting the American public from what is still experimental technology on U.S. roads.
For endurance driver and car expert Alex Roy of TheDrive.com, video of the accident shows that Uber is to blame. Roy, who has lectured on related issues for the FBI, CIA, MIT, and Stanford, drew the following conclusions after watching the footage:
- Uber’s hardware or software failed.
- Many people at Uber need to be fired.
- The Arizona officials who greenlit testing need to resign.
- One or more people need to be prosecuted.
- The SAE Automation Classification System is vague and unsafe.
- Volvo, one of the few carmakers that truly cares about safety, is innocent and shouldn’t be in bed with its craven opposites.
I sue car companies on behalf of grieving families.
I am sure the car companies do not view me kindly; we are frequent opponents. I am proud of the work I have done in holding them accountable, but I would be remiss if I did not also say that I have seen many brilliant, kind-hearted people working at these companies. The engineers want to build a beautiful, safe product. Profit demands of management can sometimes get in the way.
But some car companies do seem to consistently put safety above profit.
Volvo and Mercedes in particular have pioneered incredible advances in automotive safety, and have even given that technology away for free to other companies. That’s what it means to put safety over profits.
Here is my plea to Uber and Tesla: let Volvo and Mercedes be your model. Innovate this technology the right way. Put me out of business! I will be happy to never again be called upon to help a grieving family.
John Uustal is a serious-injury lawyer and the author of several books on legal topics, including the upcoming Corporate Serial Killers. He leads a 22-lawyer law firm with a national practice based in Florida.