Tesla comes clean about its fatal crash; Uber just cowers
A self-driving Uber ran over and killed a pedestrian in Tempe, Arizona. A Tesla in Autopilot mode crashed into a highway safety barrier, killing its driver. Tesla released a statement about its crash that delivers specifics and clarity; Uber said very little. It’s moments like this that clarify why we tend to trust Tesla, but not Uber.
Tesla’s statement clarifies what happened in detail
As soon as the fiery Tesla crash happened, Tesla released a preliminary statement. Then, one week later, it followed up with a more detailed statement on its website. Unlike most corporate statements, this one is full of facts and detail. Here it is, with my commentary added:
An Update on Last Week’s Accident
The Tesla Team, March 30, 2018
Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.
Commentary: Get the apology over with first.
The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.
In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.
Commentary: These details (the warnings, the following distance, and the amount of time the driver had to avoid the crash) are intended to show that the driver was too dependent on the autopilot.
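Incidentally, the statement’s own figures imply the car was traveling at roughly highway speed. A quick calculation, using only the numbers Tesla provides:

```python
# Implied speed from the figures in Tesla's statement:
# about 150 meters of unobstructed view over about five seconds.
distance_m, time_s = 150, 5
speed_ms = distance_m / time_s
print(f"{speed_ms} m/s = {speed_ms * 3.6:.0f} km/h = {speed_ms * 2.237:.0f} mph")
# -> 30.0 m/s = 108 km/h = 67 mph
```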
The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.
Commentary: U.S. highways are not padded with safety features — there are plenty of things to crash into. While it is true that the already crushed safety barrier wasn’t capable of absorbing another impact, I don’t think the real problem here is that the car chose the wrong thing to crash into.
Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.
In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.
Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.
Commentary: These statistics are encouraging, but their use in this statement is defensive. It’s not clear from this whether Tesla considers the way the autopilot performed here to be appropriate, or this level of fatalities to be acceptable.
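The headline ratio, at least, is straightforward arithmetic on the figures Tesla quotes. A quick sanity check, using only the numbers from the statement:

```python
# Sanity-check the "3.7 times less likely" claim using only
# the figures quoted in Tesla's statement.
miles_per_fatality_us = 86e6      # one US fatality per 86 million miles
miles_per_fatality_tesla = 320e6  # one fatality per 320 million miles

ratio = miles_per_fatality_tesla / miles_per_fatality_us
print(f"{ratio:.1f}x fewer fatalities per mile")
# -> 3.7x fewer fatalities per mile
```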
No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.
Commentary: It’s clear that Tesla included this passage to protect itself from lawsuits or regulation of the autopilot. People fall prey to the fallacy that accidents they control are more acceptable than accidents they don’t; Tesla is fighting that fallacy.
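The “about 900,000 lives saved” figure also appears to follow from Tesla’s own numbers, assuming the company simply applied its US per-mile fatality ratio to the worldwide death toll (an assumption on my part; the statement doesn’t show its math):

```python
# Rough check of "about 900,000 lives saved per year," assuming
# Tesla applied its US per-mile fatality ratio to the worldwide
# death toll; the statement doesn't show its math.
worldwide_deaths_per_year = 1.25e6
tesla_ratio = 320 / 86  # miles per fatality: Tesla vs. US average

lives_saved = worldwide_deaths_per_year * (1 - 1 / tesla_ratio)
print(f"Implied lives saved per year: {lives_saved:,.0f}")
# -> about 914,000, consistent with "about 900,000"
```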
In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
Commentary: The statement ends on an emotional note, to show that Tesla, a company full of engineers, is not heartless.
As corporate statements go, especially those about situations that end in death, this is well above average. Consider the difficulty of Tesla’s situation here. It must be sympathetic, but cannot make any statements indicating that it is taking responsibility for the crash. It has a responsibility to describe what happened appropriately, which is why the details are important. It also needs to share the facts about the safety situation with autopilot, which it has done. The statement comes off as defensive, but because of the specifics, it can actually contribute to the discussion of the safety issues here.
According to the Washington Post, the US National Transportation Safety Board, which is investigating the crash, is “unhappy” with Tesla for releasing this information. I understand the need for a full investigation, but I also think Tesla has the right to release information that’s pertinent at this moment when people are questioning the company.
When a self-driving Uber killed a pedestrian, Uber said almost nothing
Uber was running tests of its self-driving cars in the Phoenix area. In a nighttime crash in Tempe, Arizona, a pedestrian started walking her bike across the road; the Uber ran her over and killed her. A “safety driver” sits at the wheel of these self-driving cars, ready to take over in case of a problem, but in this case, the safety driver was looking down until just before the crash.
Like Tesla, Uber could defend its position. It’s not clear that a human driver could have avoided running over a person who walks directly into the car’s path on a dark road. But Uber has no statement on its website at all. All we have are a few brief tweets from the communications team and the CEO.
Some incredibly sad news out of Arizona. We’re thinking of the victim’s family as we work with local law enforcement to understand what happened. https://t.co/cwTCVJjEuz
— dara khosrowshahi (@dkhos) March 19, 2018
Our hearts go out to the victim’s family. We’re fully cooperating with @TempePolice and local authorities as they investigate this incident.
— Uber Comms (@Uber_Comms) March 19, 2018
Uber suspended its self-driving tests in the wake of the accident. But what actually happened? We can only guess. We don’t know why the driver was looking down. We don’t know why the car didn’t detect the pedestrian. We don’t know whether it was functioning as designed or not. Uber has released little information except this perfunctory apology.
According to the New York Times, Uber’s self-driving car project had problems well before this crash.
Silence is a bad strategy when you have a reputation for problems
Both companies were dealing with plenty of bad news leading up to this moment. Tesla was addressing challenges with production, reports that it was running out of cash, and a recall. And Uber’s list of failures is long and troubling.
Tesla released a statement describing and explaining what happened. Uber did not.
This is why Tesla continues to maintain its positive reputation while Uber, despite having a new CEO, continues to maintain its reputation for deceit.
What Uber tweeted: “Some incredibly sad news out of Arizona. We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.”
What Uber meant: “Oh, shit.”
Not sure I agree with some of your points on this one.
1) “Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.
Commentary: Get the apology over with first.”
That was expressing condolences, not an apology. I also express my condolences, but I’m not apologizing for the accident, which I had nothing to do with.
2) The safety numbers may include spin. Not all vehicles are equal in terms of crash rates, which are a function not just of the vehicle (including the presence of autopilot, and whether or not the vehicle is a motorcycle), but of the types of roads they are driven on and the demographics of the owners. Tesla is spinning the numbers to place itself in a good light.
3) “Tesla released a statement describing and explaining what happened. Uber did not.”
Tesla released a statement, yes, but I haven’t seen either company describe why their systems failed to react (the Uber vehicle by not braking; the Tesla vehicle not only by not braking, but apparently by steering INTO a fixed obstacle!). It may be premature for either company to have completed the analysis and have the answers, but I call that a wash, with credit to neither.
Coming from a background in computer and control systems, I see both crashes as illustrating a lack of feedback to correct, slow, or stop autonomous cars.
We know how to autonomously control hot-rolled steel and a few variables. Yet we are still looking at a porous intercept strategy for incoming ICBMs bound for the US West Coast from a rogue state like North Korea. Israel’s Iron Dome missile defense system combines three independent feedback tracking systems. Modern aircraft have three different systems for critical controls. It’s hard to imagine autonomous cars can be safe without a composition of at least three different feedback systems controlling the output. Both Tesla and Uber can stop apologizing and start leveling with people, expressly about untested edge cases.
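For illustration, here is a minimal sketch of the 2-out-of-3 voting idea; the sensor names and the braking decision are made up for the example, not any vendor’s actual architecture:

```python
# Toy 2-out-of-3 voter across independent obstacle detectors.
# Sensor names and the braking decision are illustrative only.
def should_brake(radar_hit: bool, lidar_hit: bool, camera_hit: bool) -> bool:
    """Brake when at least two of three independent channels agree."""
    return sum([radar_hit, lidar_hit, camera_hit]) >= 2

# One faulty channel (say, a camera blinded at night) cannot
# veto a braking decision the other two channels agree on.
assert should_brake(radar_hit=True, lidar_hit=True, camera_hit=False)
assert not should_brake(radar_hit=False, lidar_hit=False, camera_hit=True)
```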
As a PR problem, this is a particularly tricky one for Tesla.
Their position, right or not (and I am not taking a position on that here), is actually that the driver did himself in by not using their product correctly. But they cannot just come out and say that.
As to your first comment (“Get the apology over ..”): it should be pointed out that Tesla did not actually apologize. (If I am reading you correctly, you didn’t mean to say that they did.) In fact, they shouldn’t, since anything they do say can and will be used against them in a court of law.
Regardless, it is nice to see an analysis like this.
I’m starting to believe that people who buy Teslas just test the car’s limits to the max, and any car will spew flames in a crash this hard.