The National Transportation Safety Board announced Thursday that it has revoked Tesla’s status as a party to its investigation of a fatal Model X crash in Mountain View, California, last month.
Being a party to an investigation allows a company to participate fully in the process, sharing information with the agency and viewing information uncovered by the NTSB while the investigation is still ongoing.
But parties must agree to respect the confidentiality of the process while it’s underway, and the agency says that Tesla has broken that agreement with recent comments about the Mountain View crash. In a statement this week to Silicon Valley television station ABC 7, for example, Tesla argued that the crash occurred because driver Walter Huang “was not paying attention to the road.”
“Releases of incomplete information often lead to speculation and incorrect assumptions about the probable cause of a crash,” the agency said in its statement.
Tesla was unapologetic, however, insisting that it had withdrawn from the agreement unilaterally.
“It’s been clear in our conversations with the NTSB that they’re more concerned with press headlines than actually promoting safety,” a Tesla spokesperson wrote. “They repeatedly released partial bits of incomplete information to the media in violation of their own rules, at the same time that they were trying to prevent us from telling all the facts.”
Tesla and the NTSB have had an increasingly frosty relationship in recent weeks, as Tesla has released information to the media over the NTSB’s objections. In an April 6 phone call, NTSB chairman Robert Sumwalt warned Elon Musk that talking to the media could lead to a loss of party status.
NTSB chairman suggests Autopilot design still needs improvement
Sumwalt’s statement included what appears to be a pointed shot at Tesla’s Autopilot: “We continue to encourage Tesla to take actions on the safety recommendations issued as a result of our investigation of the 2016 Williston, Florida, crash.”
That’s a reference to the death of Joshua Brown, a Tesla customer who died when his Model S slammed into a truck while Autopilot was engaged. In its investigation of that crash, the NTSB found that the user interface for Tesla’s Autopilot system had played a role in Brown’s death.
Tesla cars require drivers to keep their hands on the steering wheel and warn them if they don’t. But the NTSB argued in its 2017 report that this wasn’t good enough.
“Monitoring steering wheel torque provides a poor surrogate means of determining the automated vehicle driver’s degree of engagement with the driving task,” the agency wrote.
The NTSB also encouraged Tesla and other automakers to “limit the use of automated vehicle control systems to those conditions for which they were designed.”
Tesla did make some changes to Autopilot in the wake of Brown’s death. For example, Autopilot will now bring the vehicle to a stop if the driver ignores Autopilot warnings three times in an hour. When I visited Tesla’s headquarters for a Model S test drive last year, a company spokeswoman made a point of walking me through the various safety warnings a driver sees both before and during Autopilot use.
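Tesla hasn’t published the details of this lockout behavior, but the policy as reported—three ignored warnings within an hour triggers a controlled stop—amounts to a rolling-window strike counter. Here’s a minimal sketch of that idea; the class and method names are hypothetical, not Tesla’s code:

```python
from dataclasses import dataclass, field

# Illustrative sketch only -- not Tesla's actual implementation.
# Models the reported policy: three ignored Autopilot warnings
# within a rolling one-hour window trigger a controlled stop.

WINDOW_SECONDS = 3600
MAX_IGNORED_WARNINGS = 3

@dataclass
class AutopilotLockout:
    # Timestamps (in seconds) of warnings the driver ignored.
    ignored_at: list = field(default_factory=list)

    def record_ignored_warning(self, now: float) -> str:
        """Return the action to take after the driver ignores a warning."""
        # Discard warnings that have aged out of the one-hour window.
        self.ignored_at = [t for t in self.ignored_at
                           if now - t < WINDOW_SECONDS]
        self.ignored_at.append(now)
        if len(self.ignored_at) >= MAX_IGNORED_WARNINGS:
            return "bring_vehicle_to_stop"
        return "continue_with_warning"
```

Because the window rolls, two ignored warnings early in a drive don’t count against the driver an hour later—only three strikes in the same 60-minute span trigger the stop.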
But Sumwalt evidently believes Tesla could be doing more.
One possible model here is the Cadillac Super Cruise. As our own Jonathan Gitlin wrote in February, the technology takes two safety precautions not found in most other driver-assistance products on the market.
First, Super Cruise uses eye tracking, rather than steering wheel torque, as the main method of detecting driver engagement. This is a better way to gauge whether a driver is actually paying attention, and it’s harder for drivers to circumvent.
Tesla’s Model 3 has a driver-facing camera that could potentially be used for this kind of driver monitoring, but we’ve seen no sign that the company is planning to do this.
Second, GM has gathered high-definition map data for 160,000 miles of highway across the country, and Super Cruise can only be engaged on those roadways. This greatly reduces the chances of Super Cruise being activated on a road with an unusual configuration that could confuse the technology and cause a crash. Tesla’s technology doesn’t make use of high-definition maps, allowing it to operate on a much wider range of roadways—but potentially making the system more error-prone.
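In effect, GM’s approach is a geofence: hands-free operation is a whitelist lookup against pre-mapped road segments, gated on the driver-attention check described above. GM hasn’t published Super Cruise’s internals, so this is only a sketch of the concept, with made-up segment identifiers:

```python
# Illustrative sketch only -- GM has not published Super Cruise's
# actual logic. Models the "engage only on pre-mapped highways" rule
# as a whitelist lookup keyed by hypothetical road-segment IDs.

MAPPED_SEGMENTS = {
    "I-80:seg-1042",
    "I-80:seg-1043",
    "US-101:seg-88",
}

def can_engage(current_segment: str, driver_attentive: bool) -> bool:
    """Permit hands-free driving only on a mapped segment with an
    attentive driver (per the eye-tracking check)."""
    return driver_attentive and current_segment in MAPPED_SEGMENTS
```

The trade-off the article describes falls out directly: a system without the whitelist check can engage anywhere, while this design refuses roads it hasn’t mapped, trading coverage for a lower chance of encountering an unexpected road configuration.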