One of Tesla's beta testers has died. Should the Autopilot test on open roads continue?


At the end of last week, almost two months after the fact and later than we probably ought to have learned about it, the first fatal crash of a Tesla on autopilot was widely reported.

The details of the crash were sad, but also grimly predictable: the deceased, Joshua Brown, 40, was a thrill-seeking Tesla fan who’d narrowly avoided a crash using autopilot the month before his death; he may also have been watching a movie in the car when he died. Those details finally came out because the National Highway Traffic Safety Administration (NHTSA) is investigating the role autopilot played in the crash.

Tesla said the car’s camera failed to distinguish the white tractor trailer Brown collided with from the brightly lit sky behind it. The company also said the truck driver was clearly at fault, describing the accident as “the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car.”

A day after news of the investigation and Brown’s death began circulating, another Tesla crashed, this time non-fatally. The car involved in the second crash was a Model X (an SUV, also outfitted with autopilot) driven by Albert Scaglione, a 77-year-old art dealer from Michigan. The Detroit Free Press reports that Scaglione and his son-in-law, artist Tim Yanke, were driving on the Pennsylvania Turnpike on Friday afternoon when the car hit a guard rail, crossed several lanes, hit a concrete median and rolled onto its roof. Both Scaglione and Yanke survived, as did the driver and passenger of a car struck by debris from the Model X. The Pennsylvania State Police officer who arrived at the scene said Scaglione claimed autopilot was on when the collision happened.

Tesla disputes this, telling Jalopnik that there is “no data to suggest that Autopilot was engaged at the time of the incident.” Scaglione and Yanke have been incommunicado since the crash (the Free Press wasn’t able to reach them, and my own calls and emails have also gone unanswered).

How does Tesla know whether autopilot was on or not? Here’s what it said in a statement to Jalopnik:

Anytime there is a significant accident, Tesla receives a crash detection alert. As is our practice with all collisions, we immediately reached out to the customer to make sure he was safe. Until the customer responds, we are unable to further investigate.

For the time being, we’re left with an automation whodunit: did Scaglione think autopilot was on when it wasn’t? Is Tesla’s data flawed? A spokesperson for the company told me that while they know the airbag deployed, they’re still missing other information. Tesla’s three attempts to call Scaglione have been unsuccessful. [Update: Scaglione has now declined to comment on the crash to the New York Times, telling the paper, “My attorneys will be releasing a statement shortly.” The NHTSA will also be investigating his crash.]

“[L]ogs containing detailed information on the state of the vehicle controls at the time of the collision were never received,” the spokesperson explained via email. “This is consistent with damage of the severity reported in the press, which can cause the antenna to fail.”

I asked Tesla how many crash detection alerts it’s received from the cars it currently has on the road, but the company declined to provide that information.
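To make the dispute concrete, here is a minimal sketch, in Python, of the kind of two-stage telemetry Tesla describes: a small crash detection alert that arrives immediately, and detailed control-state logs that may never arrive if the car’s antenna is destroyed. Everything here (the names, fields, and structure) is hypothetical and purely illustrative; Tesla’s actual system is proprietary and undocumented.

# Hypothetical illustration of two-stage crash telemetry. None of this reflects
# Tesla's real implementation; it only sketches the scenario described above.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CrashTelemetry:
    airbag_deployed: bool                # known from the immediate alert
    control_logs: Optional[dict] = None  # detailed state, e.g. whether autopilot was engaged
    antenna_ok: bool = True              # severe damage can sever the uplink


def build_crash_report(t: CrashTelemetry) -> dict:
    """Send the small alert regardless; attach detailed logs only if they made it out."""
    report = {"alert": "crash_detected", "airbag_deployed": t.airbag_deployed}
    if t.antenna_ok and t.control_logs is not None:
        report["control_logs"] = t.control_logs
    else:
        # The case at issue: the alert arrived, but the logs that would show
        # whether autopilot was on never did.
        report["control_logs"] = None
    return report


print(build_crash_report(CrashTelemetry(airbag_deployed=True, antenna_ok=False)))
# -> {'alert': 'crash_detected', 'airbag_deployed': True, 'control_logs': None}

In that scenario, the company would know an airbag deployed but would have no record of whether autopilot was engaged, which is exactly the gap Tesla’s spokesperson described.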

For now, we know of two Teslas that have crashed, with automation playing a role in one and possibly the other. The Wall Street Journal has reported on other autopilot accidents, and Tesla’s own forums for its drivers contain scary tales about autopilot.

In an April 1 thread titled “Anyone else having too many Autopilot close calls?” a poster with the username evlnte described two autopilot near-accidents that happened on the same day.

On one occasion, we were traveling 70 mph on a straight stretch of highway and started to approach a traffic jam that was absolutely stopped. We were waiting … waiting… The car was not stopping, not slowing down. All of a sudden the alarm sounds and autopilot disengages and have to slam the breaks to stop from crashing into the stopped cars. Next, we are behind a car and traveling about 55 mph. I turn on the left turn signal and a car passes from behind to the right and moves ahead. The car in the left lane was probably no more than 12 inches ahead and the Tesla stars to turn too soon into the left lane almost striking the car. Had to pull the wheel back to my current lane and disengage autopilot. This thing is freaking me out!

Many posters chimed in to disagree, saying they found the system helpful, though some echoed (albeit less feverishly) the original poster’s concerns. Two days later in the same thread, a poster going by brian told another close-call story:

I had one two days ago – one of those heavy traffic moving at a good speed then suddenly braking to a stop situations while I was using auto pilot.. It warned me and I put on the brakes however I am not sure if it had the time to auto brake, and was not willing to test it. I was side by side in stop and go traffic on Friday with a woman obviously using the autopilot, as was I, she was picking her nose (hey we all do it) , on the phone and had a coffee mug in the other hand.

In another thread from October 17, 2015, not long after autopilot was released to the public, a driver warned “BEWARE AUTOPILOT BETA ON TEXAS HIGHWAYS.” While the driver, who used the username cmacfarland, praised the software generally, they also offered a cautionary tale:

Last night we did about 100 mile round trip on Texas Highway 287 between Rhome and Decatur. On this highway with AutoSteer engaged there is real danger of the car following a line for a turn around or highway exit at 70 mp/h that will result in a crash if you do not correct immediately by hand steering.

There are both ethical and legal questions about whether Tesla should continue its autopilot beta test on open roads. Legally, Tesla has implied that it’s not responsible for the first fatality involving autopilot, emphasizing in a blog post about Brown’s death that autopilot is off by default and “requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.”

Tesla says drivers must check an acknowledgment box before activating Autopilot that explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. (Some legal experts who spoke to The Guardian suggested that, even with those warnings, Tesla might still face problems if sued.)

Autopilot’s Autosteer remains in beta, as Tesla emphasizes on its own site. Beta testing is late-stage testing, and usually involves external testers kicking the tires on a product. But those tires are usually metaphorical, and betas are carried out on products that are nothing like cars: software living on a computer somewhere comparatively safe. This beta, though, is happening on open roads, which means Tesla drivers aren’t the only participants. Every driver, passenger, bike rider, or pedestrian who shares a road with a Tesla on autopilot is being included in it. The drivers might check that acknowledgment box, but what about all the other people sharing the roads with Teslas? They checked no boxes.

Tesla argues that autopilot is statistically safer than manual driving, which is true, though that’s in large part because driving in general is stupendously dangerous. Self-driving cars may be safer than manually operated ones in the long run, though MIT Technology Review has reported that the numbers Tesla likes to trot out about the present state of the technology may be more dubious than they seem.

Lots of corporations are pouring money into self-driving cars (Apple, Google, Tesla, Mercedes, and Uber, to name a few), and many of them are doing tests on open roads. But they’re doing so less recklessly: When Google and Uber test self-driving cars, they use paid employees whose entire job is to focus and be safe.

When Tesla first deployed the autopilot technology last fall, it didn’t have to get permission from a regulatory body or government agency. As The Wall Street Journal points out, Brown’s death is the “first significant chance to flex regulatory muscle” that the NHTSA and other agencies have had to weigh in on Tesla’s beta test. The NHTSA finally plans to release new guidelines on self-driving cars this month, and the agency’s head, Mark Rosekind, said in June that we need more data to ensure the safety of autonomous and semi-autonomous vehicles.

Tesla’s autopilot-outfitted models aren’t intended to be totally autonomous, but the feature is still being put into the hands of people who may abuse it and use it for purposes other than the short-range highway driving it was meant for. Oh, did I write may? I meant to say people did, the moment autopilot was available, and were congratulated for it by Elon Musk.

Tesla may be right that it’s not legally liable for autopilot crashes, but it’s still putting software that’s in development on the road, in the hands of people whose only qualification as testers is that they can afford a very high-priced car. If the technology is indeed dangerous, it endangers not just Tesla drivers, but their passengers and anyone else sharing the road.

Knowing when a technology like Autosteer is safe for widespread use is tricky. Tesla thinks it already is, but the company profiting from selling cars with the feature shouldn’t be the sole body calling the shots, and it shouldn’t remain one if safety is really Tesla’s primary concern.

Ethan Chiel is a reporter for Fusion, writing mostly about the internet and technology. You can (and should) email him at [email protected]
