
How a Fatal Tesla Crash Could Shape America's Autonomous Car Laws

What will keep your self-driving car from killing you in the future?


By the end of the year, the biggest scoop about self-driving cars won’t be that Uber is testing them in Pittsburgh, that a startup has already put them on the street in Singapore, or that Google’s steady progress shows no signs of stopping. Nor will it be Apple chief executive Tim Cook’s non-denial denials about the so-called Project Titan. Rather, it will be a story about the fatal crash that killed Joshua Brown.

Brown, an Ohioan, was on a road trip, driving through northern Florida on May 7. A tractor-trailer driver made a left turn in front of Brown’s Tesla Model S, while Autopilot, the car’s driver-assist technology, was engaged. Autopilot didn’t register the trailer as a threat, so the car traveled beneath the trailer before hitting two fences and a power pole. Brown, age 40, was killed.

Brown was something of a Tesla super-fan — the New York Times reports that he nicknamed his car “Tessy” and put 45,000 miles on it in a single year — who caught the attention of Tesla chief executive Elon Musk after posting a video of Autopilot successfully avoiding a highway collision just a month before his fatal wreck.

Brown’s crash led the Senate Committee on Commerce, Science, and Transportation to ask Musk how Tesla will learn from the crash, sparked an investigation by the National Highway Traffic Safety Administration, and may lead to delays in NHTSA’s guidelines for these cars.

“I think the Tesla incident clearly had an impact on the NHTSA’s plans,” Consumer Watchdog privacy project director John Simpson tells Inverse. “They are investigating that incident, as is the National Transportation Safety Board. There may have been some usual bureaucratic delays and misestimates, but I think that the Tesla incident and the focus it put on this issue was one of the reasons they delayed in July.”

In the past few weeks, Ford has revealed its plans to release a fleet of autonomous vehicles, Uber has announced plans for Pittsburgh, and nuTonomy’s self-driving taxis have hit the streets of Singapore. Regulators might be idling, but the companies developing autonomous vehicles are moving forward. In March, Google’s self-driving car lead told a U.S. Senate committee that foreign firms were hot on Google’s development heels.

“Not a day goes by that a company from China doesn’t try to recruit our team and poach our talent,” said Chris Urmson, who heads up Google’s Self-Driving Car Project. “We need to see the economic benefits and others in America first.”

This is the past, present, and future of regulating self-driving cars.

Back in 2013, NHTSA released preliminary guidelines for autonomous vehicles, and at the time, The Hill reported the agency was hesitant to endorse self-driving cars: “[T]he agency does not believe that self-driving vehicles are currently ready to be driven on public roads for purposes other than testing.” The report also notes that the NHTSA was at the time “encouraged by innovations in automated driving and their potential to transform our roadways.”

Those guidelines appear to have been removed from the internet when NHTSA updated them earlier this year. (That update is separate from the guidelines the agency was supposed to release in July.)

Now the agency says that the “rapid development” of self-driving tech means that cars with some level of autonomy — partial or full — are reaching a point at which the masses will be able to buy and drive them. This is a marked shift in tone over the course of just three years.

But here’s the problem: NHTSA refers to “partially and fully automated” vehicles. Those are very different technologies that have to be regulated as such. Something like Autopilot — which is supposed to see its eighth version very soon — doesn’t turn a vehicle into a self-driving car. It engages features, like automated braking and adaptive cruise control, that are designed to make life easier for human drivers. Drivers are still expected to remain alert and ready to take control of a moving car.

The Tesla Model S at the IFA trade show in 2014.

Getty Images / Sean Gallup

Autopilot-equipped Teslas are what’s called Level 2 automated vehicles on NHTSA’s scale. They’re more advanced than cars that merely include electronic stability control, for example, but fall short of Level 4 vehicles, which are designed to do everything a human driver can. Autopilot is semi-autonomous at best; the problem is that at least some of its users don’t understand that they aren’t supposed to trust the feature with their lives.

Bryant Walker Smith, a professor at the University of South Carolina School of Law, tells Inverse that each of those vehicle types has its own set of rules to follow.

“The way law applies to Tesla’s Autopilot is different from how law applies to Uber’s supervised automated driving, and it’s different from how law might apply to a truly driverless shuttle,” Smith says. “That matters because really the devil is in the details.”

Smith published a paper in 2012 called “Automated Vehicles Are Probably Legal in the United States.” In it, he explained that driving laws don’t prohibit self-driving cars, which means they are legal. (“In the U.S. we start with a presumption of legality,” he notes. “Things are legal unless they’re explicitly illegal, not the other way around.”)

This means companies like Ford, Uber, and Google can test their autonomous vehicles without breaking the law. The problem arises when it’s time for these vehicles to be deployed on a massive scale, especially when consumers don’t know the difference between the autonomous vehicles these companies intend to build and Tesla’s Autopilot.

In a survey of 1,832 U.S. drivers – conducted to learn how consumers feel about self-driving vehicles – AAA found that 75 percent would be afraid to ride in an automated vehicle. But 61 percent said they want semi-autonomous tools like automatic emergency braking in their next car.

So there’s a difference between semi-autonomous and fully autonomous vehicles, and consumers can sense the distinction. Yet features like Autopilot have been involved in multiple crashes because some drivers overestimate the system’s capabilities or misunderstand what the technology can actually do.

But when experts think about truly self-driving cars — Level 4 stuff — they question whether the technology is far enough along for regulators to form sound policy around it. Lawmakers could be putting the self-driving cart before the horse, as it were.

UC Berkeley transportation expert Steven Shladover, who also advises California’s Department of Motor Vehicles on self-driving vehicles, estimates that Level 4 tech needs another five to 10 years of work.

“Level 5 automated vehicles, which would be capable of driving under the full range of road, traffic and weather conditions in which people drive today, are much further in the future,” Shladover tells Inverse, adding it will likely be about 60 years until Level 5.

He says “massive scale usage” of those Level 5 self-driving cars may never become a reality, given the technical challenges.

How is the government supposed to regulate theoretical technologies? The NHTSA doesn’t merely have to solve problems like the fatal Autopilot crash – it also has to envision the future and create guidelines that address potential issues without stunting innovation.

It’s clear there are no easy answers with self-driving car laws. At this point, lawmakers are waiting for tech companies to give them some sense of what’s coming, which forces regulators to be reactive instead of proactive in this field.

“When companies showcase systems that are ready, that will change the system pretty dramatically,” Smith says. “But those systems just aren’t there yet.”

More crashes like Brown’s will happen. But the hope is that staying hands-off will allow companies to improve autonomous vehicles so they can help save lives. Smith offers an analogy: autonomous vehicles are like airbags, which were imperfect when introduced but now work reliably.

“We’re going to see the same dynamic in automated driving,” he says. “There are going to be some tragedies. The hope is that the number of people saved is going to outrank the number of people harmed.”
