New Self-Driving Car Tech Could Make U.S. Roads More Dangerous
“I would say, don't do it — because now you've just opened Pandora's box.”
Many new cars offer features like adaptive cruise control, which prevents your vehicle from slamming into the one in front of you, and automatic steering to keep you in your lane. But in the coming years, carmakers claim that fully self-driving whips could take over while we sit back and play video games, binge Netflix, or soak up the scenery.
For now, most systems offer what’s called Level 2 autonomy under the classification scheme outlined by the Society of Automotive Engineers. The scheme ranks autonomous driving technologies from Level 0 (basic safety features like lane departure warning) up to Level 5 (a fully self-driving car).
In the coming months, the most advanced type of self-driving car on the market will hit U.S. roads. Last month, Mercedes-Benz announced it will become the first automaker to deploy Level 3 in the States: the company will offer Nevada customers its Drive Pilot system in the second half of 2023. Later this year, it aims to expand Level 3 to California.
To roll out autonomous vehicle software in the U.S., companies don’t have to get federal approval. And while states have a dizzying array of varying regulations, Nevada’s government didn’t need to give Mercedes legal permission — officials simply confirmed that the company’s safety features meet state standards.
“For the most part, this is for public relations and often involves asserting demonstrably false claims,” William H. Widen, a law professor at the University of Miami who researches autonomous vehicle regulation, tells Inverse. “Moreover, AV companies actively lobby to create toothless state regulation and pre-empt local governments from regulation needed for safety given local conditions.”
So what exactly is Level 3?
It marks the first stage when people officially hand over control to the car. But it also demands that the human retake the reins when the system requests it — which may happen above 40 miles per hour, as is the case with Mercedes, or in situations where the system doesn’t know how to proceed.
A handful of companies, including Mercedes and Honda, currently sell Level 3-compatible vehicles, but they’re only available outside of the U.S. More carmakers, like Hyundai and Kia, may also roll out similar software in U.S. cars over the next few years.
Despite the trend toward widely available Level 3 driving, the control “handoff” between humans and machines makes some auto safety experts worried. Some even suggest the industry should skip this level and refrain from giving vehicles more autonomy until the technology allows cars to operate fully on their own.
To learn more about the controversy around the Level 3 rollout, we spoke with Missy Cummings, an engineer at George Mason University who served as one of the U.S. Navy’s first female fighter pilots and as a senior safety advisor to the National Highway Traffic Safety Administration.
This interview has been edited and condensed for clarity.
What does Level 3 autonomy actually mean?
MC: These Society of Automotive Engineers levels have been extremely confusing for everyone. But it's where automation takes over some part of the task, and you're not expected to intervene until the car tells you that you need to take over.
With the Mercedes, my concern is similar to what we see in Tesla’s full-self driving system. If the car is in Level 3 mode, and then something happens … is that human in a place both physically and mentally to be able to take control?
“A lot of bad things can happen in 30 seconds.”
When I was at NHTSA, I saw it wasn't just a Tesla problem — there are all sorts of problems with all manufacturers. The autonomy system in a car sometimes just gives up when it doesn't recognize what to do in a particular situation.
To be realistic, you need to give somebody about a 30-second heads up that they're going to need to be able to take over. A lot of bad things can happen in 30 seconds.
Mercedes’ Level 3 features will work only under 40 miles per hour. Are slower speeds harder for autonomous systems, since they may entail traffic lights and pedestrians?
MC: It's kind of a trade-off. Interstates have a lot of structure. You can only get on and get off in certain places. You don't have pedestrians and bicyclists.
But you've got high speed, which means that bad things can happen in very short time periods. In the urban and suburban environments, you've got slower speed, but then you've got all these other possibilities.
I would hesitate to say which one is harder or easier. They're both really hard.
Do you think drivers will actually find Level 3 features useful? What happens when a car gets through a highway traffic jam and suddenly needs to speed up?
MC: I think Level 3 is kind of a nightmare scenario. If Mercedes had asked me for a recommendation prior to doing this, I would say, don't do it — because you've just opened Pandora's box. You're going to need to have very good autonomy.
How are they making sure that you're going to take over when you're supposed to? What happens if you don't?
“That's super dangerous.”
If you're in the far left lane of a freeway and you don't take over, can the car edge itself through four lanes over to the shoulder?
What we don't want is for the Mercedes to perform an abrupt braking maneuver if you don't take over. Is it going to slow down in its lane, and then traffic comes to a grinding halt because the car is stopped there? That's super dangerous.
Let’s say you’re allowed to look away from the road or stare out the window while the car is driving. Then the car gives you an urgent warning to take over, but you can’t resume control fast enough and get into an accident. Whose fault is it?
MC: Well, I can be quite sure I'm going to be called to be an expert witness in that case. I'm being a little tongue-in-cheek, but I can promise you that will end up in court.
If you are told that you can text while the car drives, people are going to take very full advantage of that. When they're being told that they could be hands-free, people relax.
One of the things that I saw [at NHTSA] is that people start stretching, or they're tucking their feet under — they're changing their body position significantly enough that when something bad happens and they go for the steering wheel and the brake, they will often hit the accelerator instead of the brake.
Could people feel too confident with these features because they haven’t seen Level 3 fail?
MC: You go through this period when you love the car, you want it to do everything that was advertised, and you're kind of lulled into a false sense of security. You should have my job: After a year of reading nothing but crash reports every morning, I actually take the bus now a lot more.
“You give them an inch, they take a mile.”
Humans are human; drivers are not pilots. They're not formally trained. You give them an inch, they take a mile.
Some people say, “Oh, people know that what Tesla calls full self-driving isn't really full self-driving.” I can promise you, I've read enough crash reports to suggest that people want to believe that is true full self-driving.
Some experts have argued that we should skip Level 3 entirely. From a public safety point of view, what should we be doing?
MC: I don't think Level 3 should be a product outside of very narrow applications, where hiccups in the technology are not going to cause massive traffic jams or crashes. I'm glad I don't live in a state where Mercedes is looking to bring this technology.
That brings me to another problem: There’s actually no federal law that prevents Mercedes from deploying this right now. So if Mercedes were to sell those cars and have them drive in Nevada, there's no federal law that they're breaking. I don't think there's any state law they're breaking either, so I'm not sure why Mercedes is insisting that it get regulatory approval.
Do you think states or cities might try to push back against the Level 3 technologies that are surely on the way? You’re saying they can’t?
MC: I think most states have some kind of imminent hazard authority, meaning that they do have the ability to post hoc control these technologies. But I don't know why Mercedes is insisting that they have Nevada's approval, which they don’t need.
“People are still not that crazy about cars driving themselves.”
The thing is, states don't regulate vehicle technologies. That's NHTSA’s job. There is this gray area where, if you're calling the autonomous system the “driver,” states get to certify the drivers. I think that's a murky area. If there were actually lawsuits that pushed the point, then that question would have to go to the federal level. That would be a federal responsibility because the driver in this case is technology, and NHTSA certifies technology.
Are you more or less optimistic about what's happening with autonomy than you were a few years ago?
MC: I don't know if I would say I'm less optimistic about the state of the art. It is exactly where I thought we would be. I am more pessimistic about companies that seem to be coming up with products that are desperately trying to grab market share, as opposed to putting together products that people actually want.
Survey after survey has said people like safety features like automated braking and a lane departure warning system. But in general, people are still not that crazy about cars driving themselves.