Autonomous Cars Are a Huge Problem for the Police

Autonomous cars might be programmed not to break traffic laws, but that doesn’t mean they won’t run afoul of the police — and a robotics expert says automakers should start looking soon at how their self-driving creations handle the cops.

On Saturday, Rodney Brooks, the co-founder of iRobot, dove into that dilemma in a blog post that outlined some serious concerns about how autonomous cars are going to work in the real world. Chief among them: what happens when your Tesla gets pulled over by the cops? Despite all the press attention around projects like the upcoming Tesla Autopilot update, Brooks is worried that people aren’t asking the most basic questions about how any of these systems will work in practice.

“I definitely see the edge cases as pushing back time for introduction,” Brooks tells Inverse. “I expect we will see smallish scale trials and lots of these issues will come up, and that will slow down wide spread allowance (by cities and other governing bodies).”

Brooks noted that it’s common for police officers to use a combination of sirens, hand gestures and maneuvers to communicate with drivers. While pulling over an autonomous car would likely be pretty easy — some device could beam a signal to bring the car to a stop — the subtleties of an interaction with police could get lost on a vehicle’s road-focused brain. And that could be dangerous to the car’s occupants, particularly if it’s been designed without a steering wheel or other manual controls, or if a human being isn’t even physically inside it.

A U.S. National Guardsman directs traffic after Hurricane Dennis.

Brooks outlined several other cases where the complexities of everyday life could bamboozle a self-driving car. In some situations, the police may need to ask drivers to steer in a non-intuitive way. For example, an officer may direct traffic onto the other side of the road to avoid road work or an accident. In those situations, will a car understand what is being communicated and act? Will it recognize who it should take orders from and who it shouldn’t?

No worries if the car hits these guys, they've got good health insurance.

The debate around self-driving cars often involves the trolley problem, a hypothetical scenario in which a computer or person is faced with a choice with no good outcome (on a crowded road, would a car choose to hit one pedestrian, or swerve and endanger more?), but Brooks says that smaller dilemmas, like what to do on a single-lane road with two-way traffic, could be far more important.

“Unlike the trolley problem, variants of these edge cases are very likely to arise, at least in my neighborhood,” Brooks said. “There will be many other edge case conundrums in the thousands, perhaps millions, of unique neighborhoods around the world.”

What's a self-driving car gonna do about this guy?

As such, it may take a while before automakers and legislators feel comfortable depending on autonomous technology to drive in unfamiliar circumstances. But either way, they need to start thinking about it soon: cops will be seeing driverless vehicles on their streets before long, at different levels of autonomy, in a myriad of situations.

“Fully driverless cars are a lot further off than many techies, much of the press, and even many auto executives seem to think,” Brooks writes. “They will get here and human driving will probably disappear in the lifetimes of many people reading this, but it is not going to all happen in the blink of an eye as many expect.”

Let’s hope drivers and cops alike are ready.