Road Signs That Crash Cars: What You Need To Know

Self-driving cars are the future, but is it a good one? Security researchers are urging the industry to pump the brakes on production plans until better safety testing is done, after demonstrating an attack algorithm that makes cars misread road signs. Check it out!

You may have heard about the dangerous stunt pulled by security researchers when they hacked a 2015 Jeep Cherokee through Chrysler's UConnect system, locking out the driver's controls and taking over the vehicle remotely while it was on the freeway. Thankfully, the findings were disclosed to Chrysler, which has presumably put a much more robust security system behind its infotainment platform since.

Maybe you’ve seen an Uber self-driving vehicle on the road in Pittsburgh or San Francisco. Cars are getting smarter, that’s for sure. So are hackers: recently, security researchers tricked self-driving cars into misreading simple road signs. The consequences could be disastrous, and there are plenty of bugs to fix before drivers can take their eyes off the road.

In autonomous cars, vision systems consist of sensors that detect certain objects, like pedestrians, lights, other vehicles, and, crucially, road signs. Once an object is detected, it is classified. The system can tell a person from a motorcycle, and a stop sign from a speed limit sign. Or can it? While car manufacturers have been busy ramping up security in their internet-connected car computer systems, they’ve overlooked a whole other attack surface: the signs themselves.
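To make the detect-then-classify idea concrete, here is a minimal, purely illustrative Python sketch. The templates and feature vectors are invented for this example and bear no relation to any real vehicle's vision system:

```python
# Minimal, purely illustrative sketch of a detect-then-classify pipeline.
# The templates and feature vectors below are invented for this example.

SIGN_TEMPLATES = {
    "stop":           [1.0, 0.1, 0.0],
    "speed_limit_45": [0.2, 1.0, 0.1],
    "right_turn":     [0.0, 0.2, 1.0],
}

def classify(features):
    """Return the sign label whose template best matches the features."""
    def score(label):
        return sum(f * t for f, t in zip(features, SIGN_TEMPLATES[label]))
    return max(SIGN_TEMPLATES, key=score)

# A clean stop-sign reading lands on the right label...
print(classify([0.9, 0.2, 0.1]))  # -> stop
# ...but the classifier trusts whatever features it is handed, which is
# exactly what the sticker attacks exploit.
```

A real system uses deep neural networks rather than hand-written templates, but the weak point is the same: the classifier answers based on pixel-level features, not on what a human would say the sign "obviously" is.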

Yoshi Kohno, a cybersecurity researcher at the University of Washington, recently spoke with Car and Driver and described an algorithm that generates simple sticker patterns which, when applied to ordinary signs, confuse a car's vision sensors. Here are a few examples:

“Using an attack disguised as graffiti, researchers were able to get computer vision systems to misclassify stop signs at a 73.3 percent rate, causing them to be interpreted as Speed Limit 45 signs.” – Car and Driver

In another example, researchers printed a true-size image closely resembling the original Stop sign and overlaid it on top of the existing sign. The subtle differences caused it to be read as a Speed Limit 45 sign.
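The core idea behind these attacks can be sketched in a few lines of toy Python: a small, localized perturbation added to an input nudges it across the classifier's decision boundary. All numbers here are invented for illustration and are not from the researchers' work:

```python
# Toy illustration (all numbers invented) of the sticker attack's core
# idea: a small, localized perturbation pushes an input across the
# classifier's decision boundary, flipping its label.

STOP = [1.0, 0.1]          # toy template for a Stop sign
SPEED_45 = [0.6, 0.9]      # toy template for a Speed Limit 45 sign

def classify(x):
    """Pick whichever template scores higher against the input."""
    score_stop = sum(a * b for a, b in zip(x, STOP))
    score_45 = sum(a * b for a, b in zip(x, SPEED_45))
    return "stop" if score_stop > score_45 else "speed_limit_45"

clean = [0.9, 0.3]
print(classify(clean))          # -> stop

sticker = [0.0, 0.5]            # small change confined to one feature,
                                # like graffiti-style stickers on a sign
perturbed = [c + s for c, s in zip(clean, sticker)]
print(classify(perturbed))      # -> speed_limit_45
```

The attack algorithm's real contribution is finding perturbations like `sticker` automatically, and making them robust to viewing angle, distance, and lighting, while keeping them inconspicuous enough to pass as graffiti.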

Here is what the researchers say about sign misclassification:

“We [think] that given the similar appearance of warning signs, small perturbations are sufficient to confuse the classifier,” wrote Kohno and his colleagues. “In future work, we plan to explore this hypothesis with targeted classification attacks on other warning signs.”

“Attacks like this are definitely a cause for concern in the self-driving-vehicle community,” said Tarek El-Gaaly, senior research scientist at Voyage, an autonomous-vehicle startup. “Their impact on autonomous driving systems has yet to be ascertained, but over time and with advancements in technology, they could become easier to replicate and adapt for malicious use.”

Previous attempts to alter signs or manipulate a car's computer have met with mixed success, partly because of the human factor. When a self-driving car still has an attentive human driver, a misread sign is likely to be caught and corrected. But as the technology matures and confidence in it rises, that human attention may fade. Remember all the classic Hollywood self-driving cars, from I, Robot to Knight Rider. KITT, on the one hand, is the ideal future: a mix of Siri, a Tesla, and a tank. The Audi in I, Robot, on the other, is hopefully not a foreshadowing of what's to come. It might be, unless the technology can really be brought up to speed.