- CREDIT: Autoblog

Apparently Self-Driving Cars Can Be Hacked by Stickers

Researchers confuse autonomous car with stickers on a stop sign

4y ago
97.5K

This information comes off a lot scarier than it actually is, especially since the article never mentions in detail which cars the researchers used for testing. Regardless, it goes to show that there are some pretty simple ways to fool an autonomous driving system's sensors.

A number of researchers from the University of Washington, University of Michigan, Stony Brook University, and UC Berkeley have figured out a way to hack self-driving cars by putting stickers in a variety of patterns on street signs.

I want to again point out that there is no mention of what car they used, so don't think that this test was done in a Tesla Model S or anything. Chances are it was a researcher-created "car" built for the purpose of this experiment, and actual cars on the road with autonomous technology have more sophisticated systems. If anything, this experiment is simply meant to raise a point.

In a research paper titled "Robust Physical-World Attacks on Machine Learning Models," researchers demonstrated four ways to disrupt an autonomous car's sensors using nothing more than a color printer and a camera.

A number of methods were used to disrupt the sensors, including putting up a full-sized printed overlay to cover the stop sign completely. This caused the sensor to classify the stop sign as a speed limit sign 100% of the time.

Another method involved putting up a number of stickers to spell out words like "love" and "hate," which caused the sensor to misread the stop sign as a speed limit sign two-thirds of the time, and once as a yield sign. A third method involved placing stickers in an "abstract art" pattern with just a few small, strategically placed stickers, which ended up having the same effect as the full poster cover-up.

In order for these "attacks" to work, potential hackers must know the algorithm the car's sensor system uses to recognize road signs. Again, many self-driving cars today have numerous sensors and couldn't be tricked so easily.
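The "must know the algorithm" point above is what researchers call a white-box attack: if you know the model's parameters, you can compute exactly which small input change flips its answer. A minimal sketch of the idea, using a toy linear classifier and a fast-gradient-sign-style perturbation (not the paper's actual sticker-optimization method; all weights and numbers here are made up for illustration):

```python
import numpy as np

# Toy white-box "sign classifier": a linear model over 64 flattened pixel
# values. The weights are hypothetical -- the point is only that an attacker
# who knows them can compute a small, targeted perturbation.
rng = np.random.default_rng(0)
w = rng.normal(size=64)  # known model weights (the white-box assumption)

def predict(x):
    """Return 1 ("stop sign") if the score is positive, else 0."""
    return int(x @ w > 0)

# An input image the model confidently classifies as a stop sign.
x = np.clip(0.05 * w + rng.normal(scale=0.01, size=64), -1.0, 1.0)

# Fast gradient sign method: the gradient of the score with respect to the
# input is just w, so stepping each pixel slightly against sign(w) lowers
# the score as fast as possible per unit of perturbation -- a crude stand-in
# for the researchers' optimized sticker patterns.
eps = 0.15
x_adv = x - eps * np.sign(w)

print(predict(x), predict(x_adv))  # → 1 0
```

The real attack in the paper is harder than this sketch because the perturbation must be confined to printable sticker regions and survive changes in distance, angle, and lighting, but the underlying principle is the same.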

Tarek El-Gaaly, senior research scientist at autonomous driving startup Voyage, says that there are a number of solutions for these hacks, including incorporating some sort of context system so a car knows not to go highway speeds in a residential area if a fake sign is deployed.

"In addition, many self-driving vehicles today are equipped with multiple sensors, so failsafes can be built in using multiple cameras and lidar sensors."

SOURCE:

Cornell University Library via Autoblog



Comments (30)

  • Puh-lease...here is a typical stop sign in Greece.

      4 years ago
  • NOT A HACK.. It confuses the sensors.. a Hack would indicate that it somehow took control of the vehicle outside of its parameters.. Calling it a Hack is just blatant fear-mongering

      4 years ago
  • It IS a hack. You have caused the algorithm to malfunction - i.e. it's working outside its parameters. (and yes, it IS blatant fear-mongering :-( )

        4 years ago
  • Thanks, but I would rather drive myself anyway.

      4 years ago
  • Self-driving cars will be just another obstacle for us to watch out for on the highways, along with drunk drivers, texting drivers, and rookie drivers. Except, the self-driving cars will be less predictable, making driving even more challenging and fun. (sic)

      4 years ago
  • It's likely you won't ever get into an accident with an autonomous vehicle. They are capable of predicting accidents before they happen and act accordingly. Remember - they are looking at a whole picture of the road (even things a human...

        4 years ago
    • Perhaps, eventually, the computer in a self-driving car will be able to predict the behavior of all the other drivers on the road, and pedestrians, and account for every change in weather and road surfaces and the locating of orange...

        4 years ago
  • One scary dilemma for self-driving cars is when the owner of the car is going to be in an unavoidable accident with a bunch of pedestrians or a school bus full of children. Will the car sacrifice the needs of the many for the needs of the few or the one? Will the car decide the guy who bought and paid for it should die in order to save more people? I hope I'm never in a situation where I'm about to run into a bunch of people but if I am, I hope my car isn't the one to decide whether I live or die. 🖖🏼

      4 years ago