
Ordinary stickers on road signs can disable a car's autopilot

Stickers on a Stop sign caused the system to misidentify it as a speed limit sign

While automakers are working to protect their cars from hackers, researchers from the University of Washington have demonstrated that a car's computer vision system can be made to misidentify traffic signs using nothing more than stickers printed on a home printer.

Computer security expert Professor Yoshi Kohno described an algorithmic "attack" on an autonomous driving system that uses printed images affixed to road signs.


For example, a small sticker on a Stop sign caused the system to incorrectly identify it as a 45 mph (72.4 km/h) speed limit sign.

The "vision" systems of autonomous vehicles typically use an object detector, which picks out pedestrians, traffic lights, signs, other vehicles and so on, and a classifier, which decides what each object is and what the signs say. Hackers who gain access to the classifier can alter its algorithm or training images and change how an object, in particular a road sign, is classified.
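To make that pipeline concrete, here is a minimal sketch, not taken from the study, of how a detector stage and a classifier stage are typically chained. The detect_objects and classify_sign callables stand in for whatever models a real vehicle would use and are assumptions for illustration only.

# A minimal two-stage perception pipeline: a detector proposes regions,
# a classifier assigns a label to each region that looks like a sign.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Detection:
    box: Tuple[int, int, int, int]   # (x, y, width, height) in pixels
    kind: str                        # e.g. "sign", "pedestrian", "vehicle"

def perceive(frame,
             detect_objects: Callable[[object], List[Detection]],
             classify_sign: Callable[[object], str]) -> List[Tuple[Detection, str]]:
    # Run the detector, then classify every region it marks as a sign.
    # The attack described in this article targets the second stage: the crop
    # the classifier sees has been subtly altered by a printed sticker.
    results = []
    for det in detect_objects(frame):
        if det.kind != "sign":
            continue
        x, y, w, h = det.box
        crop = frame[y:y + h, x:x + w]   # assumes a NumPy-style image array
        results.append((det, classify_sign(crop)))
    return results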

Researchers have long known about this threat, but previous attacks required changes that were either too extreme, and therefore obvious to a human driver, or too subtle, working only from a certain angle or distance. As it turns out, there is no need to break into the system at all: it is enough to put graffiti or stickers on a sign to make it be recognized as a different one.

The attacks developed by Kohno and colleagues from other universities are based on printing stickers on an ordinary color printer and attaching them to existing road signs. In one case, they covered a right-turn sign with a full-size copy whose arrow looked merely spotty or faded to the human eye. The computer vision system identified it as a 45 mph speed limit sign.

In the second case, the researchers stuck rectangular black-and-white patches on the sign in a seemingly random arrangement spelling the words "Love" and "Hate". These also forced the computer to see a 45 mph speed limit sign. The tests were carried out from various distances and viewing angles, and the computer was fooled in 73.3% of cases.
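The 73.3% figure is simply the share of test viewpoints in which the classifier returned the attacker's target label. A hedged sketch of that bookkeeping, with a hypothetical classify_sign function and a set of photos taken at different distances and angles, could look like this:

# Count how often a stickered sign is classified as the attacker's target label
# across many viewpoints (distances and angles).
from typing import Callable, Iterable

def fooling_rate(photos: Iterable,
                 classify_sign: Callable[[object], str],
                 target_label: str = "Speed Limit 45") -> float:
    # Fraction of viewpoints for which the attack succeeded.
    photos = list(photos)
    if not photos:
        return 0.0
    hits = sum(1 for img in photos if classify_sign(img) == target_label)
    return hits / len(photos)

# Example: 11 successes out of 15 photographed viewpoints gives roughly 73.3%.
# rate = fooling_rate(stop_sign_photos, my_classifier)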

The danger of such actions, which the researchers call "attacks", is obvious: many experimental vehicles, and some production cars, are already equipped with sign recognition systems. If a future autonomous car can be fooled this way, it might cross an intersection without yielding to traffic on the main road or, conversely, brake hard in the fast lane.

As the experiment shows, the developers of autonomous driving technology still have many problems to solve. They will have to combine various defenses to protect not only against hackers but, as we can see, against ordinary vandals as well.

"Many of these attacks can be overcome using contextual information from maps and the environment," says Tarek El-Ghali, a senior researcher at Voyage, a start-up developing autonomous driving systems. "For example, a 65 mph speed limit sign makes no more sense in a city than a Stop sign does on a highway. In addition, many autonomous vehicles today are equipped with multiple sensors, so fault tolerance can be built in using several cameras and lidar."
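A defense of the kind described in the quote can be sketched as a plausibility filter: the classifier's output is cross-checked against map context and against the readings of other sensors. The rules and the majority-vote threshold below are illustrative assumptions, not part of any production system.

# Reject sign readings that make no sense for the current road type, and only
# accept a label when a majority of independent sensor readings agree.
from typing import List, Optional

IMPLAUSIBLE = {
    ("city_street", "Speed Limit 65"),   # a 65 mph limit inside a city
    ("highway", "Stop"),                 # a Stop sign on a highway
}

def plausible(road_type: str, sign_label: str) -> bool:
    # True if the recognized sign is consistent with the map context.
    return (road_type, sign_label) not in IMPLAUSIBLE

def fuse_readings(readings: List[str]) -> Optional[str]:
    # Naive fusion: return the majority label across cameras and other sensors,
    # or None if no label has a strict majority.
    if not readings:
        return None
    best = max(set(readings), key=readings.count)
    return best if readings.count(best) > len(readings) / 2 else None

# Example: a lone camera reporting "Speed Limit 45" on a street where the map
# expects a Stop sign could be flagged for the planner to treat cautiously.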

