Researchers Find a Malicious Way to Confuse Self-Driving Cars
An autonomous car (also known as a driverless car, self-driving car, or robotic car) is a vehicle that is capable of sensing its environment and navigating without human input. Many such vehicles are being developed, but as of May 2017 the automated cars permitted on public roads are not yet fully autonomous. They all require a human driver at the wheel who is ready at a moment's notice to take control of the vehicle.
Autonomous cars use a variety of techniques to detect their surroundings, such as radar, laser light, GPS, odometry, and computer vision. Advanced control systems interpret this sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage. These control systems can analyze sensory data to distinguish between different cars on the road, which is very useful in planning a path to the desired destination.
While automakers focus on defending the systems in their cars against hackers, there may be other ways for malicious actors to mess with self-driving cars. Car and Driver notes that security researchers at the University of Washington found they could easily trick autonomous vehicles' image recognition systems by defacing street signs. After the researchers added the words 'LOVE' and 'HATE' to a stop sign, the computer vision algorithm no longer recognized it as a stop sign and instead believed it to be a speed limit notice.
What's worrying is that some signs needed only slight modifications to throw off the cars' algorithms. The researchers created a Right Turn sign, for example, that at first glance looks very similar to the real thing, but whose subtle alterations make it appear to autonomous vehicles as a Speed Limit 45 sign. Worse still, the stickers could be made by anyone with a home printer.
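The principle behind these sticker attacks can be sketched with a toy linear classifier: a change that is tiny per pixel can still push an input across the model's decision boundary. Everything below (the weights, the `classify` and `perturb` functions, the epsilon value) is a hypothetical stand-in to illustrate the idea, not the actual vision model the researchers attacked.

```python
def classify(pixels, weights, bias):
    """Toy classifier: 'stop' if the weighted sum is positive, else 'speed_limit'."""
    score = sum(p * w for p, w in zip(pixels, weights)) + bias
    return "stop" if score > 0 else "speed_limit"

def perturb(pixels, weights, epsilon):
    """Nudge each pixel by at most epsilon, against the direction of its weight,
    so the overall score drops while each individual change stays small."""
    return [p - epsilon * (1 if w > 0 else -1) for p, w in zip(pixels, weights)]

weights = [0.9, -0.4, 0.7, 0.2]   # hypothetical learned weights
bias = -0.5
sign = [0.8, 0.3, 0.6, 0.5]       # original image, correctly read as a stop sign

print(classify(sign, weights, bias))                       # stop
stickered = perturb(sign, weights, epsilon=0.3)
print(classify(stickered, weights, bias))                  # speed_limit
```

No pixel moved by more than 0.3, yet the label flipped; the real attacks work the same way, except the perturbation is optimized to look like innocuous graffiti or stickers.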
The researchers did suggest solutions to the problem. Using contextual information is one: a car could be programmed to question why a stop sign appears on a highway, or a high speed-limit sign on a back road. Additionally, readings from other sensors, such as radar and GPS, could be cross-checked against the camera when determining whether a sign is genuine.
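The contextual-information idea could look something like the following sketch: a lookup of which signs are plausible for the current road type, used to flag suspicious detections. The road types, sign labels, and rule table here are assumptions for illustration, not part of the study.

```python
# Hypothetical table of which sign types make sense on which roads.
PLAUSIBLE_SIGNS = {
    "highway":   {"speed_limit_65", "exit", "merge"},
    "back_road": {"stop", "yield", "speed_limit_30"},
}

def sign_is_plausible(detected_sign, road_type):
    """Return True if the detected sign is expected on this road type;
    a False result flags the detection for cross-checking with other sensors."""
    return detected_sign in PLAUSIBLE_SIGNS.get(road_type, set())

print(sign_is_plausible("stop", "highway"))    # False: suspicious, question it
print(sign_is_plausible("stop", "back_road"))  # True: consistent with context
```

A production system would combine a check like this with map data and other sensor readings rather than rely on a fixed table, but the plausibility gate is the core of the suggestion.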
In late 2015, a research fellow at the University of Cork, Ireland, discovered a way of tricking self-driving cars into seeing phantom objects by using a homemade electronics kit costing less than $60.
Estimates say it'll be another ten years or more before self-driving cars are a common sight on our roads, which should give manufacturers enough time to iron out the bugs. But the vehicles are unlikely ever to become popular in India; the country's transport minister recently said he would protect human jobs by banning the technology.