Billions of people—and a growing number of autonomous vehicles—rely on mobile navigation services from Google, Uber, and others for real-time driving directions. A new proof-of-concept attack demonstrates how hackers could inconspicuously steer a targeted automobile to the wrong destination or, worse, endanger passengers by sending them the wrong way down a one-way road.
The attack starts with a $223 piece of hardware, planted in or underneath the targeted vehicle, that spoofs the radio signals used by civilian GPS services. It then uses algorithms to plot a fake “ghost route” that mimics the turn-by-turn navigation directions of the original route. Depending on the hackers’ ultimate motivations, the attack can be used to divert an emergency vehicle or a specific passenger to an unintended location or to send a driver down an unsafe route. The attack works best in urban areas the driver doesn’t know well, and it assumes the hackers have a general idea of the vehicle’s intended destination.
“Our study demonstrated the initial feasibility of manipulating the road navigation system through targeted GPS spoofing,” the researchers, from Virginia Tech, China’s University of Electronic Science and Technology, and Microsoft Research, wrote in an 18-page paper. “The threat becomes more realistic as car makers are adding autopilot features so that human drivers can be less involved (or completely disengaged).”
Previous spoofing attacks have shown how to fake the GPS coordinates of yachts, drones, and critical infrastructure equipment. The new paper goes a step further by using falsified locations to manipulate navigation instructions. The paper will be presented at next month’s 27th USENIX Security Symposium in Baltimore.
The GPS spoofing device is most effective when it’s planted inside or directly underneath a targeted vehicle, but it can also be operated from a drone flying above the target or from a tailgating automobile. The spoofer consists of the front end of an open-source software-defined radio called a HackRF One, a Raspberry Pi, a battery, and an antenna, and its frequency range covers the civilian GPS band. The Raspberry Pi runs a secure shell (SSH) server. The researchers built their spoofer for $223.
Haunted by ghost routes
The attack uses the spoofer to send false location data to the GPS service, which for this demonstration was an Android phone running the Google Maps app. The SSH server receives instructions from a remote attacker that cause the spoofer to falsify the location in a way that subtly changes the directions. Algorithms developed by the researchers help ensure the ghost route mimics the general shape of the original route—preventing, for instance, the false directions from telling the driver to make a turn that doesn’t exist. The malicious instructions can be issued in real time by the attackers or through a pre-written script.
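The shape-matching constraint at the heart of the ghost-route idea can be illustrated with a minimal sketch. This is not the researchers’ actual algorithm; the route representation (a list of compass headings per road segment) and the 30-degree turn threshold are illustrative assumptions. The point is only that a ghost route is usable when it produces the same turn-by-turn instruction sequence as the real route:

```python
def turn_sequence(headings):
    """Reduce a route, given as compass headings (degrees) per segment,
    to its turn-by-turn instructions: 'L' (left), 'R' (right), or 'S' (straight)."""
    turns = []
    for prev, cur in zip(headings, headings[1:]):
        # Signed heading change, normalized to [-180, 180)
        delta = (cur - prev + 180) % 360 - 180
        if delta > 30:        # 30-degree threshold is an assumption
            turns.append("R")
        elif delta < -30:
            turns.append("L")
        else:
            turns.append("S")
    return turns

def mimics(original_headings, ghost_headings):
    """A ghost route 'mimics' the original when both routes yield the
    same instruction sequence, so no spoofed direction ever references
    a turn that doesn't exist on the road actually being driven."""
    return turn_sequence(original_headings) == turn_sequence(ghost_headings)
```

For example, a ghost route whose headings drift by a few degrees still yields the same `S, R, S` instructions as the original, while a route with a left turn where the original has a right turn does not mimic it.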
The researchers tested a simulation of their attack using real-world taxi trips and found it works best in cities with dense road networks. Randomly selecting 600 real-world taxi trips in Manhattan and Boston, the researchers were able to divert 40 percent of them, on average, by 1,600 feet. They were also able to send 85 percent of New York taxis and 98 percent of Boston taxis into time-delaying loops.
The rate for successfully diverting a vehicle to a specific location (say, a police station instead of a bank robber’s getaway car) varied with the distance between the attacker’s chosen destination and the intended one. In Manhattan, the median hit rates were 70 percent, 47 percent, and 20 percent when the distances were 1,600 feet, 3,200 feet, and 6,500 feet, respectively. The researchers’ algorithm identified a potentially dangerous wrong-way victim route for 599 of the 600 taxi trips.
While the proof-of-concept attack is attention-grabbing, several factors significantly limit its effectiveness in the real world. First, the attack requires the physical spoofer to be in close proximity to the navigation device, and second, it works best when attackers have a general idea of the targeted vehicle’s intended destination. That means the attack isn’t likely to work at scale; rather, it’s probably practical only against a specific individual who is within close range.
There are other limitations: the attacks are far less successful in rural or suburban areas, or against drivers who are familiar with the area they’re traveling through. The takeaway: the vast majority of readers shouldn’t worry about falling prey to these types of exploits.
Still, with billions of people already using navigation services and the projected growth of autonomous automobiles, engineers and security professionals would do well to heed the findings. The researchers outline a variety of countermeasures. The most effective is to give civilian GPS signals the same type of encryption military GPS has used for decades. Unfortunately, that would do nothing to protect the massive number of GPS devices already in use. Another countermeasure is to deploy trusted ground infrastructure that helps GPS devices verify their location. This, too, is at best a long-term solution because of the cost and the constraints of government policy.
The researchers propose several other countermeasures. The one they say is the most promising, in terms of its effectiveness and cost, is “computer vision techniques to cross-examine the physical-world landmarks and street signs with digital maps.” Given the widespread use of cameras and LIDAR, this protection could be put in place with software-level upgrades, the researchers said.
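The cross-examination idea can be sketched in a few lines. This is a hypothetical illustration, not the researchers’ implementation: the sign-reading step (which in practice would come from a camera plus an OCR or sign-detection model) is stubbed out as a list of observed sign strings, and the single-match threshold is an assumption:

```python
def location_consistent(signs_seen, expected_street, min_matches=1):
    """Cross-examine physical-world street signs against the digital map:
    flag a possible GPS spoof if none of the signs observed by the camera
    mention the street the navigation system believes it is driving on.
    `signs_seen` is a list of sign texts read from camera imagery (stubbed
    here); `min_matches` is an illustrative agreement threshold."""
    matches = sum(
        1 for sign in signs_seen
        if expected_street.lower() in sign.lower()
    )
    return matches >= min_matches
```

For example, if the GPS fix claims the car is on Main St but every sign the camera reads names a different street, the check fails and the navigation fix is suspect. A real system would accumulate evidence over many frames and landmarks rather than trust a single observation.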