In June, Regulus, an Israeli security firm, stated that it had successfully implemented a hack that forced a Tesla Model 3 off the road. Comprehensive details about the hack can be read in their blog post, and the news was picked up by both Swedish and international media. In summary, Regulus had preset a route for the vehicle, instructing it to drive to a nearby city via a route that included a highway exit. The researchers then sent false satellite coordinates to the vehicle, making it believe that the exit was closer than it actually was. The car slowed down and turned into an emergency parking area instead of taking the intended exit. As the driver was occupied with other tasks, control over the vehicle was regained too late. The researchers also state that they managed to affect the vehicle's behavior so that it accelerated, changed lanes, and braked hard.
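To illustrate the principle behind such an attack (this is a toy sketch, not Regulus's actual method or Tesla's navigation logic; all names and thresholds are hypothetical), the navigation system can only act on the position the GPS receiver reports. If spoofed coordinates place the car further along the route, the exit appears closer than it is and the exit maneuver starts prematurely:

```python
# Toy illustration of GPS spoofing (hypothetical names and thresholds;
# not any vendor's actual implementation).

EXIT_TRIGGER_M = 300  # navigation begins the exit maneuver within this distance


def distance_to_exit(reported_position_m, exit_position_m):
    """Distance along the route from the *reported* position to the exit."""
    return exit_position_m - reported_position_m


def should_take_exit(reported_position_m, exit_position_m):
    # Navigation has no independent ground truth; it trusts the GPS fix.
    return distance_to_exit(reported_position_m, exit_position_m) <= EXIT_TRIGGER_M


true_position = 1000   # metres along the route
exit_position = 2000   # the exit is really 1000 m away

# With genuine GPS the exit is still far off, so no maneuver yet.
print(should_take_exit(true_position, exit_position))      # False

# Spoofed coordinates place the car 900 m further along the route,
# so the exit appears only 100 m away and the maneuver starts early.
spoofed_position = true_position + 900
print(should_take_exit(spoofed_position, exit_position))   # True
```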


Tesla themselves provided a response:

“Any product or service that uses the public GPS broadcast system can be affected by GPS spoofing, which is why this kind of attack is considered a federal crime. Even though this research doesn’t demonstrate any Tesla-specific vulnerabilities, that hasn’t stopped us from taking steps to introduce safeguards in the future which we believe will make our products more secure against these kinds of attacks.

The effect of GPS spoofing on Tesla cars is minimal and does not pose a safety risk, given that it would at most slightly raise or lower the vehicle’s air suspension system, which is not unsafe to do during regular driving or potentially route a driver to an incorrect location during manual driving. 

While these researchers did not test the effects of GPS spoofing when Autopilot or Navigate on Autopilot was in use, we know that drivers using those features must still be responsible for the car at all times and can easily override Autopilot and Navigate on Autopilot at any time by using the steering wheel or brakes, and should always be prepared to do so.”


Other researchers who examined how the hack was conducted soon pointed out that it was not as serious as Regulus had implied. In particular, they noted that while GPS spoofing can make vehicles take routes other than the intended ones, the onboard sensors will prevent accidents. In the words of Jim Salter:

“But this attack is like handing Mom or Dad the wrong map on a family vacation: sure, you might get lost, but the wrong map won’t plow the car into a tree. Just like the human driver in our example, an autonomous or semi-autonomous automotive application only uses the GPS to decide which road to take; what is or is not a road at all is decided by local sensors. In a human driver’s case, “local sensors” mostly means a pair of good old-fashioned Mk I Eyeballs; in the Tesla’s, it’s radar, ultrasonics, and a suite of eight cameras enabling full-time 360-degree visual coverage. I reached out to spokespersons from Tesla, Uber, and Cruise, and all made similar statements. Essentially, these companies say GPS helps cars decide which road to take, but it has nothing to do with a car’s decision about what is or is not a road in the first place.”
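Salter's separation of concerns can be caricatured in code. In this hypothetical sketch (not any vendor's actual architecture), GPS only influences route choice, while local perception gates whether the car actually steers onto something:

```python
# Caricature of the separation Salter describes (hypothetical sketch,
# not any vendor's architecture): GPS influences *route choice*, while
# local sensors decide whether there is drivable road to steer onto.

def plan_route(gps_position, destination):
    """Route planning trusts GPS, so spoofing can change the chosen maneuver."""
    return "take next exit" if gps_position == "near exit" else "continue"


def perceive_drivable(sensors_see_road):
    """Cameras, radar and ultrasonics decide what is or is not a road."""
    return sensors_see_road


def next_maneuver(gps_position, destination, sensors_see_road):
    maneuver = plan_route(gps_position, destination)
    if maneuver == "take next exit" and not perceive_drivable(sensors_see_road):
        # Spoofed GPS may request an exit, but without a perceived road
        # the vehicle does not plow off the pavement.
        return "continue"
    return maneuver
```

The spoofed position can still send the car down the wrong (real) road, which is exactly what happened in the Regulus demonstration, but the perception layer keeps the failure at "wrong map" severity rather than "into a tree".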


This is not the first time we have seen GPS spoofing of vehicles. Both yachts and road vehicles have previously been successfully targeted and led to believe that their position relative to their destination was different from reality. As stated above, these hacks have yet to prove directly dangerous to passengers. However, since the signal needed for a successful attack would have to be relatively strong, it would affect several vehicles at once and thus provide a means to cause serious traffic disturbance.


If you are interested in Tesla attacks that can lead to safety-related consequences, make sure to check out the Chinese Keen Security Labs, known for a number of previous successful attacks. In their latest experiment, the researchers managed, among other things, to trick the lane recognition function by creating fake lane markings, which could make a vehicle exit the road in a far more unsafe manner.


Written by Ana Magazinius, RISE.
