Hacking wearable technology: sensors come under sonic attack
Can you hack a smartphone or a Fitbit with sound? It doesn’t sound like an attack that should work, but researchers have found ways in which audio can upset motion sensors.
It’s not a hack in the conventional sense of confusing the device with a flood of data and then taking it over, but it is one that can make embedded devices lose track of where they are.
So far, sound-based attacks on digital systems have largely been passive. Sound is one of the ways in which you can perform side-channel analysis on a device to work out what sort of calculations its processor is handling at that moment. Although power and electromagnetic signals get used more often, some systems give away what they are doing through ultrasonic emissions.
Second-year PhD student Timothy Trippel and colleagues at the University of Michigan and the University of South Carolina opted to use sound to attack embedded devices directly through their accelerometers. These sensors can be vulnerable to sonic attack because they use a moving internal mass to detect acceleration. The mass sits on a cantilever that oscillates as it shifts backwards and forwards under motion.
Naturally, the masses involved are tiny, so they can be shifted around using sound waves. You would need fairly big shock waves to simulate movement directly, but there is another way to fool the sensor – or rather the software that processes its data.
To reduce noise from a tiny mass bouncing up and down every time it suffers a shock, the electronics and software behind the sensor use a low-pass filter. The researchers found these filters do a pretty good job up to a point.
However, if the attacker can stimulate resonance in the cantilevered structure, the sensor starts to produce readings that are wildly out of line with normal behaviour. This resonance typically sits at a much higher frequency than real-world motion – I doubt anyone can jump up and down 3,000 times a second. Yet imperfect filtering lets some of that high-frequency signal alias into the output and register movement that isn’t really happening.
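To see how an out-of-band resonance can masquerade as slow motion, here is a minimal numerical sketch. The frequencies are illustrative assumptions, not measured values from the research: a sensing mass driven at a resonant frequency just above the ADC’s sampling rate aliases down to a low-frequency “movement” signal that a filter tuned only for genuine motion would happily pass.

```python
import numpy as np

# Illustrative assumptions: a MEMS accelerometer whose mass resonates far
# above any real-world motion, read out by an ADC at a lower sampling rate.
f_res = 3000.0      # assumed resonant frequency of the sensing mass (Hz)
f_sample = 2950.0   # assumed ADC sampling rate (Hz), below Nyquist for f_res

t = np.arange(0, 1.0, 1.0 / f_sample)   # one second of samples
# An acoustic tone excites the mass at its resonant frequency; the
# under-sampled readings alias down to |f_res - f_sample| = 50 Hz.
readings = np.sin(2 * np.pi * f_res * t)

# Find the dominant frequency in the sampled output (skipping the DC bin).
spectrum = np.abs(np.fft.rfft(readings))
freqs = np.fft.rfftfreq(len(readings), 1.0 / f_sample)
alias = freqs[np.argmax(spectrum[1:]) + 1]
print(f"sensor reports motion at ~{alias:.0f} Hz")  # ~50 Hz, not 3000 Hz
```

The 3 kHz excitation never appears in the output; what the software sees is a clean 50 Hz oscillation that looks like plausible physical movement.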
As a proof of concept, the team got an accelerometer to spell out the word ‘WALNUT’ on a trace by firing modulated audio at it. They then moved on to a couple of more realistic applications. One was to subvert the output of a smartphone app that controls a model car. Normally, the user would tilt the phone to steer it. Audio played close to the smartphone controlled the car without the phone moving at all.
Another trial involved placing a Fitbit wearable fitness band next to a $5 speaker to register thousands of fake steps. There is a potential fraudulent use of this attack, Trippel and colleagues noted. Some companies, such as the Walgreens pharmacy in the US, offer rewards based on step counts. Few are going to get rich off that.
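The step-inflation trick is easy to picture with a toy counter. The threshold-crossing algorithm below is an assumption for illustration, not Fitbit’s actual method: once a periodic signal at a walking-like rate leaks into the accelerometer output, the count climbs on its own.

```python
import numpy as np

def count_steps(accel, threshold=1.5):
    """Toy step counter (an assumed algorithm, not Fitbit's): count each
    time the acceleration signal rises up through the threshold."""
    above = accel > threshold
    return int(np.sum(above[1:] & ~above[:-1]))

fs = 100                       # assumed sampling rate (Hz)
t = np.arange(0, 60, 1.0 / fs) # one minute of samples
# A 2 Hz oscillation injected into the sensor output mimics brisk walking.
fake = 2.0 * np.sin(2 * np.pi * 2.0 * t)
print(count_steps(fake))  # roughly 120 "steps" in a minute of silence
```

Left running next to a speaker for a few hours, a counter like this would register tens of thousands of steps without the band ever moving.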
However, given that accelerometers are vital to the navigation systems of robots and drones, the idea of a sonic attack does get a little more serious. A powerful enough noise aimed at a drone might confuse it enough to take it down, although it would be pretty obvious to everyone around what was happening.
A more insidious attack might be to cause a robot arm to fail in a factory by intermittently playing sounds to it through a device placed nearby. It’s not a mass-market attack, but one that robot designers will need to design out.
The good news is that the defence against this type of attack does not seem to be too onerous. It calls for a change to the way sensor filters are designed so that they better block the effects of high-frequency resonance.
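As a rough illustration of why filter design matters, compare the attenuation an ideal first-order low-pass filter offers at the resonant frequency with that of a steeper fourth-order design. The cutoff and resonance figures here are assumptions for the sake of the example, not numbers from the research:

```python
import math

def butterworth_gain(f, f_cut, order):
    """Magnitude response of an ideal Butterworth low-pass filter."""
    return 1.0 / math.sqrt(1.0 + (f / f_cut) ** (2 * order))

f_cut = 50.0     # assumed cutoff: comfortably above real human motion
f_res = 3000.0   # assumed resonant frequency of the sensing mass (Hz)

for order in (1, 4):
    g = butterworth_gain(f_res, f_cut, order)
    print(f"order {order}: resonance attenuated to {20 * math.log10(g):.0f} dB")
```

A first-order filter knocks the resonance down by only a few tens of decibels, enough for some of it to trickle through; a fourth-order design pushes it well over a hundred decibels down, far below the sensor’s noise floor.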
However, it does illustrate how insidious some types of hack can be – they will exploit a weak point no matter how obscure. In case of sonic attack, think of your sensors.