
IoT devices and smart domestic abuse: who has the controls?
From revenge porn to cyber-stalking, digital technologies have created new means for gender-based domestic violence and abuse. As Internet of Things-enabled systems flood our homes and lives, should we be on our guard?
In May 2018, in the first case of its kind in a UK court, a husband was convicted of stalking after he spied on his estranged wife by hacking into their smart home hub, installed in the kitchen.
Using a mobile app, the perpetrator logged into the audio facility on the hub's iPad display and listened in as his spouse confessed to her mother that she no longer loved him. Moments later he was at the door, confronting her about what she had just said.
The story is one of the first recorded instances of Internet of Things (IoT) technology - in this case a wireless system used to control the lighting, central heating and alarm - being deliberately abused by a domestic partner.
While no physical harm came to the victim, research shows the incident could have had a more violent end. According to the charity Refuge, two women are killed each week by a current or former partner in England and Wales; one in four women will experience domestic violence in their lifetime, and women are twice as likely as men to be victims.
The recent conviction is timely, given that researchers at University College London are concluding a six-month feasibility study into the implications of IoT for gender-based sexual and domestic violence and abuse.
‘IoT’ in the study refers to interconnected ‘things’ and systems which are the direct and indirect extension of the internet into a range of physical objects, many of which were previously offline, such as lightbulbs and smart meters.
“Just like laptops or smart phones, IoT devices harvest data; however, due to their interconnectedness, they provide even more granular information about the habits and preferences of users,” explains lead author of the Gender and IoT (G-IoT) study, Dr Leonie Tanczer, a social scientist at the university’s Department of Science, Technology, Engineering and Public Policy.
The research aims to ascertain if these types of IoT technologies are or could be used by perpetrators of domestic violence and abuse against their victims.
Domestic abuse is extremely complex. It can be sexual, economic and psychological, with abusers using different tools, including technology, to exert control and often isolate their victims.
Research has shown that common devices, such as smart phones and laptops, along with social networking sites, have been routinely used to facilitate online harassment, domestic and sexual abuse.
Examples include perpetrators using the ‘Find My iPhone’ app to track a partner’s location, or purchasing a smart phone for a girlfriend or spouse and then controlling how and when they use it. In extreme cases, the offender will install spyware, such as Spyzie or WebWatcher, to track a particular gadget’s location and activity.
The pervasive nature of IoT technologies creates new opportunities for such abusive behaviour.
“Internet-enabled household appliances, wearable devices, or connected autonomous vehicles create new interdependencies between systems that were previously neither ‘smart’ nor interlinked,” says Tanczer. “Our study examines how dynamics of tech abuse as seen with phones or social media usage could be applied to the IoT ecosystem.”
The G-IoT study is run in collaboration with the London Violence Against Women and Girls (VAWG) Consortium, Privacy International and the PETRAS IoT Hub. Tanczer and her colleagues Dr Simon Parkin, Dr Trupti Patel and Professor George Danezis interviewed key stakeholders, such as domestic violence and abuse organisations, frontline support workers, police representatives and academics. They have also begun compiling a technical analysis of the most common in-home IoT devices - Google Home, Amazon’s Alexa and Philips Hue lightbulbs - examining how they are managed and accessed, and how the user interacts with and amends the interface.
What quickly became clear was that although IoT-mediated abuse is not yet widespread, these systems show potential for exploitation. The researchers found issues around cross-platform interdependencies and data exchanges, shared accounts, tracking and location capabilities, and audio and video functionality.
In a wider context, anecdotal evidence collected by Refuge and other organisations shows that when technology is used to abuse a partner it is often the perpetrator who is the most technologically savvy and therefore will install and manage the devices.
If an abuser buys an Amazon Echo for the home, they will most likely be the main account holder, with the victim sharing its ‘benefits’. This means the perpetrator can log in and view all the device’s voice command and purchase history, giving them an opportunity to spy on their unwitting spouse. Furthermore, to stop the device from recording, a user has to actively put it on mute, an action that could arouse suspicion in a controlling partner.
The study found that to isolate and exert control using an Amazon Echo account, the abuser could apply ‘Child Account’ settings to their adult partner, restricting what the user can view or purchase through it.
“Social media abuse is more psychological, but with home devices it can be physical control, you could potentially deprive someone of their basic human needs,” explains Dr Trupti Patel.
In an extreme scenario, an in-home system such as Hive - which allows remote control of the thermostat, lights and plugs, and can even detect motion in different rooms - could let an abuser deprive their victim of heating during winter, cut off the electric plugs and monitor them as they enter, leave and move around the house. Smart locks create similar risk vectors.
“Whereas before there may have been some escape time for the victim, because of the constant tracking nature of internet-enabled devices it could become impossible to find time and space to not be monitored, whether by a phone, a Fitbit or smart car,” says Millie Graham Wood, a solicitor at Privacy International. “It’s so granular that even if a spouse sneaked the heating on for 15 minutes, the smart meter will give them away.”
This is, of course, a futuristic scenario, but Tanczer says the aim of the study is to ‘proactively highlight’ opportunities for abuse so as not to have to react to the issues only after they present themselves.
Abusing technology is nothing new. However, because IoT is expected to be so pervasive in our lives, with the government estimating every household will have 15 internet-connected devices by 2020, should designers, manufacturers and policymakers be doing more now to mitigate potential future problems?
‘Internet-enabled household appliances create new interdependencies between systems that were previously neither smart nor interlinked.’
It’s a complex question, like the issue of domestic violence and abuse itself, but Professor George Danezis hopes features can be identified to make the technology safer.
“The IoT space is nascent and it’s the nature of early technology to be deployed by optimists wanting to make the world better but perhaps not thinking about scenarios where there could be unhappy households with conflict within them,” he says.
Harmonising the need for security, functionality and privacy, though, is a herculean task. For example, for technologies that rely on AI and machine learning to improve functionality, such as home assistants, it may be difficult to instantly delete all user activity, because the product depends on tracking and gaining insights from that information.
“Amazon records its interactions to improve itself, so it can learn things about its user and have higher-quality interactions in the future,” explains Danezis. “You can build a device that instantly deletes information, but it might not work as well.”
Dr Simon Parkin says there is a trend away from custom sensors towards full camera and microphone set-ups. The development cost of adding these extras is minimal, and many manufacturers incorporate them so they can add capabilities via a software update later, without having to change the hardware.
“IoT platforms are sticking cameras and microphones into people’s most intimate spaces,” Parkin says. “But it is very difficult to determine by the naked eye if these features are being used as the manufacturer intended, for example to measure ambient light, or if they have been doctored into a full-fledged spying device; this is a fundamental challenge.”
Furthermore, Budi Arief, senior lecturer at the University of Kent, whose research focuses on finding security vulnerabilities in IoT devices, says many are not designed with security in mind.
“The manufacturers’ main concern is hitting the niche market before their competitors, so they might have weaknesses that can be exploited to take control of the cameras or listen to conversations,” he says.
In November, consumer watchdog Which? found that connected toys such as the Furby Connect and the i-Que robot had a basic design flaw: no authentication was needed between the toys and the devices they could link with via Bluetooth.
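The flaw is easy to picture in code. Below is a minimal, hypothetical sketch - ours, not Which?’s actual test - using the open-source Python library ‘bleak’ to show how an unauthenticated Bluetooth Low Energy device can be found and connected to by anyone in radio range; the toy’s advertised name is a placeholder.

```python
# A hypothetical sketch of the class of flaw Which? describes: a BLE
# device that demands no pairing or PIN can be connected to by anyone
# nearby. The advertised name "Toy" is a placeholder.
import asyncio
from bleak import BleakScanner, BleakClient

async def main() -> None:
    # Scan for nearby Bluetooth Low Energy devices.
    devices = await BleakScanner.discover(timeout=5.0)
    toy = next((d for d in devices if d.name and "Toy" in d.name), None)
    if toy is None:
        print("No toy in range")
        return
    # Connecting needs nothing beyond the advertised address: no PIN,
    # no pairing, no proof that this is the legitimate companion app.
    async with BleakClient(toy.address) as client:
        print(f"Connected to {toy.name} ({toy.address}):", client.is_connected)

asyncio.run(main())
```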
The government is trying to address these concerns and in March released a report called ‘Secure by Design’, which will culminate in a code of practice for IoT developers on how to move the burden of security away from consumers and instead build it into products from the start.
Finding a compromise between safety, security and functionality across many different connected technologies is an ongoing battle for engineers and software developers, and will remain so.
Something that can be done now is to educate key stakeholders - domestic violence and abuse organisations, frontline support workers, police and the victims themselves - in understanding IoT devices.
To do this, the UCL team are producing a preliminary guide on the associated risk vectors and how users can try to protect themselves, for example by changing passwords and disabling network access. However, due to the complicated nature of domestic abuse and the fast-changing pace of technology, they say the advice should be given on a case-by-case basis.
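As a concrete example of what such a guide might cover, the sketch below - our illustration, not the UCL team’s - uses the Python packet library ‘scapy’ to list every device answering on a typical home network, so unknown or hidden IoT hardware can be spotted before deciding what to disable. The subnet is an assumption, and the script needs administrator privileges.

```python
# A minimal sketch: broadcast an ARP request across the local subnet and
# print the IP and MAC address of every device that answers. Unrecognised
# entries may be IoT hardware worth investigating. (Assumes a 192.168.1.x
# home network; run with root/administrator privileges.)
from scapy.all import ARP, Ether, srp

def list_devices(subnet: str = "192.168.1.0/24") -> None:
    answered, _ = srp(
        Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=subnet),
        timeout=2,
        verbose=False,
    )
    for _, reply in answered:
        print(f"{reply.psrc:16} {reply.hwsrc}")

if __name__ == "__main__":
    list_devices()
```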
Manufacturers also need to provide better information on their internet-enabled products, says Parkin. For example, the researchers found that some of the devices they analysed did not come with instruction manuals, meaning many of their settings are not easily understood. Instead, users need to search online for guides, which Parkin says is “absolutely unacceptable”.
Another suggestion is that each developer should provide a dedicated team that victims can contact for information on how to covertly change these settings if they suspect they are being spied on. However, Tanczer says that while this is practical if the gadget is from Amazon or Google, it won’t work for smaller manufacturers.
“There will be devices from SMEs that will create heterogeneity, and a victim can’t call them because they are in China. There will definitely be complexities that emerge - for example, you can have a suspicion, but what do you do about it? But at least we are discussing it now,” she says.
However, Tanczer believes there is a will among major tech firms to address these issues. At a SafeLives/Comic Relief ‘Tech vs Abuse’ Round Table, companies including Apple, Facebook and Google suggested that women’s shelters might be able to contact them directly when helping a victim who is asking for support.
Last year, Refuge launched a part-Google-funded programme to tackle technology abuse. The money will go towards training 300 frontline professionals, setting up a dedicated expert unit to help its clients stay safe, and targeted campaigns to raise awareness.
Understanding internet-enabled technologies can empower victims to better protect themselves during and after escaping an abusive relationship, and can also help them collect evidence.
Professor Heather Douglas, ARC Future Fellow at the University of Queensland, who has also researched technology and domestic violence and abuse, says she has known police in Queensland, Australia, to give victims an app to covertly record incidents.
When survivors escape an abusive relationship, often the default position from frontline workers is to tell them to ‘detox’ from technology.
“This is no longer acceptable,” says Douglas. “We need to make sure there is training for women rather than simple disconnection.”
This is especially important to help survivors protect children shared with their abusers. A report co-authored by Douglas details one case where a GPS tracker was planted in a child’s toy to determine the mother’s location.
In one of the G-IoT study workshops, Tanczer argued that the research was “two years too early”.
A police representative quickly interjected: it was actually “just the right time”, she said. For once, law enforcement, support services and technologists could be ahead of the curve.
The study also adds to the wider discussion around the social impacts of technology - a debate usually reserved for AI and automation, and one in which minority and vulnerable groups are often not included but are sometimes the worst affected.
However, the research does raise more questions than it answers – and that was precisely its point.
“Fully automated, smart and interconnected homes might feel very abstract now, but in the future, they will be ubiquitous, so we must consider their potential for abuse now, because once they have penetrated every aspect of our lives, it will be hard to remove or amend these systems,” says Tanczer.
“There is a need for further global research, and we hope this project facilitates a network across different stakeholders until eventually we have an appropriate understanding and response to these issues.”
Secure by design
In March, the government published a draft report, ‘Secure by Design: Improving the cyber security of consumer Internet of Things’, which advocates building protection and safety into internet-connected devices. Although the report doesn’t refer to domestic abuse and violence specifically, it does highlight vulnerabilities that could be exploited in this context.
These include a lack of basic inbuilt cyber security provisions undermining consumer security, privacy and safety, as well as the potential for an ‘attacker’ to disable safety controls and even deny usage, for example, of heating in winter.
It includes a draft ‘code of conduct’ that is largely generic, but which hopes to engage stakeholders, including manufacturers, IoT service providers, mobile application developers and retailers, to proactively design adequately secure devices.
Some of its suggestions could be helpful to victims of domestic abuse and violence.
It suggests stakeholders should be open and explicit about the security mechanisms in place in a device, and should have established in-house incident response procedures for timely reporting and action to address compromised systems. The latter seems targeted more towards larger-scale incidents than individual domestic breaches, but could still be useful for domestic violence victims.
The government’s National Cyber Security Centre already provides advice to citizens and organisations, but it is not widely known by consumers.
The report also states that if an unauthorised change is detected, the device should immediately alert the user or administrator. This could be effective in detecting and flagging the use of malware.
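In outline, that recommendation amounts to fingerprinting device state and flagging changes. Below is a minimal sketch of the idea - ours, not the report’s - with a hypothetical configuration file standing in for device state and a print statement standing in for a real alert.

```python
# A sketch of the 'alert on unauthorised change' idea: hash the device's
# configuration and flag any later modification. The file name is a
# hypothetical placeholder; a real device would push a notification to
# the user/administrator rather than print.
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    # SHA-256 digest of the file's current contents.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def check_for_tampering(path: Path, known_good: str) -> bool:
    current = fingerprint(path)
    if current != known_good:
        print(f"ALERT: {path} changed (now {current[:12]}...)")
        return True
    return False

# Usage: record the baseline at set-up time, then check periodically.
config = Path("device.conf")
baseline = fingerprint(config)
check_for_tampering(config, baseline)
```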
Consumers should also be provided with guidance on how to securely dispose of gadgets, and removing personal data should be easy, it says.
However, the report acknowledges that cost, and the difficulty of justifying investment in security when the main aim is to get a product to market as soon as possible, can be disincentives for proper security design. Furthermore, consumers struggle to distinguish between good and bad security in devices, primarily due to a lack of information.
Some recommendations in the report, which draws exclusively on the resources of PETRAS, an IoT research hub, are already covered in law by the Data Protection Act. The rest are recommendations only, with the government stating a “preference for the market to solve these issues” itself.
Yet it remains to be seen whether this will happen, and the report raises questions about what more the government can do to safeguard victims.
“Perhaps it’s time,” says Dr Leonie Tanczer, “to have a discussion about what someone needs to do to prove they are buying software, such as Spyzie or WebWatcher, for legitimate uses.”
