Can tech help end sexual violence?
In recent years, a flurry of apps have emerged aiming to tackle sexual violence either through support, protection or reporting to police – but many are clumsy and misjudged. Stories of sexual assault continue to dominate headlines, as do demands that something must change.
If you’re a man and you can’t quite believe the statistics about sexual violence, go back to the women in your life and ask them. Is it because it hasn’t happened, or because they just haven’t told you? asks Lucrezia Spagnolo, founder of VESTA, a platform to help survivors of sexual assault. “Don’t take my word for it – speak to them.”
This, say activists, is the nub of the problem. Most victim-survivors don’t report to the police. Stigma, fear and shame hold them back – would they be believed in a ‘she said, he said’ situation? Across the world, rape and sexual violence are notoriously under-reported and the rate of conviction is dismal – in England and Wales more than 99 per cent of rapes reported to the police don’t end in a conviction. By 2024, the government aims to double the number of rape cases that reach court and latest figures have shown a slight increase in the number of convictions.
In the UK, nearly a quarter of women (22.9 per cent) have experienced sexual assault, and slightly more have experienced domestic abuse – which costs an annual £66bn in England and Wales, according to government estimates.
One estimate put the number of apps designed to tackle the problem at around 500, many of them more fit-for-purpose than the ill-fated iConsent.
How can technology help a victim who doesn’t look like a victim? asks abuse survivor turned therapist Emma Davey, who’s launching a product targeting narcissistic abuse. “Because I wasn’t black and blue, no one said ‘you’re in an abusive relationship’,” she recalls. More than 33,000 people follow her Facebook support group – but there are many more victims, she says. “They’re just the tip of the iceberg.” An estimated 5 per cent of people have narcissistic personality disorder, and they can go on to abuse many people in their lives, from children and partners to colleagues.
Davey saw her abusive boyfriend jailed for 19 months for coercive control, but getting justice took years. “He’d destroy my phones and hack my computer to delete evidence,” she says. Recent high-profile cases have flagged the prevalence of emotional abuse and control – the term ‘narcissist’ even made the BBC drama ‘Happy Valley’. “I didn’t recognise what I was going through,” Davey recalls. “You are terrified of your abuser. They’re very good at saying I’m sorry and reeling you back.” After six years of a relationship, which culminated in a violent attack on holiday, she escaped to Australia, where she trained to become a therapist.
Documenting evidence is critical – Davey’s ex once called 88 times in a day, which she managed to screenshot. Her new app – developed with advice from lawyers – helps victims recognise and record abuse and find support. Disguised as an innocuous everyday app, each page has an exit button in case the abuser spots what the user is doing. “I’ve obsessed over it for two years to make it secure so it doesn’t add to the risk,” says Davey. Features of MyNARA (narcissistic abuse recovery app) include a 12-step recovery programme, a journal to report incidents and cloud storage for evidence.
Many methods of documenting victims’ stories aren’t perfect but there’s still a need, says Professor Heather Flowe, a memory scientist at the University of Birmingham. She’s taken a look at a selection of apps designed to help victims of violence and abuse. Many are poorly designed or have gone out of business.
Some apps act as ‘digital chaperones’ with an emergency direct dial to police. Others help victims document events before their memories fade, or provide support and allow for anonymous reporting. A well-designed app can gather evidence that could be essential during a trial, but methods mustn’t ‘contaminate’ memory or undermine evidence, says Flowe. Testimonies must be time-stamped – and any alterations recorded.
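The article doesn’t describe how any particular app implements time-stamping or alteration tracking, but the requirement Flowe describes – that a testimony carry a timestamp and that any later edit be detectable – is commonly met with a hash-chained journal. The sketch below is a minimal illustration of that general technique, not the design of any app mentioned here; all function names are hypothetical.

```python
import hashlib
import json
import time

def add_entry(journal, text):
    """Append a time-stamped entry. Each entry's hash covers the previous
    entry's hash, so editing any earlier entry breaks the chain."""
    prev_hash = journal[-1]["hash"] if journal else "0" * 64
    entry = {"timestamp": time.time(), "text": text, "prev_hash": prev_hash}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    journal.append(entry)
    return entry

def verify(journal):
    """Recompute every hash in order; returns False if any entry was
    altered or reordered after the fact."""
    prev = "0" * 64
    for e in journal:
        body = {k: e[k] for k in ("timestamp", "text", "prev_hash")}
        payload = json.dumps(body, sort_keys=True).encode()
        if e["prev_hash"] != prev:
            return False
        if hashlib.sha256(payload).hexdigest() != e["hash"]:
            return False
        prev = e["hash"]
    return True
```

A real evidence app would also need secure storage and an independent witness to the timestamps (a court is unlikely to trust a clock the user controls), but the chaining idea is why an alteration is *recorded* rather than silently overwritten: the change is evident even if the original bytes are gone.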
There are ‘gold standard’ principles for face-to-face interviewing that have been scientifically tested to help protect and preserve memory. These include establishing a rapport – harder, but not impossible, via technology – setting ground rules, using open questions and allowing individuals to find their own voice.
Technology can help victims scale the huge hurdle of reporting assault or abuse in person, and cement memories that otherwise might fade, says Flowe. “Recalling an offence soon after the crime can stall the rate of forgetting,” she says.
One app, Safe City, allows victims to anonymously report sexual attacks in public places such as trains to help identify hotspots. Others allow individuals to record in real time and store evidence of bullying, assault and domestic abuse in relationships – evidence which could stand up in court. One app – Hollie Guard – with an estimated 500,000 users, offers a range of tools including an easy alert button, which triggers a phone’s camera and microphone to begin recording automatically.
Most apps, says Flowe, were competent at encouraging victims to make statements and documenting useful evidence, but some didn’t offer enough security features, nor did they allow users to answer ‘I don’t know’ – and this could skew and undermine answers in court.
Flowe wants to see authorities make more use of advances in artificial intelligence and machine learning, and hopes her research will show that AI tools are viable for police use. These technologies could help link testimonies and crimes and spot patterns of violence against women by prolific offenders such as the recently jailed rapist and former police officer David Carrick. “If they’re all saying the same thing about the perpetrators, it’s easier to apprehend these individuals who are committing extreme numbers of offences – and really make a huge dent in sexual violence.”
Her team is investigating how machine learning could help police as they interview victims: flagging the kinds of details that are most easily recalled after an attack, helping elicit the fullest, most accurate accounts, and spotting cognitive clues that show a victim feels under pressure. “As cognitive effort increases, accuracy tends to decrease.” If an app could prioritise what information to ask first, police could record more accurate information.
A couple of decades after payments professional turned social entrepreneur Lucrezia Spagnolo was sexually assaulted, she returned to her native Canada. Memories of the incident came roaring back as news agendas were packed with high-profile sexual assaults. “One of the greatest myths [about sexual assault] is that people don’t report it because they don’t want to talk about it,” she says. “Through my own research, well over 80 per cent of individuals do come forward to tell somebody.” But if victims don’t speak to professionals, they’re unlikely to receive timely support or may not recognise an assault for what it is. “If you’ve dismissed it, not only do you not know what’s happened, you can internalise it instead and then guilt becomes shame.”
She wants to intercept survivors in the aftermath of a sexual attack and help limit the long-term fallout – helping them as appropriate, whether by reporting anonymously, finding professional support or making a formal report, or documenting and preserving evidence. “When we talked to survivors, one thing was consistent – everyone turned to the internet first to research or talk about what had happened to them.” After extensive research, she’s built the online platform VESTA Community, a single place where survivors can record what’s happened to them, find support and report to police if they wish – it is, as she says, survivor-centric. “One of the biggest challenges is combating that fear of walking into a police station and not being taken seriously.”
She launched the platform in 2021 and has worked with lawyers and the police to ensure that any evidence would meet legal standards. A Canadian university has come on board – a fifth of sexual assaults in Canada take place on campus. One study in the US estimated that 60 per cent of female university students had experienced unacknowledged rape. Other research suggests that 30 to 88 per cent of all sexual assaults are unacknowledged by survivors. Her aim is to reach more widely. “It can take individuals a long time to realise what has happened to them. My driving goal is to get people the kind of interventions they need sooner rather than later.”
Technology has its place in supporting victims and professionals to cope in the aftermath of sexual violence, says Trang Le, a researcher at Monash University in Australia, but the concept behind many of these apps is flawed. They are ‘sticking plaster’ solutions that put the onus on women to protect themselves, while failing to tackle the cause of male violence, she believes. And they promote a ‘women are vulnerable’ story that many resent.
Well-intentioned GPS trackers that can alert friends or family of danger can lull us into thinking surveillance is a normal approach to protecting women, she says. “We really need to think about how we can combat sexual violence without suggesting that women are passive victims in need of protection that’s inevitably predicated on restrictions, surveillance and control.”
Evidence suggests too that individuals (in this case authorities or police) may feel less empathetic in the absence of face-to-face interaction – which could make them potentially more judgmental about photos and recordings of victims, she warns, and allow bias to creep in.
While more than eight in ten people worldwide own a smartphone (Statista 2022), there will still be women who don’t have access to the apps available, “and that’s a concern”, says Michaela-Clare Addison at Victim Support, where reports of sexual assault are rising. “The main thing is that survivors are listened to with dignity and respect. And that they feel believed.”
For Spagnolo, these solutions are a stepping stone towards a far greater ambition. “We can only deal with where we are. Right now our goal is to minimise the trauma as much as possible, and in the long term we want to be part of a conversation that stops sexual assault happening in the first place.”
Jogging in the dark, Erin-Jane Roodt had a lightbulb moment. She was a student at the University of Bath, living off-campus and tired of the rigmarole of checking in with friends to ensure everyone had got home safely. She went on to develop what she says is the first automated app to help women stay safe. “You just don’t feel safe on your own in the dark as a woman,” she says.
EpoWar uses artificial intelligence to detect automatically if a smartwatch wearer is in distress or physically under attack, by monitoring heart rate and motion. This tech can spot the difference, she says, between exercise – which is regular – and erratic motions if someone is fighting off an attacker or pushed to the ground.
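EpoWar’s actual model isn’t public, but the distinction Roodt describes – regular motion during exercise versus erratic motion during a struggle – can be illustrated with a toy heuristic on smartwatch readings. Everything below (function name, thresholds, the rule itself) is an assumption for illustration only, not EpoWar’s method; a real system would use a trained classifier, as the article says.

```python
import statistics

def looks_like_distress(heart_rates, accel_magnitudes,
                        hr_threshold=120.0, variance_threshold=8.0):
    """Toy heuristic: exercise produces an elevated but *regular* motion
    signal, while fighting off an attacker produces erratic spikes.
    Flag distress only when heart rate is elevated AND the variance of
    the accelerometer magnitude is high. Thresholds are illustrative."""
    avg_hr = statistics.mean(heart_rates)
    motion_variance = statistics.pvariance(accel_magnitudes)
    return avg_hr > hr_threshold and motion_variance > variance_threshold
```

The point of the sketch is the AND condition: a raised heart rate alone (a sprint) or jerky motion alone (dropping the watch) shouldn’t trigger an alert, which is why combining signals matters for keeping false alarms down.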
Her app will alert the user’s chosen emergency contacts of their location and state, and sound an alarm. It will also begin recording data – from audio to heart rate – and store it all in the cloud. “We’re looking at a pilot project with the police,” says Roodt.
This app avoids the need to manually activate an alarm or pull out a phone, which is often not possible, says Roodt.
Working with an engineer, Roodt and her co-founder have trained their model by extensive simulation of attacks. Currently in late stages of testing, the app will launch first for Apple Watch with a view to being available later for Android smartwatches.
Encrypted tech to help victims flag serial attackers
Nine in ten of all sexual assaults at US university campuses are committed by repeat offenders, say the creators of encrypted technology that allows victim-survivors to share and match details of their attacker. Survivors can now use Callisto Vault – a tool from US non-profit Callisto – to enter information that identifies their attacker, such as a social media handle and where the attack took place. Survivors receive free and sensitive legal counsel and can also store time-stamped records of the attack using the app’s encrypted technology. They can use this information to report the crime if they choose – but they’re not obliged to.
A recent pilot on US university campuses revealed that 15 per cent of attacks logged matched the perpetrator of another attack. After the pilot, creators plan to expand free access to the technology across all US universities.
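The article doesn’t detail Callisto’s cryptography, which is considerably stronger than what a few lines can show. Purely as an illustration of the matching idea – grouping reports by a hashed attacker identifier so the raw identifier is never stored in the clear – here is a minimal sketch; the salt-and-hash scheme and all names are assumptions, not Callisto’s protocol.

```python
import hashlib
from collections import defaultdict

def identifier_key(handle, salt="demo-salt"):
    """Normalise the attacker identifier (e.g. a social media handle)
    and hash it with a salt, so the matching store holds digests rather
    than plain identifiers. Illustrative only."""
    normalised = handle.strip().lower()
    return hashlib.sha256((salt + normalised).encode()).hexdigest()

def find_matches(reports):
    """Group (report_id, handle) pairs by hashed identifier. Any bucket
    holding two or more reports points to a possible repeat offender."""
    buckets = defaultdict(list)
    for report_id, handle in reports:
        buckets[identifier_key(handle)].append(report_id)
    return [ids for ids in buckets.values() if len(ids) > 1]
```

In a production system the matching would be done so that no single party – not even the service operator – can read an entry until a match occurs, which is the property that lets survivors contribute information without committing to a report.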