Technology-Facilitated Violence Against Women and Girls: Trends, Risks, and the Need for Survivor-Centred Responses

Technology-facilitated abuse is one of the fastest-growing forms of violence against women and girls, evolving alongside the digital tools that shape everyday life. In this interview, WAVE Youth Ambassador Tamina Summersgill speaks with Kiera Brodie, Tech Abuse Training Lead at Refuge’s Technology Facilitated Abuse and Economic Empowerment (TFAEE) team. As one of the few specialist teams working directly with survivors of tech-facilitated abuse, and the UK’s only dedicated team of its kind, Refuge’s TFAEE team offers critical insight into how perpetrators misuse digital technologies to monitor, control, and harass survivors. This conversation explores the changing nature of tech abuse, the importance of survivor-centred responses, and the urgent need for stronger policies and services to keep pace with these emerging harms. 

1.   Can you tell us a bit about what your team does, and what makes it so unique? 

The Technology Facilitated Abuse and Economic Empowerment team was created in 2017 and began providing direct support to survivors in 2018. 

It developed in response to increasing calls from survivors who were reporting being tracked or having someone access their online accounts. When they spoke to statutory agencies, they were often told, “This isn’t happening; you’re paranoid” – reinforcing the gaslighting and psychological abuse by the perpetrator(s) and implying that the only solution was to digitally disengage. This approach is not only impractical, but also fails to address the issue or hold perpetrators accountable.

We’re one of the few teams that operate from a place of survivor empowerment. We don’t tell people to go offline, change numbers, or delete social media. Instead, we aim to help survivors feel empowered to use technology safely, as a tool for recovery rather than as a source of fear. 

2.      How has the team’s work evolved since its creation? (changes in trends; interest from different bodies) 

Demand is increasing. We saw a 62% increase in referrals in the first nine months of 2025 compared with the whole of 2024. October to December 2025 recorded the highest referral rate we’ve ever had. 

Survivors and professionals are becoming more aware of – and better at identifying – tech-facilitated abuse, and media attention on the issue is also growing. As technology develops and becomes more accessible, the abuse survivors experience is becoming increasingly complex, often involving overlapping forms of abuse and extensive systems of home surveillance. In many cases, the use of GPS trackers, compromised online accounts, abuse via social media and other forms of tech abuse are intertwined within a perpetrator’s web of coercive control. 

One trend that is insufficiently addressed is that “simpler” forms of abuse are often overlooked due to the focus on more complex issues and AI-driven narratives in the media. Conversations around intimate image abuse often jump straight to deepfakes, and discussions on stalking tend to focus on “stalkerware”, even though much stalking occurs through compromised iCloud or email accounts. While media attention on emerging trends helps raise awareness, it can also skew understanding, leading people to underestimate the harm caused by “simpler” forms of tech-facilitated abuse, such as account hacking or social media harassment, which can be just as damaging.  

I think this increased awareness is a double-edged sword: more issues are being recognised, and more survivors and professionals are seeking guidance on staying safe online. Yet the emphasis on emerging trends risks neglecting simpler forms of tech abuse. Moving forward, we must ensure our work encompasses all types of tech abuse – which makes this area increasingly complex to navigate. 

3.      …and how do you anticipate it will evolve in the coming year(s)? (e.g. new gaps; new risks) 

Obviously, demand for specialist support is likely to continue increasing. I don’t think what we’re seeing is an isolated surge. 

The types of abuse are expected to become more complex, because the more technology we integrate into our lives, the more opportunities there are for perpetrators to misuse it. Wearable technology, for example, is a rapidly growing area, and the team is already starting to see aspects of this. The variety of tech and the ways devices connect will continue to expand. It’s not just that individual technical systems will become more complicated; the networks linking these devices will create an extensive web of surveillance around survivors. 

This will create additional barriers for survivors trying to escape abuse and make it harder to communicate with them safely. We can also anticipate more intersections with other forms of harm – for instance, overlaps between domestic abuse, romance scams, trafficking and other crimes. 

4.   What do you think needs to happen outside the violence against women and girls (VAWG) sector? Which external actors need to act to support the elimination and prevention of technology-facilitated abuse (TFA)? 

We want to reach a point where tech abuse is fully integrated into mainstream domestic abuse support. Having dedicated teams is essential, but all domestic abuse services also need a baseline understanding of tech-facilitated abuse. 

Externally, several actors must play a role. Tech companies, in particular, need to adopt a safety-by-design approach, proactively considering violence against women and girls in the development of technology. This means embedding protections when designing tools and online resources, rather than responding reactively, often when it is already too late. Too often, companies deflect responsibility by claiming their products aren’t designed to cause harm. Yet domestic abuse affects vast numbers of people, and survivors – as well as women and girls more broadly – need access to effective reporting tools. Many social media platforms still fail to remove harmful or abusive content. While there’s increasing pressure to address intimate image abuse, we also need to consider the wider context of online harassment and stalking. Harm is not always overt, and tech companies need a better understanding of this. 

At a systemic level, a more coordinated approach across the criminal justice system is needed, alongside stronger enforcement of existing laws. Improving the police response is crucial, but the Crown Prosecution Service and the judiciary must also play a role. Survivors should receive appropriate responses at every stage of the justice process. Laws must also be applied and adapted more flexibly to keep pace with emerging technological trends – otherwise, we will always be playing catch-up. 

Individually, we all need a greater awareness of how our data and tech can be misused, along with stronger basic digital hygiene practices, such as using strong passwords and enabling two-factor authentication. We’re not even there yet. 

5.      Are there specific risks that disproportionately affect victim/survivors based on age? What forms of technology-facilitated abuse are young women and girls particularly vulnerable to, and how might these differ from the risks faced by older women? How do remedies differ? 

Age is a crucial factor – arguably more so than in other forms of abuse. Young people today are digital natives, but there can be a real digital divide between them and the professionals supporting them. In some cases, caseworkers may not fully understand the platforms young people are using, which can make it harder to provide tailored, empowering support. 

For young people, one key risk is the normalisation of abusive behaviours. As tech has developed, our digital boundaries have shifted significantly. For example, sharing your location with a partner is now very normalised. This is not always abusive, but the wider context and nuances are often overlooked. Have both people genuinely consented? How does the partner react if the location isn’t shared? Applying professional curiosity to these dynamics is essential for identifying tech-facilitated abuse early on. 

Exposure to harmful content – whether through social media or pornography – has a huge psychological impact. It can normalise behaviours and, in some cases, even act as a training ground for perpetrators. For example, some have used AI companions to practise abusive behaviours before enacting them in real-life relationships. The shocking accessibility of harmful content, whether from the manosphere or more general content promoting damaging gender stereotypes, has a significant impact. 

To address these risks, we need to model positive digital boundaries, understand the platforms young people are using, and engage in open conversations about safe and respectful online behaviour. While there’s been a lot of fearmongering, the focus should be on helping people use technology mindfully and safely throughout their lives. 

Parental controls are a good example of this balance. If they remain in place throughout a child’s entire upbringing, without gradually introducing privacy and autonomy, young people may grow up thinking it is normal for partners to have unrestricted access to their accounts or passwords. Early experiences of technology can shape expectations in later relationships. At the same time, completely restricting access to the internet is not realistic because online spaces are central to how young people connect nowadays, and they should be able to engage with them safely. 

For older survivors, risks can look different. There’s a growing awareness of child-to-parent abuse, including its digital forms. A digital divide can also exacerbate abuse, particularly where technology-facilitated abuse intersects with physical and mental ill health, or disability. For example, devices marketed to support people with dementia, such as tracking tools, can be misused in ways that violate autonomy and consent. Even in care contexts, the person affected should be involved in discussions about what technology is being used and why. There’s still limited professional curiosity around how health and care tech might be weaponised to control someone. Ultimately, across all age groups, we need to model healthy consent and establish clear, respectful digital boundaries. 

6.   How do other intersecting identities interact with risks around TFA? 

It’s important to acknowledge that survivor identities hugely affect their relationship with technology and the associated risks. Rejecting blanket approaches to tech safety is so important. 

We’re seeing an intersection between intimate-image-based abuse and harmful cultural practices. Current UK legislation doesn’t fully consider cultural context. What counts as ‘intimate’ can differ greatly between communities. For example, an image showing shoulders or without a headscarf may not be considered intimate by Western standards, but to someone who is Muslim, it can be deeply personal and a violation. There are hierarchies in how intimate image abuse is treated. Images of nudity or sexual activity are often considered the most serious, while other intimate images – such as those showing someone without a headscarf – are treated as less severe. But the impact on the survivor can be just as significant, and these hierarchies often reinforce structural inequalities and racism. 

Black women experience higher rates of online abuse, and young Black girls are subject to adultification at particularly high rates. This means, for example, that a 15-year-old white girl may be seen as a child without agency, whereas a Black girl of the same age is more likely to be perceived as a young woman with agency. These stereotypes of Black girls being perceived as older than they are can minimise the abuse they experience and lead to it being taken less seriously. 

In the LGBTQ+ community, risks include doxing (sharing personal information online) and intimate-image abuse. Intimate content of someone who is queer is often proliferated at higher rates. Context matters: images may be used to out someone without their consent, or images from before a person’s transition may be shared. 

Tech-facilitated stalking and surveillance are also heightened in so-called ‘honour’-based abuse. These situations often involve multiple perpetrators, and those affected are often subjected to higher rates of surveillance, which significantly increases risk. These high-risk contexts need serious attention. Intersecting factors also include online misogyny, sexism, and disparities in digital literacy. Women are often discouraged from engaging with technology, which limits their ability to protect themselves or recognise risks. 

Certain communities, such as Gypsy and Traveller communities in the UK, face high rates of digital isolation. Limited access to technology isn’t just a barrier to support; it can also increase vulnerability to abuse, as members of these communities may be excluded from safety resources or unable to access reporting tools. 

7.   What’s one (or two!) things you wish people outside the women’s sector knew about VAWG and tech? 

A.     Tech abuse is real and dangerous. It’s unrealistic to treat VAWG as separate in the digital and real worlds – they are deeply linked. Tech-facilitated abuse online can escalate into serious physical harm, and in extreme cases, even homicide. Blanket safety planning is no longer appropriate, especially as technology is evolving so quickly. For example, asking someone to change their phone number or delete an account can actually increase risk by alerting perpetrators. 

B.     Remaining fearful isn’t helpful – hope is essential. It’s important to stay hopeful. There are amazing organisations working in this space, and simple steps can make a big difference: good digital hygiene practices, such as having strong passwords, checking recovery details and linked devices, and enabling two-factor authentication. While there’s a lot of negativity around tech abuse, it’s important to remember that practical actions exist and positive change is possible. 


Refuge / TFAEE Team 

Refuge is the largest specialist domestic abuse organisation in the UK, providing life-saving support to thousands of women and their children every day so they can escape abuse and rebuild their lives – free from fear.  

Refuge’s Technology-Facilitated Abuse and Economic Empowerment team is the UK’s only service dedicated to supporting survivors experiencing complex cases of tech-facilitated and economic abuse. It was established in 2017 in response to an overwhelming increase in reports of tech-facilitated abuse from survivors and their children accessing our services.   

The team empowers survivors to use technology safely, ensuring they do not have to disconnect from online spaces and become further isolated. To facilitate this, Refuge’s experts carry out tech assessments and work with survivors to create customised safety plans. Alongside providing support, the team collaborates with agencies, professionals and tech developers to raise awareness of tech-facilitated and economic abuse and to improve responses.  

Kiera Brodie

Kiera joined Refuge in 2024 as the Tech Abuse Training Lead for Refuge’s Technology-Facilitated Abuse and Economic Empowerment team. In her role, she delivers accredited training packages for charities and for-profit organisations to improve recognition of ‘tech abuse’ nationally. With a background working in the homelessness and Violence Against Women and Girls (VAWG) sectors, she specialises in sexual exploitation, harm reduction, and trauma-informed practice, drawing on frontline experience to educate and empower professionals to better understand emerging harms.