Edge Technologies

DOI: https://www.doi.org/10.53289/AZMY1708

Everyday tech can put victims of abuse further at risk

Volume 23, Issue 9 - October 2024

Professor Leonie Maria Tanczer

Leonie Maria Tanczer is an Associate Professor in International Security and Emerging Technologies at University College London’s (UCL) Department of Computer Science (CS) and grant holder of the prestigious UKRI Future Leaders Fellowship (FLF). She is part of UCL's Information Security Research Group (ISec) and initiated and heads the “Gender and Tech” research efforts at UCL. Tanczer is also a member of the Advisory Council of the Open Rights Group (ORG), a Steering Committee member for the Offensive Cyber Working Group, and a voting member of the IEEE Working Group P2987 “Recommended Practice for Principles for Design and Operation Addressing Technology-Facilitated Inter-personal Control”. She was formerly an Association of British Science Writers (ABSW) Media Fellow at The Economist and a Fellow at the Alexander von Humboldt Institute for Internet and Society (HIIG) in Berlin.

Summary:

  • Technology is gendered, and it is important to examine how gender manifests in the design, usage, control and effects of the systems we are putting in place
  • That technologies can be abused should come as no surprise to anyone who creates, or indeed uses, digital devices and systems
  • Technology is changing. A perpetrator does not need to be physically present to make someone's life miserable.
  • The same household devices that we are installing for the likes of dementia patients could lead to intimate partners being abused
  • We are at a critical moment where the underestimation of technology's capabilities is just as dangerous as its overestimation

It is vital that we look at the adverse side effects of the technologies we let loose into society, and my work focuses specifically on issues around intimate partner violence.

At University College London (UCL), I am leading a research group called ‘Gender and Tech’. We are interested in how gender or societal assumptions shape the way we design technologies and how technologies that we put on the market affect the way we define gender, race, and other social categories. I will start with three premises that drive my presentation. 


Technology is gendered


Number one is that technology is gendered. I can give you dozens of historical examples, and we see them replicated today. These range from mobile phones that are too big to fit into the pockets of dresses and women's jeans, to the bicycle, which was initially designed for men, to MRI scanners and other medical technologies with dedicated "male" and "female" settings. We can also think of past problems with crash test dummies, or of more modern phenomena such as making things 'prettier' by finishing them in rose gold and other allegedly 'feminine' colours. It is essential to examine how gendered representations appear in the design, usage, control and effects of the systems we are putting in place.


Technology is abused


Secondly, the notion that technologies can be abused is no surprise to anyone working on devices and systems. The 'dual use' discussion has haunted us forever. I recommend the movie 'Demon Seed' from 1977 if you have a spare evening. It is old, but it reflects what we are studying at UCL within a body of work called "tech abuse". This term shortens the very lengthy label of 'technology-facilitated gender-based violence or intimate partner violence/domestic abuse'. Tech abuse ranges from online harassment to cyberstalking to the use of spyware systems installed on smartphones. It also involves the topical issue of image-based abuse, in which images (often intimate ones) of individuals are shared without their consent.


Now, measuring this particular aspect of domestic abuse and intimate partner violence is challenging. If we think about how we, as a society, use tech, we must acknowledge that nearly everyone owns a smartphone nowadays. So, counting any misuse (such as excessive text messaging) is tricky, as we would end up with 100% of victims and survivors reporting it. However, if we bring more nuance to our understanding of tech abuse - such as the degree of technical sophistication involved - we arrive at a more specific number. Our research group is currently in conversation with UN Women and the United Nations Population Fund about how to measure this phenomenon globally and how measurement could be standardised. I want to stress that this is not a minor issue, and it relates to the fact that we have tech in every aspect of our lives, from young to old.


I often find that when I invite industry partners to attend an event on this topic, they are uncomfortable coming along. They do not see it as their role because, as one company's representative told me, 'they have not designed their products and services to be abused'. Of course, nobody sets out to produce something that harms people. Nonetheless, the tech sector must acknowledge that abuse via digital systems is occurring and that each of them has a role to play in tackling it.


Most of the tech abuse happening right now is facilitated via what I would call 'conventional tech': the devices we have in our pockets and in our houses; the stuff that is affordable, cheap and widespread. There is also a heavier effect on Black, Asian and minority ethnic groups. The same discriminatory dynamics that affect societal and minority groups in general are, unfortunately, also playing out in the context of tech abuse.


Many people also have predefined ideas of what abuse looks like, or should look like. The perpetrator is never the university professor; it is never our neighbour; it is never someone we know. It is always a surprise that someone in our surroundings can be abusive. But we must understand that there are perpetrators amongst us, and intimate partner violence sadly occurs in every demographic group. Abuse is also not solely physical or sexual. Intimate partner violence takes on many shapes and forms, and I am determined that we, as a society, change our perception of technological abuse.


Technology is changing


My final point is that technology is changing. I was fortunate to have been a postdoc as part of PETRAS - the National Centre of Excellence for IoT Systems Cybersecurity. The exposure I received in this role led me to consider, and eventually scrutinise, the ways smart, Internet-connected devices may be impacting gender-based violence and abuse, and what social and technological mitigations could be put in place to counteract this. My research idea started in 2018 and has since led me to set up a dedicated "Gender and Tech" research lab, where we study this phenomenon of 'smart abuse' - a topic that is now more important than ever.


The same household devices that we are installing for dementia patients could lead to people being abused by their partners, nurses, doctors, and family members. An essential recognition is that many of these devices are small and contain tiny sensors that are not visible. Moreover, many IoT systems look like 'ordinary' things we have seen and been exposed to before, such as a television or a toaster. However, these previously analogue products have now become 'smart', giving them enhanced functionalities. This element of disguise makes it hard for people to assess their risk. For most consumers, it is difficult enough to conceptualise what a sensor is and what, and how much, data it collates, processes and measures. Capabilities of smart systems such as remote control further extend the reach of a perpetrator: one does not need to be physically present to make someone's life very, very miserable.


Our research group tries to communicate this emerging risk to policy officials and is keen to make this a topic not just for domestic abuse organisations but also for industry and the regulatory domain. In doing so, we aspire to test and study systems and then convey the design shortcomings that developers, for example, must have on their radar. The crux of the issue, however, is that the very functionalities many consumers deliberately seek out and buy – such as voice control, location tracking or video recording – are the exact tools that can ultimately be misused by domestic abusers.


From a machine learning perspective – a capability already embedded in, and probably soon far more prevalent across, these devices – I am interested in understanding what it will mean to have been in an abusive relationship for, say, 20 years and to come out of that relationship with a skewed data model and profile. The latter may mean former abuse victims or survivors are not eligible for loans or insurance products. There is already research showing that insurance providers have discriminated against domestic abuse victims, as their risk – for instance, in the context of life or health insurance – is deemed so much higher. I therefore wonder what the consequences of our increasingly connected and datafied society may be for vulnerable groups such as intimate partner violence victims/survivors.


Unfortunately, domestic abuse is still a very gendered phenomenon, and I cannot tell you how often I have worked with support sector organisations (which are generally very female-dominated) and heard representatives describe themselves as 'not tech-savvy'. This viewpoint is pervasive. Indeed, in most heterosexual relationships, it is frequently the male partner who is in charge of purchasing devices, setting them up, maintaining them, and deciding when and how to replace them. He is also often the legal owner of the device, the account holder with knowledge of authentication details such as passwords, and the payer of subscriptions. This creates a gender imbalance that is aggravated in intimate partner violence situations, where power dynamics take hold.


I want to end by stating that if there is one thing I wish people to take away from our research, then please let it be this: we are at a critical moment in time where the underestimation of technology's capabilities is just as dangerous as its overestimation. We see this with victims and survivors, who are being told horrendous things about what smart devices allegedly can do - a hyped fear that is fed by advertising and the media and worsened further by perpetrators' own claims. Most often, claims about what a perpetrator can technically do fail to match the actual capabilities of most of these systems. That said, not accounting for the enhanced functionalities of these products would be just as dangerous. The possibility that a claim could be true, combined with the lack of certainty about what smart devices can actually make possible, feeds into victims' and survivors' angst. I consequently advocate for a realistic debate about IoT as well as emerging technologies such as AI, so we can focus on addressing real threats and risks and ultimately help and support victims/survivors without gaslighting them any further.