This isn’t a sci-fi dystopia; it is our current reality: AI exacerbates violence against women and girls
Over the past two years, I have noticed a trend in my university email inbox: more and more AI-generated emails inviting me to AI-themed conferences and AI workshops. I find myself sifting through emails asking me to come and discuss the “future of responsible and human-centered AI” or to spend two days learning about “how humans and AI can work together to enhance problem-solving, creativity and innovation”. And yet, each time I wonder whether any of these events will acknowledge the ways in which AI often exacerbates violence against women and girls. So far, no luck.
Out of curiosity (and after a small apology to the environment), I opened up a generative AI website (which shall remain nameless – but it’s the one you think it is) and asked it to write me a poem about how AI enables violence against women and girls. It responded that my request was “reported for a suspected violation of [its] terms of use”. Undeterred by the thought that I’d made a robotic enemy, I continued to snoop.
At first, I assumed that most of the dangers of AI for women and girls would stem from its environmental and economic impact. We know that the energy and water needed to power AI data centers are placing a strain on an already over-exploited planet, accelerating the effects of anthropogenic climate change on the Earth’s ecosystems and inhabitants. We also know that any strain on the planet disproportionately affects women and girls: rising economic insecurity, for example, correlates directly with increases in sexual exploitation and human trafficking. This, coupled with the fact that the jobs AI is likely to eradicate are done disproportionately by women, struck me as an immediate risk of AI.
When I continued researching, however, I learned that this may be just the tip of the iceberg. At many of the conferences referred to above, keynote speakers use deepfake technology to make the audience laugh. What they don’t mention is that 98% of all deepfake videos are pornographic, and 99% of those depict women. In fact, much deepfake technology has been trained primarily to produce pornographic images of women, almost always without the consent of the person being impersonated. This exacerbates pre-existing threats of image-based sexual abuse and online abuse, and is often used to intimidate, threaten, coerce, and control. For example, over 400 deepfakes of 30 female UK politicians were viewed more than 12 million times in 2024. We also know that these sexual depictions are often violent, fueling a porn industry that enables sexual violence and excludes sex workers from an industry they have been fighting to reclaim.
This is not technology designed by or for women and girls, and it is overdue that we acknowledge the heightened risks this poses to them. We often speak of AI sex dolls as if they belonged in a distant sci-fi movie, but they are already a 30-billion-dollar industry. At one of the first tech exhibitions where this technology was displayed, attendees – the same people who control these industries – damaged the sex doll on display so badly, inserting so many objects into its vagina, that by the end of the event it could no longer be shown. Many of these dolls are also programmed with a “frigidity” setting, which makes them beg their users to stop, or appear lifeless, so that users can act out rape scenarios. This fuels a culture that already disproportionately sexualizes and objectifies women, further blurring the line between sexual objects and women’s bodies. Similarly, in 2024, the top eleven AI girlfriend apps were downloaded over 100 million times. Almost all of them allow rape scenarios to be acted out, thereby normalizing cultures of non-consent.
Technological change is happening quickly, and, as with every element of the movement to end violence against women and girls, legal protections must be coupled with cultural change and corporate accountability if we are to reverse the accelerating impact AI is having on violence against women and girls.
Written by Evie Gilbert, PhD researcher