Misogyny is at the heart of deepfakes – The Taylor Swift images prove it’s about to get a lot worse

The AI-generated images of the pop star shine a grim light on the rapid progression of deepfakes.

The pornographic depictions of Taylor Swift in artificial intelligence (AI) photos that circulated on social media last week highlight not only a burgeoning technology issue but also a new iteration of misogyny that’s deeply worrying, damaging and invasive. At a time when Swift’s high-flying career has seen her dominate the pop culture landscape, the singer has become a talking point over a set of humiliating, sexually explicit images that she appears to feature in. Yet none of these images are actually of her.

On Thursday, explicit deepfake images of the 34-year-old were posted to X and circulated across social media, sparking dismay and outrage among her fans and many more besides. Deepfakes are artificially generated images or videos that depict a person’s likeness; Mashable reported that a study last year found that 98% of all deepfake videos online are pornographic in nature. Meanwhile, 99% of deepfake targets are women.

The fake pictures showed Swift being assaulted by Kansas City Chiefs fans, in reference to her relationship with the team’s tight end, Travis Kelce. According to reports, one was seen more than 47 million times before X began removing the images, but by this point, they had also spread to Facebook, Instagram and Reddit.

Swift’s fans were spurred into action, desperately trying to help quell the abuse of their favourite artist. They shared images and footage of her on stage, at awards shows, on the red carpet – anything but the images she so obviously hadn’t consented to. Ellie Wilson, a prominent sexual assault survivor and campaigner, was among the thousands to speak out against the pictures, but also examined why so many men were visibly engaging with them on a public platform.

“I think part of the reason so many men are loving these fake images of Taylor Swift being sexually assaulted is because it’s the humiliation and degradation of a strong and powerful woman that does it for them,” she wrote. “They want to see her ‘put in her place’.”

There have been plenty of warnings about the dangers posed by deepfake images – Swift is the latest victim but she won’t be the last. Other targets have included fellow pop singers such as Ariana Grande, journalists and influencers. However, it’s not just the world’s most famous and recognisable women who are falling victim to deepfakes. Right now, an inquest is taking place into the apparent suicide of British schoolgirl Mia Janin, 14, whose male classmates allegedly shared fake nudes of female pupils on Snapchat. 

Meanwhile, a small town in southern Spain, Almendralejo, was recently rattled by a scandal as 28 teenage girls – ages 11 to 17 – received naked photos of themselves. Only none of them had ever taken such pictures. The original images were taken from the girls’ Instagram accounts without their knowledge or consent, altered using an AI-powered ‘nudifying’ image generator and then distributed among their classmates via messaging apps like WhatsApp and Telegram. 

According to a report by Sensity, over 680,000 women and girls on Telegram – some of whom were underage – were targeted by an AI-powered bot service that ‘stripped naked’ their images – taken from social media pages or directly from private communications – and then shared them with users who requested the service. A poll of the bot’s users suggested most (63%) were interested in generating fake nudes of “familiar girls, who I know in real life”. The next most popular answer, with 16% of users, was “stars, celebrities, actresses, singers”, followed by “models and beauties from Instagram” with 8%.

But unfortunately, the list of websites that can produce these altered images is forever growing. Ever since an open-source version of deepfake software called ‘DeepNude’ emerged on the web in June 2019, several similar bots, apps and websites have sprung up out of nowhere, promising to ‘nudify’ any girl or woman you’d like in a matter of seconds. (The software initially didn’t work on images of men, but this seems to have changed since then.)

Sharing deepfake porn has been illegal in England and Wales since June 2023, in a government bid to crack down on “abusers, predators and bitter ex-partners who share intimate images online without consent of those depicted”. In the US, many states have still not updated their anti-revenge porn laws to include the use of technology in creating and sharing fake images. And clearly, companies such as X are not yet equipped to act with enough speed or efficiency when something like this occurs.

Of course, it would’ve been far more difficult, or near-impossible, to create those manipulated sexual images and videos without the help of AI, but the driving force behind this worrying trend isn’t technological – it’s societal. If it weren’t for gender discrimination still embedded deep within our society and everything it fuels – from the disregard for women’s consent and autonomy to the constant objectification and hyper-sexualisation of the female body – we wouldn’t find ourselves in a situation where women and girls have to continuously pay the price for just… existing.

As the technology used to create AI deepfakes progresses and becomes even more accessible, cheaper and faster to use, incidents like the ones we’ve seen with Taylor Swift and in Almendralejo will likely only become increasingly commonplace. And their real and painful impacts will be felt by thousands and thousands more women and girls – unless we stop ignoring this issue, as tech giants and, to some extent, the mainstream media have done so far.

Writer: Chris Saunders
Banner Image Credit: BBC / The Graham Norton Show