Do British Asian Women think Deepfakes will increase Revenge Porn?

Amid the widespread presence of deepfakes online, we explored British Asian women’s views on whether the technology will fuel a rise in revenge porn.


"The video was shared 40,000 more times"

In an era dominated by rapidly advancing technology, the emergence of deepfakes has added a new layer of complexity to the digital landscape.

Deepfakes refer to AI-generated content that convincingly replaces a person’s likeness in audio, video, or images with another.

While the technology presents exciting possibilities in entertainment and creativity, it also raises alarming concerns about privacy, misinformation, and its potential to exacerbate issues like revenge porn.

There have already been high-profile cases of South Asian celebrities suffering from deepfakes.

Alia Bhatt, Rashmika Mandanna, Priyanka Chopra Jonas and Kajol have all had deepfake videos circulated online in which their faces are provocatively superimposed onto other women’s bodies.

Even the Mayor of London, Sadiq Khan, was the victim of a deepfake. Sky News reported in November 2023:

“The digitally generated audio, using the London mayor’s voice and mannerisms, purports to be a recording of him playing down the importance of Remembrance weekend commemorations.”

In the soundbite, ‘Khan’ can be heard saying: 

“I don’t give a flying s*** about the Remembrance weekend.”

Whilst authorities were aware that the clip was fake, it highlights the danger deepfakes pose in both visual and audio formats.

But it also points to something more alarming.

High-profile individuals may be more likely to be targeted by deepfakes, but their fakes can usually be identified and dismissed almost instantly.

Ordinary people, however, have far fewer resources to get fake images, audio or videos removed from the internet.

Once something is posted online, it is almost impossible to erase, and that creates a new danger for victims of revenge porn.

Understanding Deepfakes


Deepfake technology utilises artificial intelligence and deep learning algorithms to create realistic content, often indistinguishable from genuine media.

By mapping facial expressions, gestures, and voice patterns, these algorithms can manipulate existing content or generate entirely new material.

The ease with which deepfakes can be created raises concerns about the spread of misinformation.

With malicious intent, individuals could use deepfakes to manipulate public perception or tarnish the reputation of public figures within the South Asian community.

Deepfakes pose a significant threat to personal privacy.

As technology advances, the potential for malicious actors to create realistic simulations of individuals engaging in compromising situations grows.

This could lead to reputational damage and emotional distress, particularly within conservative South Asian societies.

One of the most pressing concerns is the potential for deepfakes to facilitate the spread of revenge porn.

By superimposing individuals’ faces onto explicit content, perpetrators could exploit this technology to harm the reputation and emotional well-being of women in the South Asian community.

Deepfakes have the potential to erode trust in personal relationships.

The ability to manipulate digital content could lead to increased scepticism and insecurity, affecting the dynamics of relationships where traditional values hold significant weight.

Deepfakes & Porn


For The Guardian, columnist Moira Donegan wrote: 

“Ads for deepfake services appear directly next to explicit videos on PornHub.

“Though deepfake technology can theoretically be used for any kind of content…the tech is being used to create nonconsensual porn.

“According to a 2019 report, 96% of deepfake material online is pornographic.”

She adds: 

“Deepfake revenge porn, then, merely fulfils with technology what mainstream porn has offered men in fantasy…

“…the assurance that any woman can be made lesser, degraded and humiliated, through sexual force.

“The non-consent is the point; the humiliation is the point; the cruelty is the point.”

Perhaps one of the most high-profile cases of this issue was that of Indian journalist Rana Ayyub.

Following her coverage of the 2018 sexual assault case involving an eight-year-old Kashmiri girl, Ayyub faced backlash for asserting that India shields child predators.

Initially, trolls circulated fake tweets, doctored in Photoshop to make it seem as though Ayyub herself had posted them from her official account.

Some of the fake tweets read: “I hate India” and “I love child rapists”. 

In a Huffington Post article, Ayyub details that she was “forced to write a clarification” about the whole incident.

But she goes on to say that it did not stop what happened next:

“A source from the ruling BJP sent me a message to say ‘Something is circulating around WhatsApp, I’m going to send it to you but promise me you won’t feel upset.’

“What he sent me was a porn video, and the woman in it was me.

“When I first opened it, I was shocked to see my face, but I could tell it wasn’t actually me because, for one, I have curly hair and the woman had straight hair.

“She also looked really young, not more than 17 or 18.

“I started throwing up. I just didn’t know what to do. In a country like India, I knew this was a big deal.

“I didn’t know how to react, I just started crying.

“I asked him why it was circulating within political circles and he told me people within the party had been passing it on.

“Before I could even gather myself, my phone started beeping and I saw I had more than 100 Twitter notifications, all sharing the video.

“My friend told me to delete Twitter but I couldn’t, I didn’t want people to think this was actually me.

“I went on Facebook and I had been inundated with messages there too.

“They were trying to derail me, every other person was harassing me with comments like ‘I never knew you had such a stunning body’.

“I deleted my Facebook, I just couldn’t take it.

“But on Instagram, under every single one of my posts, the comments were filling with screenshots of the video.

“Then, the fanpage of the BJP’s leader shared the video and the whole thing snowballed.

“The video was shared 40,000 more times.

“It ended up on almost every phone in India.”

“It was devastating. I just couldn’t show my face.

“You can call yourself a journalist, you can call yourself a feminist but in that moment, I just couldn’t see through the humiliation.

“It had exposed me to a lynch mob in India. People were thinking they could now do whatever they wanted to me.

“The next day, they doxxed me.

“Another tweet was circulated on social media with a screenshot of the video and my number alongside, saying ‘Hi, this is my number and I’m available here’.

“People started sending me WhatsApp messages asking me for my rates for sex.

“I was sent to the hospital with heart palpitations and anxiety, the doctor gave me medicine.

“But I was vomiting, my blood pressure shot up, my body had reacted so violently to the stress.”

Ayyub’s ordeal is one felt by many women around the world, yet it is still only the tip of the iceberg, which is deeply alarming.

We wanted to gather the views of British Asian women on this issue.

Do they even know what a deepfake is? Are they worried? Do they care?

The Perspective of British Asian Women


To understand the implications for South Asian communities, we spoke to women across the UK about their thoughts.

Whilst the situation in South Asian countries is entirely different owing to levels of technology, resources and awareness, a place like the UK is seeing deepfakes rise tremendously.

28-year-old Sonia Patel from London told DESIblitz:

“I’m genuinely concerned about deepfakes; they’re like digital puppetry.

“Our lives are already a balancing act of cultural expectations.

“This just adds another layer, where men will find another way to demonise a woman for something she has no control over.”

Ayesha Bassi from Manchester stated:

“Deepfakes could easily distort our voices and actions.

“It’s not even just revenge porn, but women and men can use this to catfish people, do fraud, or even for their own sick pleasure.”

Likewise, Birmingham native Meera Joshi said:

“As a South Asian woman, privacy is a prized possession.

“The thought of someone manipulating my image or voice is unnerving. We need safeguards against this digital invasion.

“I’m not worried at this point but I guess you have to be in this sort of climate.”

Zara Ahmad*, a 27-year-old nurse from Edinburgh, expressed:

“I worry about the impact on dating.

“It’s already a minefield; deepfakes could turn it into a digital war zone where you question every interaction and image.

“Revenge porn has been around for decades and it is a sick fantasy that’s exploited by the Porn industry.

“So I fear that any law around this won’t make a difference.”

33-year-old Anika Kapoor from Bristol gave her thoughts:

“It’s not just about myself; it’s about the future.

“I’ve seen celebrity deepfakes and there was even that comedy show on ITV. When something is being joked about, it decreases the importance of control. 

“I can’t imagine having my face and mannerisms doing something dirty for people to view freely. It makes me sick.”

Sara Kaur from Oxford agrees, saying: 

“As a professional in tech, the idea that deepfakes could manipulate my image is distressing.

“I’m in a conflict because I love the industry that is accelerating the development of deepfakes and AI.

“So, if people are worried now, they have no idea how huge this could become. 

“It’s not just a personal concern; it’s a threat to our professional integrity.

“And for women who think they won’t be a target, think again.”

“There will come a time (hopefully not) when people can take a simple image and create a whole video using mannerisms, voice, movements etc.

“This will undoubtedly come with more revenge porn and deepfake porn cases. It’s even more worrying for the next generation who are brought up in a more tech/AI environment.

“They’ll pick up these skills instantly, like children did during the iPad era.”

Aisha Hemek from London similarly shared her concerns:

“Working in media, I’m excited about the possibilities AI brings, but the dark side of deepfakes is unnerving.

“It’s a wake-up call for us to be more discerning consumers and for the industry to adopt ethical guidelines to curb potential misuse.”

However, other British Asian women gave some interesting perspectives: 

25-year-old Rima Kang* from Birmingham revealed: 

“Deepfakes? Honestly, I’m still figuring out how to use Snapchat filters!

“AI is like a whole new language, but hey, as long as my cat ears and flower crowns stay cute, I’m not stressing about it.”

22-year-old Saira, also from Birmingham, told us: 

“I’m more worried about when my phone doesn’t unlock with my face than deepfakes. 

“I don’t think they’re that dangerous to us, only to celebrities.

“If someone is an expert in this field, they’re not going to target a nobody from Birmingham.”

Zeema Ahmed from London chimed in: 

“I know about ChatGPT but that’s about it.

“I think revenge porn in itself is an issue but deepfakes won’t impact normal brown women from the UK, that’s for sure.”

Zeema’s younger sister, 19-year-old Zainab, also told us her view: 

“I barely understand the TV remote half the time.

“Deepfakes? Is that, like, the distant cousin of fake IDs? AI can do its thing as long as I can still binge-watch my K-dramas without interruption.”

22-year-old Neha Khan from Leeds added:

“I honestly think girls my age are obsessed with TikTok and Insta more than anything else. I’m not too bothered about it to be honest.

“It also sounds like a man issue. I can’t imagine women using a man’s face to create a dirty video. 

“I reckon they need to control men first and this all goes away.”

Lastly, 23-year-old Meera Singh from Newcastle expressed:

“I’m still amazed by autocorrect, let alone deepfakes!

“I did see the stuff about Alia Bhatt and Kajol on Instagram, but just thought it was an issue in India. It seems like something more dangerous over there than here.

“There are pervs everywhere but this is an issue for porn industries.

“They glorify all sorts of things and it makes it okay. That’s how people get away with it. 

“But, nothing is gonna change or stop – either way, I’m not worried.”

It’s clear there is a split between awareness of deepfakes and the harm they could potentially cause in relation to revenge porn.

Whilst most of the British Asian women we spoke to are concerned by the issue, some don’t see deepfakes as a priority and think they aren’t directly affected.

Although this could be true, the technology surrounding deepfakes is growing rapidly.

With high-profile celebrities already being targeted, ordinary people could find it even harder to know whether they have been a victim.

So, there are a lot of factors to take on board. 

By fostering awareness, implementing robust legal frameworks, and leveraging technology responsibly, communities can safeguard their members from the insidious threats posed by deepfake technology.

If you or anyone you know is suffering from revenge porn, reach out for help.



Balraj is a spirited Creative Writing MA graduate. He loves open discussions and his passions are fitness, music, fashion, and poetry. One of his favourite quotes is “One day or day one. You decide.”

Images courtesy of Instagram.

*Names have been changed for anonymity.




