It’s 2023 and Women Should Not Be the Target of Deepfake Content

AI, Deepfake, AI content, deepfake content, Artificial intelligence, Intelligent tech

The problem of deepfake pornographic content surfacing through Google’s and Microsoft’s search engines has reached a new peak. 

Since deepfakes emerged more than half a decade ago, the technology has been persistently abused against women, with machine learning used to graft a person’s face into pornography without her permission. The amount of nonconsensual deepfake content is accelerating at an exponential rate, as advancements in AI technology fuel the expansion of the deepfake ecosystem. 

The problem with deepfake content is that no effective legal remedies exist. Big tech companies are looking to apply such remedies in order to eliminate deepfake images. 

But how is that possible?  

“Updating laws on image-based abuse to account for the realities of deepfake technology. Clear legal deterrents are needed. Groups like the Cyber Civil Rights Initiative have good recommendations,” Gaurav Pundir, Founder and CEO of InGogu Digital Media, told Inside Telecom. 

In recent years, a new industry of deepfake abuse has evolved, one that primarily targets women and manufactures content without their consent or awareness. Face-swapping applications that use still photographs, as well as apps that allow clothes to be “stripped off” a person in a shot with a few clicks, are also popular. 

These programs are likely used to produce millions of photos.

Who Is the Target of Deepfakes?

Sensity AI, a research firm that has been tracking deepfake videos online since December 2018, has consistently found that 90% to 95% of them are nonconsensual pornographic content. Approximately 90% of that content targets women. 

“This is a violence-against-women issue,” says Adam Dodge, founder of EndTAB, a non-profit that educates people about technology-enabled abuse.

Pundir also highlighted “responsibility for hosting non-consensual deepfake content on platforms. We require clear takedown guidelines and their enforcement. However, those targeted should not be responsible for paying for deletions. Putting affected women’s voices front and center in policy discussions. These power abuses are enabled by structural misogyny. Legislators and IT corporations need to recognize the urgent insights that those targeted have to offer.”

Research and Findings

Recent research published in the journal Computers in Human Behavior provides evidence that psychopathic personality traits are correlated with the creation and dissemination of deepfake pornography.

“Although practitioners and lawmakers within the United Kingdom have recently made great strides to make behaviors associated with image-based sexual abuse illegal, such as ‘revenge pornography’ and ‘upskirting,’ a lot of work should be done in order to address equally detrimental and destructive habits, such as the manufacturing of deep-fake sexual media,” stated study author Dean Fido, a senior lecturer in forensic psychology at the University of Derby.

The researchers randomly assigned 290 UK volunteers to read one of four vignettes describing a deepfake incident in which an individual generated and shared a fake sexualized image of another person after being unable to engage them in a physical relationship. The victim was characterized as either a man or a woman, and as either a celebrity or an average person.

The researchers discovered that participants who scored higher on a psychopathy scale were more likely to believe the victim was at fault for the incident, less likely to regard the scenario as damaging, and less likely to believe the incident was criminal in nature. More psychopathic participants were also more willing to produce and spread deepfakes.

The researchers replicated their findings in a second study with 364 UK participants. They also wanted to see whether deepfake images created solely for personal use were regarded differently than images that were shared. Participants reported greater victim suffering for both celebrity and non-celebrity victims when the photographs were disseminated than when they were made for personal use only.

Take-Home Message

Women are perceived as the more sensitive gender, while men, on the other hand, are assumed to ‘absorb more’ in tougher situations. According to several studies, women perceive deepfake images as harmful, threatening, and, in severe cases, criminal acts. Men, on the other hand, tend not to perceive them as harmful unless they are attracted to the woman depicted. 

Studies have shown that this is an issue that must come to an end in the new age of technology. Psychiatrists emphasize that mental health is a red line that no one should be allowed to cross. We must work on finding solutions before we lose more women to suicide and other mental health problems. 

To close, I will add a personal statement. As a woman, I stand tall and say we have the right to post our pictures without living in constant fear of what might happen next. PERIOD!


Inside Telecom provides you with an extensive list of content covering all aspects of the tech industry. Keep an eye on our Intelligent Tech sections to stay informed and up-to-date with our daily articles.