Deepfake Pornography: A Call for Reform

E. Kaya, content creator at Rulta, 2/27/24

What is a Deepfake?

Deepfake technology employs sophisticated artificial intelligence (AI) algorithms to create convincing fake videos or audio recordings. By manipulating existing media, deepfakes can make it appear as though individuals are saying or doing things they never did. These manipulated videos are often created by swapping faces, voices, or both, resulting in content that is difficult to distinguish from reality.

What Are Deepfakes Used For?

  • Election manipulation
  • Social engineering
  • Identity theft
  • Scams and cyberattacks
  • Financial fraud
  • Nonconsensual pornography

The realism of deepfakes is a serious problem that needs to be addressed and regulated in a timely manner. One study demonstrated just how difficult it is to tell whether an image is AI-generated. Participants were shown a sequence of 100 images of human faces, 50 of them deepfakes and 50 real, and were asked to identify which were artificially generated. “Although participant accuracy was 62% overall, this accuracy across images ranged quite evenly between 85 and 30%, with an accuracy of below 50% for one in every five images. We interpret the findings as suggesting that there is a need for an urgent call to action to address this threat.” In simpler terms, overall detection accuracy was only 62%, meaning participants misjudged nearly four out of every ten images, and on one image in five they performed worse than chance.

What is Deepfake Porn?

Deepfake porn specifically involves the use of deepfake technology to create pornographic content. Typically, the faces of people are superimposed onto the bodies of adult film actors or actresses. The resulting videos can be highly realistic and may be shared widely on adult websites or social media platforms. Many sources suggest that deepfake pornography accounts for 96% of deepfakes on the internet.

One notable example of deepfake porn involves the creation of fake videos featuring celebrities engaging in sexual acts. These videos, often circulated without the consent of the individuals depicted, can have severe consequences for their reputations and personal lives. For instance, singer Taylor Swift fell victim to deepfake porn when fake nude photos of her began circulating online.

The proliferation of deepfake porn poses significant ethical and legal challenges. Victims of deepfake porn may experience emotional distress, reputational harm, and privacy violations. Moreover, the spread of such content contributes to the normalization of revenge porn and cyber exploitation.

As deepfake technology continues to evolve, it is essential to take proactive measures to combat its harmful effects. While legislation and technology are being developed to address deepfake threats, individuals can also take steps to protect themselves. This includes utilizing the Digital Millennium Copyright Act (DMCA). It is important to note, however, that in some cases of deepfake porn, DMCA law may not be applicable.

Are Deepfakes Illegal?

In recent years, the rise of deepfake technology has brought about significant legal and ethical challenges, particularly in the realm of deepfake pornography. This technology, which uses artificial intelligence to create or alter video content in a way that makes it convincingly real, has been used maliciously to produce non-consensual pornographic material. The legality of deepfakes, especially those used for creating explicit content without consent, varies by jurisdiction, but there is a growing body of laws aimed at combating this issue.

In the United States, the creation and distribution of deepfake pornography are addressed under both federal and state laws. The DMCA provides a mechanism for victims to request the takedown of infringing material, which can cover deepfakes built from photos or videos the victim holds the copyright to. However, the DMCA's effectiveness is limited in cases where the victim does not own the copyright to all elements of the deepfake content.

Additionally, some states have enacted laws specifically targeting deepfake pornography. For example, Virginia expanded its revenge porn laws to include deepfakes, making it illegal to distribute sexually explicit images or videos that falsely depict someone without their consent.

A California state law that came into effect on January 1, 2020 banned the manufacturing and distribution of synthetic pornography without the consent of the people depicted. Under this law, victims of deepfake pornography can seek injunctive relief, and perpetrators face statutory and punitive damages. While this law focuses on civil actions, other existing criminal laws could be invoked to pursue criminal charges against individuals involved in the creation or distribution of non-consensual deepfake pornography.

The applicability of criminal penalties would depend on the specific circumstances of each case and the presence of laws that criminalize such behavior beyond the scope of this law. For example, the creation or distribution of deepfake pornography could potentially fall under other criminal statutes related to harassment, stalking, or specific laws against non-consensual pornography, sometimes referred to as "revenge porn" laws. 

Another example is the “Preventing Deepfakes of Intimate Images Act”, a federal bill reintroduced by a New York congressman in January 2024. It “seeks to prohibit the non-consensual disclosure of digitally altered intimate images.” If enacted, the bill would criminalize the distribution of nonconsensual deepfake pornography, with perpetrators risking up to $150,000 in civil damages and up to 10 years of imprisonment.

Legislators have been forced to address deepfake pornography, especially as children have increasingly become victims of this crime in recent years. In one horrifying case, a 14-year-old student at a New Jersey high school became a victim of deepfake pornography, signaling the urgent need for action.

It is important to note that there is no federal law which criminalizes the creation and distribution of deepfake porn. U.S. lawmakers need to do better.

In the United Kingdom, the sharing of private sexual images without consent is illegal under the Criminal Justice and Courts Act 2015. Furthermore, in January 2024, British legislators announced amendments to this Act to better protect victims of deepfakes and keep pace with evolving technology, making it illegal to share AI-generated pornography without consent. However, the changes will not come into effect until 2027, yet another setback in the legal system for deepfake victims.

The European Union has taken steps to protect individuals from deepfake pornography through the General Data Protection Regulation (GDPR), which gives individuals the right to have their personal data erased. This right can be invoked to combat deepfake content, although enforcement can be challenging across international borders.

In Canada, the non-consensual distribution of intimate images is illegal under the Criminal Code. Canadian law also allows victims to pursue civil remedies to have the content removed and to seek damages. However, specific legislation targeting deepfakes is still developing.

Despite these legal frameworks, many victims of deepfake pornography find that the laws are not sufficient to protect them fully. The anonymous nature of the internet, jurisdictional challenges, and the rapid advancement of deepfake technology often outpace legal responses. Victims face significant hurdles in having content removed and in holding perpetrators accountable. Law enforcement agencies and social media platforms are urged to improve their responses to deepfake complaints, ensuring quicker action is taken to remove content and support victims.

International corporations also have a responsibility to step in. For example, megacorporation Google continues to list websites that are clearly dedicated to nonconsensual deepfake pornography. This raises the question: why isn't Google removing them from its index? Is Google doing its part in reducing the discoverability of websites created for the sole purpose of humiliating others, especially women?

There is a clear need for better legislation that specifically addresses the unique challenges posed by deepfake technology. This includes laws that can cross international boundaries and more robust mechanisms for identifying and penalizing creators and distributors of non-consensual deepfake pornography. As technology continues to evolve, so too must our legal and social responses to protect individuals from these deeply invasive and harmful practices.

What Deepfakes Mean for Women

Deepfakes have become a serious problem, especially for women in the spotlight, who can find themselves in fake videos made without their permission, causing enormous stress and damage to their personal and professional lives. According to Declarations, a human rights podcast produced by the Centre of Governance and Human Rights at the University of Cambridge, deepfake pornography makes up 96% of all deepfakes, and these videos almost exclusively target women. It is therefore fair to describe deepfake pornography as gender-based, political violence.

Deepfakes can cause a lot of harm, especially when they're used for something called image-based sexual violence, or what some people might know as revenge porn. 

Most of the time, women are the ones who suffer from this kind of online abuse. Studies show that between 60% and 95% of revenge porn victims are women, and men are usually the ones who share these images without permission. According to an article published by the Inter-American Development Bank (www.iadb.org), victims of deepfakes experience anxiety, feelings of shame, anger, humiliation, and stigma. “The violation of their privacy and the spread of manipulated images can profoundly affect their mental health and, consequently, their quality of life, and even cause trauma. Additionally, it can have repercussions on the personal relationships and professional opportunities of affected women.”

The harm isn't just about jobs or reputation. It's also about feeling like you've lost control, having your privacy violated, and not being able to trust others. Interestingly, studies have found that men who go through similar experiences tend to feel less ashamed and blame themselves less than women do. When it comes to getting help, women often face more challenges. Some reports say that men get better responses from the police when they report these crimes. This might reflect a bigger issue in society where people blame women for the abuse they face, instead of supporting them.

It's clear that we need to do more to fight against deepfakes and image-based sexual violence. Online platforms and law enforcement must step up their game to protect victims and stop this kind of abuse from spreading. Everyone deserves to feel safe and respected, both online and off.

Deepfakes: The Legal Landscape

Despite the existence of laws aimed at protecting individuals from digital abuse, the harsh reality is that these legal frameworks often fall short when it comes to protecting women targeted by deepfake pornography. To date, only a handful of perpetrators of deepfake pornography have been imprisoned.

Deepfake technology, which manipulates videos and images to create realistic but entirely fabricated content, has become a tool for perpetrating a new form of sexual violence against women. While some legislation exists to combat this issue, the rapid advancement of technology and the global nature of the internet present significant challenges to effective enforcement.

Current laws, including those related to copyright, privacy, and cyber harassment, offer some avenues for recourse, but they are not specifically tailored to address the unique challenges posed by deepfakes. For instance, copyright laws can protect against unauthorized use of one's image, but they require the victim to prove ownership of the original content, a process that can be both complex and inadequate in the context of deepfakes. Privacy laws, on the other hand, vary widely by jurisdiction, making it difficult to pursue cases that cross international borders.

Moreover, the legal process can be slow and cumbersome, failing to provide timely relief to victims. By the time legal action is taken, the damage to a woman's reputation, mental health, and personal life can be irreversible. The anonymity afforded to perpetrators on the internet further complicates efforts to hold individuals accountable, allowing them to hide behind a veil of digital obscurity.

Legislators need to recognize the severity of the harm caused by deepfake pornography and take decisive action to strengthen legal protections for victims. This includes enacting laws that specifically address the creation and distribution of deepfake content, simplifying the process for victims to take legal action, and enhancing international cooperation to tackle this global issue. Additionally, there is a pressing need for legal measures that focus on prevention and education, helping to deter potential perpetrators and raise awareness among the public about the consequences of creating and sharing deepfake pornography.

The rise of deepfake technology calls for a proactive and comprehensive legal response. Legislators must step up and do better to protect women from the devastating impact of deepfake pornography. Without significant legal reforms and a commitment to enforcing these laws, women will continue to be vulnerable to this insidious form of digital violence. It's time for the legal system to catch up with technology and provide real protection for those targeted by deepfakes.
