With AI, anyone can be a victim of nonconsensual porn. Can laws keep up?

States around the country are scrambling to respond to the dramatic rise in deepfakes, a result of little regulation and easy-to-use apps.

This story was originally published by The 19th.

More than two dozen students at Westfield High School in New Jersey were horrified last year to learn that naked images of them were circulating among their peers. According to the school, some students had used artificial intelligence (AI) to create pornographic images of others from original photos. And they’re not the only teenage girls being victimized by fake nude photos: Students in Washington State and Canada have also reported facing similar situations as the ability to realistically alter photos becomes more broadly accessible through websites and apps.

The growing alarm around deepfakes — AI-generated images or videos — was amplified further in January, when one involving the superstar Taylor Swift spread quickly through social media.

Carrie Goldberg, a lawyer who has been representing victims of nonconsensual porn — commonly referred to as revenge porn — for more than a decade, said she only started hearing from victims of computer-generated images more recently. 

“My firm has been seeing victims of deepfakes for probably about five years now, and it’s mostly been celebrities,” Goldberg said. “Now, it’s becoming children doing it to children to be mean. It’s probably really underreported because victims might not know that there’s legal recourse, and it’s not entirely clear in all cases whether there is.” 

Governing bodies are trying to catch up. In the past year or so, 10 states have passed legislation specifically criminalizing the creation or dissemination of deepfakes. These states — California, Florida, Georgia, Hawaii, Illinois, Minnesota, New York, South Dakota, Texas and Virginia — outlined penalties ranging from fines to jail time. Indiana is likely to soon join the growing list by expanding its current law on nonconsensual porn.

Indiana Rep. Sharon Negele, a Republican, authored the proposed expansion. The existing law defines “revenge porn” as disclosing an intimate image (such as one depicting sexual intercourse, uncovered genitals, buttocks or a woman’s breast) without the consent of the individual depicted. Negele’s proposed bill passed through both chambers and is now awaiting the governor’s signature.

Negele said she was motivated to update Indiana’s criminal code when she heard the story of a high school teacher who discovered that some of her students had disseminated deepfake images of her. It was “incredibly destructive” to the teacher’s personal life, and Negele was surprised to see that the perpetrators could not be prosecuted under current law. 

“It started with my education of understanding the technology that is now available and reading about incident after incident of people’s faces being attached to a made-up body that looks incredibly real and realistic,” Negele said. “It’s just distressing. Being a mom and a grandmother and thinking about what could happen to my family and myself — it’s shocking. We’ve got to get ahead of this kind of stuff.”

Goldberg, whose law firm specializes in sex crimes, said she anticipates more states will continue expanding their existing legislation to include AI language. 

“Ten years ago, only three states had revenge porn or image-based sexual abuse laws,” Goldberg said. “Now, 48 states have outlawed revenge porn, and it has really created a tremendous reduction in revenge porn — not surprisingly — just as we advocates had said it would. The whole rise of deepfakes has filled in the gaps as being a new way to sexually humiliate somebody.”

In 2023, more than 143,000 new AI-generated videos were posted online, according to The Associated Press. That’s a huge jump from 2019, when “nudify” websites and apps were less commonplace yet nearly 15,000 of these fake videos were already online, according to a report from Deeptrace Labs, a visual threat intelligence company. Even back then, those videos — 96 percent of which contained nonconsensual pornography of women — had garnered over 100 million views.

Goldberg said policymakers and the public alike seem to be more motivated to ban AI-generated nude images specifically because virtually anyone can be a victim. There’s more empathy.

“With revenge porn, in the first wave of discussions, everyone was blaming the victim and making them seem like they were some sort of pervert for taking the image or stupid for sharing it with another person,” Goldberg said. “With deepfakes, you can’t really blame the victim because the only thing they did was have a body.” 

Amanda Manyame, a South Africa-based digital rights advisor for Equality Now, an international human rights organization focused on helping women and girls, said that there are virtually no protections for victims of deepfakes in the United States. Manyame studies policies and laws around the world, analyzes what’s working and provides legal advice around digital rights, particularly on tech-facilitated sexual exploitation and abuse.

“The biggest gap is that the U.S. doesn’t have federal law,” Manyame said. “The challenge is that the issue is governed state by state, and naturally, there’s no uniformity or coordination when it comes to protections.” 

There is, however, currently a push on Capitol Hill: In January, a bipartisan group of senators introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 — also known as the DEFIANCE Act. The proposed legislation aims to stop the proliferation of nonconsensual, sexually explicit content.

“Nobody — neither celebrities nor ordinary Americans — should ever have to find themselves featured in AI pornography,” Republican Sen. Josh Hawley, a co-sponsor of the bill, said in a statement. “Innocent people have a right to defend their reputations and hold perpetrators accountable in court.” Democratic Rep. Alexandria Ocasio-Cortez has introduced a companion bill in the House.

According to new polling from Data for Progress, 85 percent of likely voters across the political spectrum said they support the proposed DEFIANCE Act — with 72 percent of women in strong support compared to 62 percent of men. 

But younger men are more likely to oppose the DEFIANCE Act, with about one in five men under 45 — 22 percent — saying they strongly or somewhat oppose legislation allowing subjects of explicit nonconsensual deepfakes to sue the creator.

Danielle Deiseroth, executive director of Data for Progress, said this issue showed one of the “more sharp contrasts” between young men and women that she’s seen in a while.

“We can confidently say that women and men under 45 have diverging opinions on this policy,” Deiseroth said. “This is an issue that disproportionately impacts women, especially young women, who are more likely to be victims of revenge porn. And I think that’s really the root cause here.” 

Goldberg said that creating policies to criminalize bad actors is a good start but ultimately insufficient. A good next step, she said, would be legal action targeting the online distributors, like the App Store and Google Play, that provide products primarily used for criminal activity. Social media platforms and instant messaging apps, where these explicit images are distributed, should also be held accountable, Goldberg added.

The founders of #MyImageMyChoice, a grassroots organization working to help victims of intimate image abuse, agreed that more should be done by private companies involved in the creation and distribution of these images. 

The founders — Sophie Compton, Reuben Hamlyn and Elizabeth Woodward — pointed out that search engines like Google drive most of the web traffic to deepfake porn sites, while credit card companies process their payments. Internet service providers let people access the sites, and major services like Amazon, Cloudflare and Microsoft’s GitHub host them. Social media sites like X allow the content to circulate at scale. Google changed its policy in 2015 to let victims request the removal of individual pieces of content from search results, and it has since expanded the policy to cover deepfake abuse. However, the company does not systematically delist image-based sexual violence and deepfake abuse sites.

“Tech companies have the power to block, de-index or refuse service to these sites — sites whose entire existence is built on violating consent and profiting from trauma,” Compton, Hamlyn and Woodward said in a statement to The 19th. “But they have chosen not to.” 

Goldberg pointed to the speed at which the Taylor Swift deepfakes spread. One image shared on X, formerly known as Twitter, was viewed 47 million times before the account that posted it was suspended. Images continued to spread despite efforts from the social media companies to remove them.

“The violent, misogynistic pictures of Taylor Swift, bloody and naked at a Kansas City Chiefs football game, is emblematic of the problem,” Goldberg said. “The extent of that distribution, including on really mainstream sites, sends a message to everybody that it’s okay to create this content. To me, that was a really pivotal and quite frightening moment.” 

Given the high-profile nature of the victim, the incident sparked pronounced and widespread outrage from Swift’s fans and brought public attention to the issue. Goldberg said she checked to see whether any of the online distributors had removed products from their online stores that make it easier and cheaper to create sexually explicit deepfakes — and she was relieved to see they had.

As the country’s policymakers and courts continue trying to respond to the quickly developing and increasingly accessible technology, Goldberg said it’s important that lawmakers keep deferring to experts and those who work directly with victims, such as lawyers, social workers and advocates. Otherwise, she added, lawmakers’ attempts to regulate abstract ideas or rapidly advancing technologies can become a “recipe for disaster.”

Manyame also emphasized the importance of speaking directly to survivors when making policy decisions, but added that lawmakers need to think more holistically about the problem and not get too bogged down in the specific technology — at the risk of always being behind. For example, Manyame said the general public is only now beginning to understand the risks posed by AI and deepfakes — something she helped write a report about back in 2021. Looking ahead, Manyame is already thinking about the metaverse, a virtual reality space where users are starting to reckon with instances of rape, sexual harassment and abuse.

“A lot of the laws around image-based sexual abuse are a little bit dated because they speak about revenge porn specifically,” Manyame said. “Revenge porn has historically been more of a domestic violence issue, in that it is an intimate partner sharing a sexually exploitative image of their former or existing partner. That’s not always the case with deepfakes, so these laws might not provide enough protections.” 

In addition, Manyame argued that many of these policies fail to broaden the definition of “intimate image” to account for diverse cultural or religious backgrounds. For some Muslim women, for instance, it might be just as violating and humiliating to have images created and disseminated of them with their heads uncovered, without a hijab.

When it comes to solutions, Manyame pointed to actions that can be taken by the app creators, platform regulators and lawmakers. 

At the design phase, more safety measures can be embedded to limit harm. For example, Manyame said there are apps that can take photos of women and automatically remove their clothing, while the same function doesn’t work on photos of men. There are ways, on the back end of these apps, to make it harder to remove clothes from photos of anyone, regardless of gender.

Once the nefarious deepfakes are already created and posted, however, Manyame said the social media and messaging platforms should have better mechanisms in place to remove the content after victims report it. Many times, individual victims are ignored. Manyame said she’s noticed these large social media companies are more likely to remove these deepfakes in countries, such as Australia, that have third-party regulators to advocate on behalf of victims.

“There needs to be monitoring and enforcement mechanisms included in any solution,” Manyame said. “One of the things that we hear from a lot of survivors is they just want their image to be taken down. It’s not even about going through a legal process. They just want that content gone.” 

Manyame said that isn’t too big an ask for tech companies and government regulators, because many already respond quickly to remove inappropriate photos involving children. It’s just a matter of extending those kinds of protections to women, she added.

“My concern is that there’s been a rush to implement AI laws and policies without considering what some of the root causes of these harms are,” Manyame said. “It’s a layered problem, and there’s many other layers that need to be tackled.”
