I admit that I had never heard of Jenna Ortega before I read that a shady app called PerkyAI was using a deepfaked nude photo of the actress, who was underage at the time the real photo was taken, to promote its AI-generated pornography. This trend is disturbing for celebrities such as Taylor Swift, who have had to endure seeing fake pornographic images of themselves, and deepfake porn has the potential to do a lot more harm to a lot of people who want to stay anonymous, including children.
You might ask: The pictures are faked, so no one was actually nude and therefore no one was actually harmed, right? Wrong.
I have no qualms about adults consensually creating pornography that is viewed by other willing adults. A lot of porn isn’t made by consenting adults, but issues surrounding the porn industry and human trafficking are for another post - here I just want to make clear that I take no issue with legal pornography, in which the performers are adults, either paid or happily volunteering. All good, adult fun. Enjoy.
Deepfake porn is not that.
Techtarget.com defines “deepfake” as “a type of artificial intelligence used to create convincing images, audio and video hoaxes. The term describes both the technology and the resulting bogus content, and is a portmanteau of deep learning and fake.” You may have read stories about people using this technology to create fake audio of President Biden to discourage voting in New Hampshire’s primary election, and I expect that the 2024 election will see a lot more of this kind of horseshit.
The harm in the case of the fake Biden call is obvious: suppressed voter turnout, which can undermine democracy and alter the outcome of an election. Deepfaked pornography may not have that kind of global impact, but it can hurt individual people in obvious and less obvious ways.
The most immediate harm from deepfaked pornography is that it will be seen by people who do not know it’s fake. Will those people be interviewing the victim for a job in the future? What about a future partner? A parent or grandparent? The humiliation is obvious, and with it comes potential loss of income.
Deepfake porn is also a potential end-run around laws banning so-called “revenge porn,” more appropriately referred to as image-based sexual abuse (IBSA). If platforms and/or the law continue to allow the creation and dissemination of deepfake porn, the assholes who make it may escape prosecution under IBSA statutes because they aren’t using “real” content. New York State’s statute against IBSA, for example, criminalizes a person who “intentionally disseminates or publishes a still or video image of such other person, who is identifiable from the still or video image itself or from information displayed in connection with the still or video image, without such other person's consent…”
The key words here are “other person”: if the image is deepfaked, the nude body it depicts arguably isn’t an image of that “other person” at all, and that gap could impede prosecution even though the harms are virtually the same.
Importantly, individual victims may experience the harms of deepfake porn as even worse than those of real images. In the past, a person victimized by IBSA at least knew the photos were real and probably knew the person who took them; in many cases the photos were taken consensually within a loving relationship that later ended. That kind of dissemination is bad enough, but with deepfaked pornography the harm can come from anywhere. Literally anyone who knows you, or who doesn’t know you but found a photo of you online, could create pornographic images of you.
The threat of anyone, at any time, producing fake pornography of you must be harrowing for celebrities. I imagine it feels like being trapped in an un-funhouse, where naked pictures of you could be anywhere and everywhere, with anonymous strangers profiting off them, as on the disgusting website MrDeepfakes (to which I will not link). People are already doing it.
I don’t think it is in any sense acceptable to make deepfake porn of celebrities, but celebrities do, in general, give up a measure of personal privacy in exchange for their fame and fortune. Anonymous people do not, yet there are billions of photos of anonymous people on the internet, and any one of these could be turned into deepfake pornography. The threat of that happening is terrifying. The reality of it will be traumatizing for victims—and who knows how they will respond to the trauma? Therapy will be needed. Paranoia could result. Agoraphobia? Cameraphobia? Or, worst of all, suicide.
The harms of deepfake pornography are real and tangible. Policymakers absolutely must get ahead of this problem before we lose control of it because, as we all know, once images are on the Internet, it’s impossible to get rid of them.