Japan Today

Deepfake porn: Why we need to make it a crime to create it, not just share it

By Clare McGlynn
Image: iStock/Cecilie_Arcurs

Deepfake pornography – where someone’s likeness is superimposed onto sexually explicit images with artificial intelligence – is alarmingly common. The most popular website dedicated to sexualized deepfakes, usually created and shared without consent, receives around 17 million hits a month. The content almost exclusively targets women. There has also been an exponential rise in “nudifying” apps that transform ordinary images of women and girls into nudes.

When Jodie, the subject of a new BBC Radio File on 4 documentary, received an anonymous email telling her she’d been deepfaked, she was devastated. Her sense of violation intensified when she found out the man responsible was someone who’d been a close friend for years. She was left with suicidal feelings, and several of her other female friends were also victims.

The horror confronting Jodie, her friends and other victims is not caused by unknown “perverts” on the internet, but by ordinary, everyday men and boys. Perpetrators of deepfake sexual abuse can be our friends, acquaintances, colleagues or classmates. Teenage girls around the world have realized that their classmates are using apps to transform their social media posts into nudes and sharing them in groups.

Having worked closely with victims and spoken to many young women, it is clear to me that deepfake porn is now an invisible threat pervading the lives of all women and girls. Deepfake pornography or nudifying ordinary images can happen to any of us, at any time. And, at least in the UK, there is nothing we can do to prevent it.

While UK laws criminalize sharing deepfake porn without consent, they do not cover its creation. The possibility of creation alone implants fear and threat into women’s lives.

Deepfake creation itself is a violation

This is why it’s time to consider criminalizing the creation of sexualized deepfakes without consent. In the House of Lords, Charlotte Owen described deepfake abuse as a “new frontier of violence against women” and called for creation to be criminalized.

It’s also a debate taking place around the world. The U.S. is considering federal legislation to give victims a right to sue for damages or injunctions in a civil court, following states such as Texas that have criminalized creation. Other jurisdictions such as the Netherlands and the Australian state of Victoria already criminalize the production of sexualized deepfakes without consent.

A common response to the idea of criminalizing the creation of deepfakes without consent is that deepfake pornography is a sexual fantasy, just like imagining it in your head. But it’s not – it is creating a digital file that could be shared online at any moment, deliberately or through malicious means such as hacking.

It’s also not clear why we should privilege men’s rights to sexual fantasy over the rights of women and girls to sexual integrity, autonomy and choice. This is non-consensual conduct of a sexual nature. Neither the porn performer nor the woman whose image is superimposed onto the porn has consented to their images, identities and sexualities being used in this way.

Creation may be about sexual fantasy, but it is also about power and control, and the humiliation of women. Men’s sense of sexual entitlement over women’s bodies pervades the internet chat rooms where sexualized deepfakes and tips for their creation are shared. As with all forms of image-based sexual abuse, deepfake porn is about telling women to get back in their box and to get off the internet.

Taking the law further

A law that only criminalizes the distribution of deepfake porn ignores the fact that the non-consensual creation of the material is itself a violation. Criminalizing production would aim to stop this practice at its root.

While there are legitimate concerns about over-criminalization of social problems, there is a worldwide under-criminalization of harms experienced by women, particularly online abuse.

And while criminal justice is not the only – or even the primary – solution to sexual violence due to continuing police and judicial failures, it is one redress option. Not all women want to report to police, but some do. We also need new civil powers to enable judges to order internet platforms and perpetrators to take down and delete imagery, and to require compensation be paid where appropriate.

As well as laying the foundation for education and cultural change, the criminal law can impose greater obligations on internet platforms. If creation of pornographic deepfakes were unlawful, it would be difficult for payment providers to continue to prop up the deepfake ecosystem, difficult for Google to continue returning deepfake porn sites at the top of searches and difficult for social media companies such as X or the app stores to continue to advertise nudify apps.

The reality of living with the invisible threat of deepfake sexual abuse is now dawning on women and girls. My women students are aghast when they realize that the student next to them could make deepfake porn of them, tell them they’ve done so and say they’re enjoying watching it – yet there’s nothing they can do about it, because it’s not unlawful.

With women sharing their deep despair that their futures are in the hands of the “unpredictable behavior” and “rash” decisions of men, it’s time for the law to address this threat.

Clare McGlynn is professor of Law, Durham University, England.

The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.

© The Conversation

©2024 GPlusMedia Inc.

5 Comments

So AI to throw us all out of work and run our battlefields is fine, but creating photos is a step too far!!!

2 ( +3 / -1 )

There is lots of malignant fakery that has become possible with AI. I do not know why this guy wants to single out this particular one.

-1 ( +2 / -3 )

“There is lots of malignant fakery that has become possible with AI. I do not know why this guy wants to single out this particular one.”

As the article helpfully explains, the problem is that people are trying to argue that creating deepfakes should not be considered a crime, only sharing the fakes. The arguments why this is not adequate make a lot of sense.

2 ( +5 / -3 )

A little like an argument trying to legitimize the ownership of weapons of mass destruction, by saying it isn't their ownership that is the problem, but their use.

2 ( +2 / -0 )

Ban all creators of all pornography, sales and possession of it to make the world a better and safer place for the decent people.

-4 ( +0 / -4 )
