BARONESS OWEN: Why we should finish the scourge of deepfake pornography now

Imagine discovering that your image has been inserted into a porn film. The appalling footage is being watched over and over by a person you know. They have the ability to upload it to the internet or share it at any moment.

You feel physically sick. How has this happened? Your image has been taken and manipulated using AI technology without your consent. It’s not really you on screen – but it’s horrifyingly realistic.

This is the disgusting world of ‘deepfake’ pornography. An image of a real woman’s face is taken and made into explicit porn in a matter of seconds using digital trickery.

Let me be clear: this is sexual abuse. It should be a crime – but, shockingly, it is not.

That is why I am introducing a Bill today to the House of Lords to outlaw this disturbing practice.

Baroness Owen (pictured) said of deepfake pornography: ‘This is sexual abuse. It should be a crime – but, shockingly, it is not’

File photo. ‘No woman is safe from the technology used to create this deepfake porn’

No woman is safe from the technology used to create this deepfake porn, which is growing increasingly common thanks to ‘nudify’ apps and a large number of online platforms that create deepfake content.

Analysis from campaign group MyImageMyChoice has found that 80 per cent of deepfake websites and apps have launched in the past 12 months alone, with the largest receiving an average of 13.4 million hits a month.

The Government’s power to enact legislation cannot keep pace with such growth, meaning that laws relating to the making of obscene images are woefully inadequate.

Last year, the Online Safety Act made it illegal to share intimate images online without consent, but it left glaring omissions, which means that creating sexually explicit deepfakes remains legal.

It is time to act. No woman should see her image used in porn without her consent, nor should she fear that it might be.

My Private Member’s Bill will make it illegal to take, create or solicit non-consensual explicit images and videos, and bring an end to this abuse. The Bill has been inspired by a real woman we call Jodie, who has bravely told me of her experience of being deepfaked by someone she believed to be her best friend.

Jodie discovered that some of her Instagram images had been ‘nudified’ and turned into pornography, then posted on online forums such as Reddit.

‘No woman is safe from the technology used to create this deepfake porn, which is growing increasingly common thanks to ‘nudify’ apps’, Baroness Owen writes (File image) 

‘Women are sick and tired of living in fear of this appalling abuse’, the Baroness said (pic 2023)

For five years she endured this abuse, finding hundreds of X-rated pictures and videos of herself, her friends and other young women. The pornographic content was often accompanied by degrading comments in which viewers described what they would like to do to ‘little Jodie’, encouraging each other to rate her body.

Due to the lack of regulation, Jodie struggled to bring the perpetrator to justice. She eventually found recourse in the Communications Act, under which the suspect was charged with sending messages that were grossly offensive or of an indecent, obscene or menacing nature. However, no charges were brought for the creation of the deepfake pornography.

Because of Jodie’s experience, my legislation includes the key offence of ‘soliciting creation’.

Creating the vile image or video is bad enough – but it is equally important that anyone who asks another person to do so is also held to account. By making it unequivocal in law that both creating and soliciting these images and videos without consent is a crime, the path to justice will be clearer for victims like Jodie.

The Bill will also close a loophole around another deeply disturbing trend in deepfake pornography.

Photos of women’s faces, again innocuously uploaded to the internet, are posted to online forums where users adapt the images (File image) 

A victim who finds that a picture of her face has been used in this way has to pursue the perpetrator through communications and harassment legislation (File image) 

Photos of women’s faces, again innocuously uploaded to the internet, are posted to online forums where users adapt the images in ways too revolting to describe here. 

Under current legislation, as the original picture is not an ‘intimate image’, this is not included in the Sexual Offences Act. 

A victim who finds that a picture of her face has been used in this way has to pursue the perpetrator through communications and harassment legislation.

My Bill will bring redress to victims in this ever-evolving area. It not only seeks to prevent virtual violations, but also endeavours to make their future forms punishable.

Women are sick and tired of living in fear of this appalling abuse. With tens of thousands of new pieces of pornographic content created each week, they cannot afford to wait any longer.