London24NEWS

Woman discovered horrific fake pics of herself having sex online – and the identity of who made them left her sickened

A woman was left horrified after discovering deepfaked ‘porn’ videos of herself having sex online, created by her best friend.

In 2019, Jodie – who goes by a pseudonym – began seeing her social media pictures pop up on dating websites including OkCupid and Happn, but tried to ignore it.

Then they appeared on Twitter alongside messages soliciting sex – before hundreds appeared on websites where men post ex-partners’ pics and encourage trolling.

Police were stumped as Jodie couldn’t pinpoint who was behind it, but a tip-off in March 2021 led her to grotesque AI-manipulated porn videos of herself, made from innocent snaps she had posted online. Revealing her ordeal to loved ones meant confronting what she calls “the ultimate violation”.

But then she realised the culprit was her friend Alex Woolf, because one of the pictures was unique and only he had a copy of it.

Jodie reported it to the Met Police but the creation, distribution and solicitation of deepfake images wasn’t – and still isn’t – considered to be a crime.



Woolf admitted 15 charges of sending messages that were grossly offensive

In August 2021, Woolf admitted 15 charges of sending messages that were grossly offensive or of an indecent, obscene or menacing nature over a public electronic communications network.

The derogatory comments accompanied pictures he uploaded to pornographic websites of the women, including Jodie, which had been taken from social media.

None of the pictures he took from social media were pornographic or indecent, but he asked users to photoshop his victims’ heads onto pornographic actresses’ bodies; the resulting images were then posted on adult websites.

Only Jodie’s images were deepfaked, while others were normal images shared alongside grossly offensive language – and it was the language, not the deepfaked pictures of Jodie, which led to his conviction.

“When I saw the AI-generated pictures and videos, I was terrified,” admitted Jodie, 26. “There were nine or ten pictures and videos of me being what I can only describe as raped, and anally penetrated.”



Horrifyingly, one altered image depicted her head on a schoolgirl’s body embroiled in a ‘student-teacher relationship’.

“To take my photo out of context and have it used like that – I think it’s everyone’s worst nightmare. It was the ultimate violation.”

“In my victim impact statement I told how it made me feel suicidal and it has made it difficult for me to trust anyone again.

“He was cowering in the corner when he was sentenced and he couldn’t even look at me when I spoke to him.”

In April this year, it was announced a new law would be introduced to crack down on deepfake image abuse. But then the Conservatives were voted out of government, leaving Jodie and other victims questioning the future of the planned bill.

Clare McGlynn, Professor of Law at Durham University, who supports the campaign, explained the current legal standpoint. She said: “The current law only makes it illegal to distribute or threaten to distribute intimate photos or videos of someone, including deepfake images, without their consent.”

“A vital creation offence was announced in April 2024 under the previous government, which aimed to criminalise the act of making these images in the first place, though it would have only covered certain cases of creation. However, when the general election was called, that commitment fell with the Criminal Justice Bill.”



“So far, the new Labour government has not made any commitment to reintroduce a creation offence, leaving a critical loophole in place.”

Jodie feels deepfake abuse is “the next iteration of violence against women and girls.”

She thinks the current situation allows “loopholes whereby perpetrators can get away with crimes without facing real repercussions or rehabilitation.”

She’s on a mission to get tougher penalties for those peddling and sharing deepfakes, and she wants the creators criminalised too. In September, Jodie kicked off a Change.org petition with The End Violence Against Women Coalition (EVAW), #NotYourPorn, Professor Clare McGlynn, and Glamour UK.

The petition states: “For too long the government’s approach to tackling image-based abuse has been piecemeal and ineffective.”

“This crisis demands more.”