Labour will ban the creation of sexually explicit deepfake images this year, a Home Office minister said yesterday.
Safeguarding minister Jess Phillips promised action on the issue, despite the Government's failure to back a Tory bid to outlaw the vile practice last month.
Ms Phillips said she had herself been a victim of the trend, in which fake explicit images are created and shared without consent.
‘The Government fully intends to legislate in this session on the use of deepfakes,’ she said.
‘My feeling comes from the fact that I am a victim of this crime. And so I understand the violation on a personal level.’
Tory peer Baroness Owen brought a private member's bill on the issue to the Lords last month. But ministers did not back it, leaving it with no chance of becoming law.
It comes despite an open letter signed by 400 AI experts, celebrities and politicians demanding lawmakers back a blanket ban on deepfake technology.
The letter argued that AI-generated videos are a threat to society because of their use in sexually explicit imagery, child pornography, fraud and political disinformation.
Last year, sexually explicit deepfake images of Taylor Swift went viral on social media, fuelling calls for stricter regulation of the technology.
Another target of the technology was Harry Potter star Emma Watson, who featured in a fake social media advert in which she appeared to engage in a sexual act.
In 2023, deepfake videos received a reported 34 million views, with women accounting for 99 per cent of those targeted.