
Fraudsters using AI to clone the voices of family and friends to beg for cash

Voice cloning scams are soaring, with millions set to be targeted by AI fraudsters. The scammers use the technology to replicate the voices of family or friends before begging for cash.

A government-backed report found as many as 28% of Brits have fallen prey to one of the futuristic swindles in the last 12 months.

Greedy con artists are making millions from the sly hustles because half of those surveyed have never even heard of such scams, let alone know how to protect themselves.




Scammers are using AI to replicate our family’s voices

Fraudsters can clone a voice from as little as three seconds of audio, which can easily be captured from a video someone has uploaded to social media.

The thieves then trawl profiles to identify relatives before calling them or sending voicemails or voice notes urgently asking for cash.

Banks are now encouraging people to create a safe word or phrase which only their relatives would know, so any urgent request for money can be verified.

Security expert Lisa Grahame, of Starling Bank, said: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters.



James Nesbitt is helping to spearhead the campaign

“Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a safe phrase to thwart them.

“Simply having a safe phrase in place with trusted friends and family – which you never share digitally – is a quick and easy way to ensure you can verify who is on the other end of the phone.”

Starling Bank recruited Northern Irish actor James Nesbitt to promote its safe phrases campaign, showing how easy it is to recreate even the most distinctive voices.



Don’t fall foul of the new AI voice scams

The Cold Feet star, 59, said: “To hear it cloned so accurately was a shock.

“You hear a lot about AI, but this experience has really opened my eyes to how advanced the technology has become, and how easy it is to be used for criminal activity if it falls into the wrong hands.

“I have children myself, and the thought of them being scammed in this way is really scary.”

Lord David Hanson, fraud minister at the Home Office, added: “AI presents incredible opportunities for industry, society and governments but we must stay alert to the dangers, including AI-enabled fraud.”