- 28% of people have been targeted by this type of scam in the last 12 months
A sinister voice cloning scam is on the rise and millions of people could be at risk of getting caught out.
Fraudsters are using AI technology to replicate the voices of their unsuspecting victims' friends or family members, then using the cloned voices to extort money.
More than a quarter of people say they have been targeted by an AI voice cloning scam at least once in the past year, a Starling Bank survey of 3,000 people shows.
Yet nearly half of people have never heard of this type of scam.
New scam alert: Fraudsters are using AI voice cloning techniques to trick their victims into sending money
Financial fraud across the board is on the rise, with the average UK adult targeted by a scam five times in the past 12 months.
But AI is giving criminals new ways to target people for money.
What is an AI voice cloning scam and how does it work?
An AI voice cloning scam is a sophisticated fraud in which criminals use voice cloning technology to replicate a person's voice from a short audio clip.
Fraudsters can capture audio and create a deepfake cheaply and easily online in just a few minutes.
Deepfakes are audio clips, videos and photographs that mimic a real person.
The audio used in AI voice cloning scams can easily be captured from a video someone has uploaded online or posted to social media, or even from a voicemail message.
Lisa Grahame, chief information security officer at Starling Bank, said: ‘Scammers only need three seconds of audio to clone your voice.’
Scammers can then identify the victim's family members and use the cloned voice to stage a phone call, voice message or voicemail, asking for money that is needed urgently, for example after an accident or to pay rent.
Starling Bank found that nearly one in 10 people say they would send whatever the caller asked for in this situation, even if they thought the call seemed strange, meaning millions of people could be at risk of falling victim to this scam.
How to spot and avoid AI voice cloning scams
Only three in 10 people say they would know what to look out for if they were being targeted with a voice cloning scam.
Voice cloning can be highly convincing because it uses the voice of someone familiar to you, but scammers can manipulate the cloned voice to say whatever they want.
If someone on the phone claims to be a friend or family member who urgently needs money, listen out for unnatural pauses or robotic-sounding speech.
One of the main ways to avoid falling for this scam is to agree a password with your family that is easy to remember but is not public information.
That way, when a family member calls you with an urgent request for money you can ask them for the password. If they can’t give it to you, then it could be a warning that a scammer is on the end of the line.
In response to the rise of AI voice cloning scams, Starling Bank has launched a Safe Phrases campaign, in support of the government’s fraud campaign.
It is encouraging people to agree a 'safe phrase' or password with their close friends and family that no one else knows, so they can verify who they are really speaking to when a caller asks for money to be sent.
Lisa Grahame adds: ‘People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters.
‘It would only take a few minutes with your family and friends to create a safe phrase to thwart them.
‘So it’s more important than ever for people to be aware of these types of scams perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.
‘It is a quick and easy way to ensure you can verify who is on the other end of the phone.’