UK online bank Starling Bank warns that with just three seconds of audio from a video posted online, scammers can use AI to copy the victim's voice to make fraudulent calls.
Millions of people could be scammed by people using artificial intelligence (AI) technology to clone their voices, UK online bank Starling Bank has said.
The bank warned that with just three seconds of audio from a video posted on social media, scammers could use AI to create a copy of a victim’s voice. They could then use the cloned voice to call the victim’s friends and family and ask them to transfer money.
A survey of 3,000 people conducted by Starling Bank and Mortar Research in August found that more than a quarter of respondents said they had been targeted by an AI voice-spoofing scam in the past 12 months.
The survey also revealed that 46% of respondents were unaware of such scams, and 8% would transfer money at the request of a friend or family member even if they doubted the authenticity of the call.
“People regularly post content online that includes their own voice recordings without ever thinking that it could make them an easy target for scammers,” said Lisa Grahame, chief information security officer at Starling Bank.
Starling Bank recommends that customers set up a “safe phrase” with their loved ones – a simple, random, memorable phrase that is different from their passwords – to verify their identity over the phone.
The bank also advises customers not to share the safe phrase via text message, as it could be seen by a bad actor. If they do share it this way, they should delete the message immediately after it has been read.
Earlier this year, Australia's NAB bank included AI voice-spoofing scams in its list of the most dangerous scams Australians should watch out for in 2024.
Associate Professor Toby Murray of the University of Melbourne's School of Computing and Information Systems said AI voice cloning technology can imitate someone's voice with high accuracy, and the results are increasingly difficult to distinguish from a genuine recording.
With just a few seconds of voice recording from a social media video, scammers can create a convincing clone of a person's voice. They then use the clone in a call with a pre-recorded message, claiming to need money urgently and asking for transfers via gift cards, e-wallets or bank accounts.
Associate Professor Murray said that once users share voice recordings and videos of themselves online, they cannot prevent their voices from being copied.
In the age of social media, he said, it is unreasonable to ask people not to post their videos online, just as it is difficult to stop others from creating fake social media accounts impersonating you.
However, users can set their social media accounts to private so that only friends and family can see the content they post.
TH (according to VNA)