Starling Bank warns of voice cloning scams


In a rapidly evolving digital landscape, a new generation of scams is emerging. These scams use high-tech voice cloning to deceive victims, often replicating the voices of friends or family members to ask for money, which makes them highly convincing.

New data reveals that 28% of UK adults, more than a quarter of the population, have been targeted by a voice cloning scam in the past year.

Concerningly, almost half of the population is unaware that such scams are even possible, which makes them more likely to fall victim. These findings come from a Starling Bank survey of over 3,000 participants, which highlights the alarming rise of voice cloning scams.

Using as little as three seconds of audio, scammers equipped with sophisticated AI technology can mimic a person’s voice. Given the abundance of voice recordings on social media, it’s easier than ever for fraudsters to gather the audio samples they need.

They then use these samples to make phone calls or send voice messages to the victim’s family members, urgently requesting money. Nearly 1 in 10 respondents admitted they would send money even if the call sounded suspicious, potentially putting millions at risk.

Despite this, only 30% of people said they would feel confident recognizing a voice cloning scam.

Voice cloning scams on the rise

Starling Bank has urged the public to adopt extra security measures, such as agreeing a code word or phrase known only to close friends and family that can be used to verify the authenticity of emergency calls.

The initiative is part of the government’s ‘Stop! Think Fraud’ campaign, which aims to increase public awareness and reduce fraud incidents. The Starling research notes that financial fraud offences in England and Wales surged by 46% last year.

The average UK adult has been targeted by a fraud scam five times in the past 12 months. Lisa Grahame, the bank’s chief information security officer, emphasized the importance of being aware of online vulnerabilities.

“People regularly post content online which includes recordings of their voice, without imagining it’s making them more vulnerable to fraudsters,” she stated. To illustrate how easily such scams can be carried out, actor James Nesbitt, known for his distinctive voice, took part in the campaign by having his voice cloned.

He found the experience eye-opening. “Hearing my own voice cloned so accurately was shocking. It highlighted how advanced AI technology has become and how easily it can be misused for criminal activities. I’ll definitely be setting up a safe phrase with my own family and friends,” he said.

The public is encouraged to stay vigilant and to protect themselves and their loved ones by adopting simple but effective measures against these sophisticated scams.

