Starling warns of rise in voice cloning scams


Voice cloning scams – where fraudsters use AI technology to replicate the voice of a friend or family member – could be set to catch millions out, according to new research from Starling Bank.

Editorial

This content has been selected, created and edited by the Finextra editorial team based upon its relevance and interest to our community.

The study found that over a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year.

Starling says fraudsters can now use voice cloning technology to replicate a person’s voice from as little as three seconds of audio, which can easily be captured from a video someone has uploaded online or to social media.

Scam artists can then identify that person’s family members and use the cloned voice to stage a phone call, voice message or voicemail to them, asking for money that is needed urgently. In the survey, nearly 1 in 10 say they would send whatever was asked for in this situation, even if they thought the call seemed strange.

Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam.

To help combat the fraudsters, Starling Bank has launched the Safe Phrases campaign, in support of the government’s Stop! Think Fraud campaign, encouraging the public to agree a ‘Safe Phrase’ with their close friends and family that no one else knows, to allow them to verify that they are really speaking to them.

Lisa Grahame, chief information security officer at Starling Bank, comments: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters. Simply having a Safe Phrase in place with trusted friends and family – which you never share digitally – is a quick and easy way to ensure you can verify who is on the other end of the phone.”

To launch the campaign, Starling Bank has recruited leading actor James Nesbitt to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed.

Commenting on the campaign, Nesbitt says: “I think I have a pretty distinctive voice, and it’s core to my career. So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes to how advanced the technology has become, and how easy it is to be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary. I’ll definitely be setting up a Safe Phrase with my own family and friends.”

