Fraudsters are using AI technology to replicate the voices of unsuspecting victims’ friends or family members and using the clones to extort money.

More than a quarter of people have already been targeted by these AI voice cloning scams at least once in the past year.

And now millions more could be at risk of getting caught out, data shows.

Fraudsters are using AI voice cloning techniques to trick their victims into sending money

It comes as the Payment Systems Regulator confirmed that it would cut the proposed fraud reimbursement limit from £415,000 to £85,000, with the rules coming into force on 7 October.

This is Money asked banks how they are combatting the rising tide of AI voice cloning scams and what they are doing to protect customers.

Why are AI voice cloning scams a concern?

An AI voice cloning scam is a sophisticated type of scam where fraudsters use voice cloning technology to replicate a person’s voice from a short clip of audio.

Fraudsters can cheaply and easily create an audio deepfake online in just a few minutes.

The audio clips used in AI voice cloning scams can easily be captured from a video someone has uploaded online or to social media or even from a voicemail message.

Scammers are said to only need three seconds of audio to clone your voice.

They can identify their victim’s family members and use the cloned voice to stage a phone call, voice message or voicemail to them, asking for money that is needed urgently, for example due to being in an accident or to pay rent.

What are banks doing to protect customers?

Rob Woods, fraud and identity specialist at LexisNexis Risk Solutions, said: ‘AI-driven deepfake scams, such as voice cloning, are an increasing concern for UK banks, since they’re an effective way to convince victims that they need to urgently make a money transfer to a friend or relative in need.’

The problem with these scams is that, by getting the victim to authorise the payment themselves, fraudsters effectively bypass the strong authentication steps banks put in place to stop criminals stealing customers’ money.

‘The challenge for banks is in understanding how to spot fraudulent transfer requests, compared to those that are genuine,’ Woods continues.

‘There are a number of risk signals available that can help, such as AI-powered behavioural biometrics that analyses how a phone is being used, and live call detection – banks use these types of signals to build risk models that help detect when fraud could be underway.’
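In broad terms, signals like these can be combined into a single risk score for each payment. The sketch below is purely illustrative: the signal names, weights and threshold are made up for the example and do not represent any bank’s or LexisNexis’s actual model.

```python
# Purely illustrative sketch of combining risk signals into a single score.
# Signal names, weights and the threshold are hypothetical, not any real bank's model.
from dataclasses import dataclass

@dataclass
class PaymentSignals:
    behaviour_anomaly: float   # 0-1 score from behavioural biometrics (unusual typing/swiping)
    live_call_detected: bool   # customer appears to be on a phone call while paying
    new_payee: bool            # payee has never been paid before
    amount_vs_typical: float   # payment amount divided by the customer's usual payment size

def risk_score(s: PaymentSignals) -> float:
    """Weighted combination of risk signals; higher means riskier."""
    score = 0.4 * s.behaviour_anomaly
    score += 0.3 if s.live_call_detected else 0.0
    score += 0.2 if s.new_payee else 0.0
    score += min(0.1 * s.amount_vs_typical, 0.3)
    return score

signals = PaymentSignals(behaviour_anomaly=0.7, live_call_detected=True,
                         new_payee=True, amount_vs_typical=4.0)
if risk_score(signals) > 0.6:   # hypothetical threshold
    print("Hold the payment and show the customer an extra scam warning")
```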

While criminals are finding more and more ways to use AI to scam people, banks are also using AI to fight fraud and have been doing so for the last 20 years.

Woods said: ‘Three major banks that introduced AI models to tackle scams saw an average uplift of 260 per cent in fraud detected.’

LexisNexis Risk Solutions was not able to reveal which three banks have introduced AI models to tackle scams.

Santander

Santander uses machine learning models, powered by a company called Lynx Tech, to fight fraud in cards and payments.

Lynx Tech’s platform uses AI to learn customers’ transactional behaviour and detect fraud. 

It says that its system sifts through 66billion transactions and protects 300million customers from fraud every year.

A Santander spokesman said: ‘We have been using AI for payment detection, behavioural detection, and a variety of other use cases for a number of years.

‘Reported AI scams are quite hard to identify, as a lot of the time customers don’t know that AI has been involved in the scam.

‘We have a range of comprehensive checks and balances in place to detect and prevent AI voice cloning being used.’

Santander research found that over half of Britons have either not heard of the term deepfake or misunderstand what it means.

Only 17 per cent of people are confident they could easily identify a deepfake video.

Nationwide

One of the ways Nationwide Building Society protects its customers from AI voice cloning scams is by not enabling payments over telephone banking. If a customer wants to make a payment to a friend or family member, they can do so in a branch instead.

Nationwide also uses AI to analyse transaction data. 

A Nationwide spokesman said: ‘Nationwide doesn’t allow payments to be made via telephone banking, but we still monitor for suspicious voice activity to keep our customers safe.

‘AI and advanced analytics form an important part of our multi-layered fraud prevention framework. 

‘We are concerned about the growth in these scams and work hard to ensure our customers remain protected. This includes the use of advanced analytics and voice-specific controls. It is important that consumers are vigilant to attacks that can impact them directly, such as callers purporting to be family or friends.’

Barclays

Barclays invests in multi-layered security systems that help protect customers. It says these typically prevent several thousand attempted fraudulent transactions every day.

This includes a sophisticated transaction profiling system that is unique to every customer.

A Barclays spokesman said: ‘For each of the 50million plus payments our UK customers make every month, our fraud detection systems and machine learning models determine in less than a second if it is likely to be a fraudster rather than the customer, or if our customer appears at risk of being scammed.

‘If the transaction seems risky, the customer is presented with additional checks prior to the payment being released.’

‘Alongside our technical prevention, we work tirelessly to help arm the public with information and tools to spot and stop fraud and scams, including warning messages throughout the payment journey, scams education via in-app notifications, social media channels, press and a dedicated website.’
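In outline, a per-customer transaction profile can be as simple as tracking what a ‘normal’ payment looks like for that customer and flagging anything that deviates before the money leaves the account. The sketch below is a heavily simplified illustration under that assumption, with made-up thresholds; it is not Barclays’ actual system.

```python
# Illustrative sketch of per-customer transaction profiling, assuming a simple
# running average of payment amounts. Real bank models are far more sophisticated.
from collections import defaultdict

class CustomerProfile:
    def __init__(self):
        self.count = 0
        self.mean_amount = 0.0

    def update(self, amount: float):
        # Incrementally update the running mean of this customer's payment sizes.
        self.count += 1
        self.mean_amount += (amount - self.mean_amount) / self.count

profiles = defaultdict(CustomerProfile)

def needs_extra_checks(customer_id: str, amount: float, new_payee: bool) -> bool:
    """Flag a payment for additional checks if it looks unusual for this customer."""
    p = profiles[customer_id]
    unusual_amount = p.count >= 5 and amount > 3 * p.mean_amount   # hypothetical rule
    p.update(amount)
    return unusual_amount or (new_payee and amount > 500)          # hypothetical rule

print(needs_extra_checks("cust-123", amount=1200.0, new_payee=True))  # True: large payment to a new payee
```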

Starling Bank

Starling launched its Safe Phrases campaign, in support of the government’s anti-fraud campaign, to raise awareness of AI voice cloning scams among customers in response to their rise.

It is encouraging people to agree a safe phrase or password with close friends and family that no one else knows, so that when someone calls asking for money to be sent, they can verify the caller really is who they claim to be.

Then, if anyone is contacted by someone purporting to be a friend or family member who does not know the phrase, they are immediately alerted to the fact that it is likely a scam.


Monzo

Monzo introduced new fraud protections for customers earlier this year.

One of these was a trusted contacts feature, where customers can choose a friend or family member to double-check any bank transfers and savings withdrawals over their set daily allowance.

It involves customers consenting to selected friends and family seeing some details about transactions they are making. Monzo will then ask the trusted contact to confirm it really is the customer making the payment and to check that it looks safe.

The idea behind it is that, as someone who knows the customer, a friend or family member will be able to flag if anything looks suspicious, for example if they know the customer is not planning any large purchases.
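Based only on that description, the flow could look something like the sketch below, where payments over the customer’s chosen daily allowance wait for a trusted contact’s review. The names, limit and logic are hypothetical; this is not Monzo’s actual implementation or API.

```python
# Hypothetical sketch of a 'trusted contact' approval flow, based only on the
# description above; it is not Monzo's implementation.
DAILY_ALLOWANCE = 500.0   # customer-chosen limit, in pounds (hypothetical)

pending_approvals = {}    # payment_id -> payment details awaiting a contact's review

def request_transfer(payment_id: str, amount: float, payee: str, spent_today: float) -> str:
    if spent_today + amount <= DAILY_ALLOWANCE:
        return "sent"
    # Over the allowance: share limited details with the trusted contact and wait.
    pending_approvals[payment_id] = {"amount": amount, "payee": payee}
    return "awaiting trusted contact"

def trusted_contact_review(payment_id: str, looks_safe: bool) -> str:
    details = pending_approvals.pop(payment_id)
    if looks_safe:
        return "sent"
    return f"blocked: contact flagged £{details['amount']:.2f} to {details['payee']}"

print(request_transfer("p1", amount=900.0, payee="Landlord", spent_today=0.0))  # awaiting trusted contact
print(trusted_contact_review("p1", looks_safe=False))                           # blocked
```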

NatWest

A spokesman from NatWest said: ‘AI voice cloning scams are a threat that we do recognise and monitor, drawing in internal and external technical experts to ensure we have robust authentication and detection capabilities to prevent this type of abuse, and that testing of this protection exists.’

‘We recognise the opportunity that fraudsters have, and the fast-paced evolution of technologies which are making more accurate clones of individuals’ voices. These synthetic voices can be used either to manipulate a customer via social engineering, or to impersonate our customers to the bank in an effort to gain access to banking services or customer funds.

‘Improved privacy controls on social media are important but also people can help protect themselves by considering what information they are publicly sharing on social media.’

Woods said: ‘Banks and other financial services are now under even more pressure to stop APP [authorised push payment] fraud as a result of the PSR’s new scam reimbursement rules.’

We also approached HSBC and Lloyds, but both declined to comment.



