This post first appeared on the Retail Banker International website.
Jonathan Anastasia outlines why it is becoming so important for financial institutions to find new ways to protect consumers against increasingly sophisticated payments fraud
Trust is a vital part of how our societies function. It’s what enabled the first communities to form, as people understood they’d be better off working together than alone. Across the centuries people have continually innovated new ways to connect with one another, and to prove their identity and that their actions are genuine.
But for as long as there has been trust, there have been those looking to exploit it – and in our digitized world this can have far-reaching consequences if left unchecked. From romance fraud, where people are duped into sending money to someone posing as a romantic interest, to fictitious online deals, these impersonation scams have shaken the confidence of their victims.
Take the recent example of a US couple who thought they were helping their grandson after an accident, when in fact they were being scammed. Such scams have become commonplace: one in every five people surveyed in 2022 said they had fallen victim to payments fraud in the previous four years, and that figure may be rising.
The global fraud challenge – and the opportunities it presents
As recent cases show, many of these scams happen in plain sight. In our fast-paced world of real-time payments, there has been a particular rise in fraud where the consumer is coerced into authorising the transaction themselves.
This contemporary kind of scam, known as Authorised Push Payment (APP) fraud, is increasingly common: it affected 26.9% of all fraud victims in 2022, making it the number one global threat.
The scam is simple but pernicious: a fraudster, posing as an acquaintance, the victim's bank or another trusted institution, sends a message, often by text or email, asking them to transfer money to another account.
Scammers typically find and target their victims online, and until now these payments have been very difficult for banks to intercept, given the increasing speed of digital transactions. FIs need to implement new ways to protect their retail banking customers from this very modern fraud – and to retain their trust.
AI-Powered Anti-Fraud
At Mastercard, trust is at the heart of our business – we continuously assess the cyber landscape for new challenges and threats, innovating to create new opportunities to protect the connections that fuel our economy.
Using our latest AI technology and a unique view of payments across our network, we have created cutting-edge technology that allows banks to predict which payments are likely to be scams and stop them in real time, before funds leave a victim's account.
The AI looks across a range of factors, from a person's banking habits to payee history, to identify markers of fraud.
High street banks leading the way
As a result of this technology, we can now help banks predict and prevent payments to scams of all types in real time.
It is already in use by nine major banks in the UK, including TSB, Lloyds Bank, Halifax, Bank of Scotland, NatWest, and Monzo.
And in just four months TSB, an early adopter of the Consumer Fraud Risk tool, said it had dramatically increased its fraud detection. Indeed, it believes the scam payments prevented over the course of a year would equate to almost £100m ($125m) should all UK banks adopt the tool.
The AI-enabled future of anti-fraud
A proactive approach to tackling financial crime like this is non-negotiable if we are to diminish the impact of digital crime and enable banks to retain the trust of their customers – which is why we are now looking to expand this program to other parts of the world.
By remaining ahead of the game and leveraging the latest tools and insights we’re actively creating a future built on trust, in which all our customers and consumers feel safe and able to keep pace with the advances brought about by our digitized world.