By Adedapo Adesanya
Experts and regulators have warned that Artificial Intelligence (AI) scams using voice cloning are the new frontier for fraudsters targeting consumers.
According to the Southern African Fraud Prevention Service (SAFPS), impersonation attacks increased by 264 per cent in the first five months of the year compared to 2021.
According to iiDENTIFii, a remote biometric digital authentication and automated onboarding technology platform, the format involves receiving a call, email or SMS from the authorities urgently requesting payment.
“The details of the request are clear and professional and include personal information unique to you, so there is no reason to doubt it. This scam is fairly common, and the majority of consumers are on the lookout for it,” it noted.
“Now imagine receiving a call from a loved one and hearing their unmistakable voice on the other end of the line saying that they need money or your account information right away. This may sound like a fraud lifted straight out of science fiction, but – with the exponential development of AI tools – it is a growing reality,” it added in a statement.
Mr Gur Geva, founder and CEO of iiDENTIFii, said the threat of voiceprint fraud has grown as the underlying technology has become cheaper and more accessible.
“The technology required to impersonate an individual has become cheaper, easier to use and more accessible. This means that it is simpler than ever before for a criminal to assume one aspect of a person’s identity.”
“Historically, voice has been seen as an intimate and infallible part of a person’s identity. For that reason, many businesses and financial institutions used it as a part of their identity verification toolbox,” he explained.
This has also raised concern among regulators in the United States; the Federal Trade Commission (FTC) last week issued an alert urging consumers to be vigilant about calls in which scammers sound exactly like their loved ones.
“All a criminal needs is a short audio clip of a family member’s voice – often scraped from social media – and a voice cloning program to stage an attack,” it warned.
iiDENTIFii warned that the potential of this technology is vast. Microsoft, for example, recently piloted an AI tool that, from a short sample of a person’s voice, can generate audio in a wide range of languages.
“While this has not been released for public use, it does illustrate how voice can be manipulated as a medium.”
Audio recognition technology has been an attractive security solution for financial services companies across the globe, with voice-based banking enabling customers to deliver account instructions via voice command.
Voice biometrics offers real-time authentication, removing the need for security questions or even PINs, with companies like Barclays and Visa adopting voice-based authentication platforms for e-commerce.
“As voice-cloning becomes a viable threat, financial institutions need to be aware of the possibility of widespread fraud in voice-based interfaces. For example, a scammer could clone a consumer’s voice and transact on their behalf,” Mr Geva warned.
The rise of voice-cloning, according to the expert, illustrates the importance of sophisticated and multi-layered biometric authentication processes.
“Our experience, research and global insight at iiDENTIFii have led us to create a remote biometric digital verification technology that can authenticate a person in under 30 seconds, but more importantly, it triangulates the person’s identity with their verified documentation and their liveness.
“While identity theft is growing in scale and sophistication, the tools we have at our disposal to prevent fraud are intelligent, scalable and up to the challenge,” he concluded.