The emergence of advanced AI reflects rapid technological progress and offers a wide array of benefits. These advantages, however, come with risks: Starling Bank, a digital bank in the UK, has highlighted a recent rise in scams that use artificial intelligence.
How AI voice cloning technology works
Voice cloning technology allows AI to mimic a person’s voice convincingly from just a brief audio snippet. According to Starling Bank, scammers can replicate a voice from as little as three seconds of audio.
With a cloned voice, criminals can impersonate the victim and ask that person’s friends or family for money. A Starling Bank study found that over 25% of respondents had been targeted by an AI voice cloning scam in the past year, while 46% were unaware such scams exist. Alarmingly, 8% admitted they would send money to a caller they believed to be a friend or relative, even if the call seemed suspicious. The risk of sharing voice recordings online is widely underestimated, especially on social media platforms where personal details are posted freely.
How to defend against AI voice cloning scams
Starling Bank recommends agreeing on a unique, easy-to-remember “safe phrase” with family and friends as a protective measure against voice cloning. The phrase should differ from common passwords and should never be shared via text message, where scammers could intercept it. Preventing the misuse of AI tools is also crucial on the developer side, as seen in OpenAI’s decision not to release its Voice Engine broadly, citing the risk of scams.