Millions of people are at risk of falling prey to AI-driven voice-cloning scams, warns Starling Bank. Fraudsters can now replicate a person’s voice from just a few seconds of audio found online and use the clone to deceive family and friends into sending money. As AI technology advances, the threat of synthetic voice fraud is growing, with hundreds of people already affected in the past year.

Key Points:

  • AI Voice-Cloning Threat: Scammers can mimic someone’s voice from as little as three seconds of audio and use it to stage fake calls that trick loved ones into sending money.
  • Survey Findings: Over 25% of surveyed adults have been targeted by these scams in the past year, yet 46% were unaware of the threat.
  • Safe Phrases for Protection: Starling Bank advises people to agree on a “safe phrase” with loved ones to verify identities over the phone and avoid falling victim to AI scams (see the sketch after this list for the underlying idea).
  • Growing Concern: As AI voice replication tools become more sophisticated, concerns are mounting about their potential misuse in fraud and misinformation.
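The safe-phrase advice is essentially a shared-secret challenge carried out between humans. Purely as an illustration, the hypothetical Python sketch below shows the same idea in code: a previously agreed phrase, tolerant normalization of what the caller actually says, and a comparison. Every name here (normalize, verify_safe_phrase, the example phrase) is invented for the sketch; Starling Bank’s guidance concerns spoken phrases between family members, not software.

```python
import hmac
import unicodedata

def normalize(phrase: str) -> str:
    """Lowercase, Unicode-normalize, and collapse whitespace so small
    variations in how the phrase is said or transcribed still match."""
    words = unicodedata.normalize("NFKC", phrase).lower().split()
    return " ".join(words)

def verify_safe_phrase(spoken: str, agreed: str) -> bool:
    """Check the phrase a caller gives against the agreed one.
    hmac.compare_digest does a constant-time comparison; overkill for a
    family check, but a sound habit whenever comparing secrets."""
    return hmac.compare_digest(
        normalize(spoken).encode(), normalize(agreed).encode()
    )

if __name__ == "__main__":
    AGREED = "purple heron at midnight"  # hypothetical agreed phrase
    print(verify_safe_phrase("Purple  heron at midnight", AGREED))  # True
    print(verify_safe_phrase("please send the money now", AGREED))  # False
```

The design point carries over directly to the human version: the phrase only works if it was agreed in advance through a trusted channel and is never volunteered to an unverified caller.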

With AI voice-cloning scams on the rise, it’s more important than ever to stay vigilant and take protective measures like using safe phrases with loved ones. As AI continues to evolve, so too do the risks associated with it.