
AI-generated voice scams—also known as voice cloning or deepfake audio scams—are rising sharply, exploiting advances in artificial intelligence.
Using just a few seconds of audio, scammers replicate voices of family members, executives, or public figures with uncanny accuracy.
Scammers use cloned voices to trick victims into sending money or revealing sensitive information by creating urgency and emotional distress.
Common targets include older adults, who are often contacted by someone claiming to be a grandchild in trouble.
In one case, a Canadian couple lost $21,000 after receiving a call from a voice mimicking their son.
Scammers also impersonate company executives to defraud employees, and even public figures to spread misinformation, as seen in a 2023 robocall campaign mimicking U.S. President Joe Biden.
AI tools for voice cloning are increasingly accessible, with some services costing as little as $5.
This ease of access has contributed to billions of dollars in reported losses, much of it through imposter scams.
To protect yourself: screen calls from unknown numbers, limit publicly available recordings of your voice, establish a private family code word, and verify any urgent request through a separate channel, such as calling the person back on a known number.
Staying vigilant and informed is essential to avoiding deception in this rapidly evolving threat landscape.