AI can now replicate anyone's voice from a few seconds of audio. Here's what that means for you.
Beth Andress
Digital Self Defence & AI Governance Educator
"If you hear a familiar voice asking for money urgently, that's exactly when you should be most skeptical."
For most of human history, a person's voice was one of the most reliable ways to confirm their identity. You knew your mother's voice, your boss's voice, your bank manager's voice. That certainty is now gone. AI voice cloning technology can replicate a person's voice from as little as three seconds of audio — enough to be captured from a social media video, a voicemail, or a public recording. The resulting clone is convincing enough to fool family members, colleagues, and even trained professionals.
The technology has become widely accessible. Tools that once required significant technical expertise and expensive computing resources are now available as consumer applications, some of them free. This has enabled a new category of fraud that is growing rapidly in Canada: the grandparent scam, now enhanced with AI voice cloning. In the traditional version, a caller claims to be a grandchild in trouble — arrested, in a car accident, in hospital — and urgently needs money. With voice cloning, the call now sounds exactly like the grandchild. The emotional impact is immediate and overwhelming.
Business email compromise has evolved into business voice compromise. Fraudsters clone the voice of a CEO, CFO, or senior executive and call finance staff directly, requesting urgent wire transfers or changes to banking details. The voice is familiar, the request sounds legitimate, and the urgency prevents the employee from pausing to verify through other channels. Canadian businesses have lost millions of dollars to this type of fraud, and the losses are increasing as the technology improves.
The protective response to voice cloning is the same principle that applies to all social engineering: independent verification through a separate channel. If you receive a call from someone you know — a family member, a colleague, a financial institution — requesting urgent action involving money or sensitive information, end the call and contact that person directly using a number you already have on file. Do not call back on the number that called you, and do not treat a familiar voice as proof of identity.
Families and organizations can also establish verification protocols in advance. A family code word — a word or phrase that only family members know, agreed upon before any emergency arises — can be used to verify identity in a suspected cloning situation. Organizations can implement callback verification requirements for any financial transaction requested by phone or voice message, regardless of how familiar the voice sounds.
It's also worth auditing your public audio footprint. Voice cloning requires source material, and the most accessible source material is public social media. Videos on Instagram, TikTok, YouTube, Facebook, and LinkedIn all contain audio that can be used to train a voice clone. This doesn't mean you should delete all your social media, but it does mean being thoughtful about what you post publicly, particularly if you're in a position of financial authority or have family members who might be targeted.
The broader lesson is that the verification methods we've relied on for generations — recognizing a voice, seeing a familiar name, receiving a call from a known number — are no longer sufficient. Caller ID can be spoofed. Voices can be cloned. Email addresses can be impersonated. The only reliable verification is independent contact through a channel you control, using information you already have.
Next Step
Beth's AI fraud awareness sessions help organizations understand and respond to emerging AI-enabled threats.
Learn About AI Fraud Awareness Training