This past February, Paul Vallas, a Chicago mayoral candidate, learned of a recording circulating on Twitter. In it, he contradicted his previous political stances and insulted key supporters. It was dynamite. It was also AI-generated. Vallas never said those things. It was one of a growing number of voice deepfakes.
No big deal, you say? Voice deepfakes may target politicians and celebrities but not John Q. Public.
Think again. You are already in the crosshairs.
AI voice cloning is almost indistinguishable from human speech. It is also readily available online. A quick Google search turns up several apps that let users create AI voices. All that's needed is a small sample (a few seconds) of someone's real voice. Once the sample has been processed, a voice deepfake can make calls or leave voicemails that are virtually indistinguishable from the real thing.
Voice deepfakes have become the new standard for the classic “your grandson is in jail” phone scams. Don’t think you’ll fall for that old gag? OK, but what if you received a brief voicemail from your son or daughter in distress? One that sounds like the real thing? Many have been fooled.
Still unconcerned? Then consider the New York Times article entitled “Voice Deepfakes Are Coming for Your Bank Balances.” Many financial institutions use voice authentication systems to verify customers’ identities. Scammers know this; they have begun using voice deepfakes to impersonate customers. While the institutions involved are understandably reluctant to discuss specifics, the scammers have had some success, and the threat is growing.
So, what should you do? First, be aware of the potential problem. Ignore calls and messages from unfamiliar phone numbers. Ask your bank or other financial institution to use a different identity verification method, such as two-step verification. No more voice authentication, please. Finally, be mindful of what you post online. Reducing the amount of your recorded voice available online might not be a bad idea.
It’s a new, AI-influenced world. Voice deepfakes are just one AI development. Stay informed. And remember: Hearing is no longer believing.
Want to learn more? Contact me here.
Peter has spent the past twenty-plus years as an acting/consulting CFO for a number of small businesses in a wide range of industries. Before that, Peter was a serial entrepreneur, managing various start-up and turnaround projects. He is a co-founder of Keurig.