May: Security & Peace of Mind
AI is increasingly prevalent in our everyday lives, and with it we're seeing a noticeable rise in scams that leverage the technology. The latest scam circulating involves fraudsters using AI to clone the voices of family members or friends and deliver urgent-sounding messages. These fakes are especially convincing because AI needs only a few seconds of audio to reproduce someone's voice.
That means human judgment and caution remain our best defense. Watch this brief video to learn the three red flags that can help keep you and your loved ones safe from this threat.
References:
- McAfee, "The Artificial Imposter" report: This study found that scammers need only 3 seconds of audio (often pulled from social media) to create a voice clone that is an 85% match.
- Scientific Reports (March 2025): Research indicating that humans are only about 48% accurate at detecting a fake voice, essentially no better than a coin flip, which is why a "safety word" works better than simply listening closely.
- Federal Trade Commission (FTC): Launched the "Voice Cloning Challenge" to find technical solutions that protect consumers, and explicitly warns about "grandparent scams" in which AI mimics a relative's voice to request money.
- Federal Communications Commission (FCC): In early 2024, officially outlawed AI-generated voices in robocalls under the Telephone Consumer Protection Act (TCPA), a ruling that confirms AI voice cloning is a recognized tool for consumer deception.
