
AI-Enabled Voice Cloning Fuels Grandparent Scams
Criminals are increasingly using artificial intelligence (AI) to mimic the voices of people their victims know and trick them into handing over large sums of money. This technology, known as voice cloning, allows scammers to impersonate loved ones, such as grandchildren, and manufacture a sense of urgency to extract money from unsuspecting victims.
Older adults are particularly vulnerable to these scams: they may be less familiar with AI technology and more inclined to trust a voice that sounds like a family member. Scammers often target seniors with urgent pleas for help, claiming to be in trouble and to need cash immediately.
The FBI reports that senior citizens lost approximately $3.4 billion to financial scams in 2023. The agency warns that AI has made these schemes even more believable, since a cloned voice can smooth over the verbal slips and inconsistencies that might otherwise raise red flags.
Protecting Yourself from Grandparent Scams
Fortunately, there are steps you can take to protect yourself and your loved ones from falling victim to these scams. Cybersecurity experts and law enforcement officials recommend creating a family "safe word" and establishing a protocol for verifying a family member's identity.
Choose a unique and unpredictable safe word. Avoid easily guessed information such as street names, alma maters, or other details readily available online.
Ask any caller claiming to be a family member for the safe word before acting on an urgent request; this provides an extra layer of security.
Never give out financial information or make payments without verifying the caller's identity.
Make sure everyone in the family understands the importance of never volunteering the safe word to a caller.
By following these simple steps, you can significantly reduce your risk of falling victim to grandparent scams and protect your hard-earned money. If you are ever unsure about a caller's identity, err on the side of caution: hang up the phone, then contact your loved one directly at a number you know to verify their situation.