One of the seven A’s of SuperAging is Avoidance — working your way around negative forces. The biggest is ageism, and one of the big components of ageism is the prevalence of frauds and scams that target older people. According to the Federal Trade Commission, older Americans lost $1.6 billion to fraud in 2022 — and the actual number is probably higher because many are afraid to report being victims.
In our book, we list several ways to deal with this. But the truth is you have to be vigilant all the time, because the scam artists are constantly developing new tactics.
It figures that they’d start to use AI. And the way they’re using it is particularly dangerous.
As reported here, scammers are now using AI technology “to mimic people’s voices who then make calls to the victims, family members or loved ones, asking for money.” Victims have come forward, in Congressional testimony, explaining how they received calls “that sounded exactly like their loved one was in danger, was injured or was being held hostage.”
From the article:
“One older couple, featured in a video testimony in the hearing, received a call from who they thought was their daughter. She sounded distressed and asked for help.
“‘My daughter was, she was crying on the phone, profusely crying and saying, ‘mom, mom, mom,’ and of course my wife was saying, ‘LeAnn, LeAnn, what is the matter?’, and she repeated it again, ‘mom, mom, mom’ and it sounded exactly like her,’ Terry Holtzapple, one of the victims, said.
“Gary Schildhorn, a Philadelphia-based attorney and another targeted victim of an AI voice clone scam, also testified… He almost sent $9,000 to the scammer until he confirmed with his daughter-in-law it was an extortion attempt.
“The scammer, posing as an attorney, called Schildhorn requesting funds to bail his son out of jail for causing a car accident and failing a breathalyzer test.”
How do they clone the loved one’s voice? It’s not that difficult. A phone call to the loved one, with the caller posing as a salesman, a researcher, or using some other pretext, captures a few seconds of audio of their voice. AI can then clone that voice and turn it into an emergency call to you.
OK, what do you do about this?
One practical step that experts recommend is to create a family code word. Share it with children and grandchildren. If you receive what sounds like an emergency call — and particularly if it asks for some action on your part, like sending money — simply ask for the family code word. If there is the slightest hesitation, or no correct answer at all, you’ll know it’s a scammer. It’s also a good idea to hang up and call the loved one back directly, to check that everything is OK.
The important point is this: keep up to date on frauds and scams, and be aware that this ground is constantly shifting, with new techniques coming into play. The Federal Trade Commission has a good online resource here.