Edited by: Gianna Merat
AI tools are getting better at voice cloning – follow these expert tips to avoid getting scammed, and pass them along to your loved ones
You answer a call from a family member who breathlessly explains that there has been a terrible car accident. You have to send money right now or he'll go to jail. You can hear the desperation in his voice as he begs for an immediate cash transfer. Even though it sounds exactly like him, and the call came from his number, you feel like something is off.
So you hang up and call him right back. When your family member picks up, he says there was no car accident and he has no idea what you're talking about.
Congratulations, you've just avoided an AI-powered scam call.
As generative AI tools become more powerful, it is getting easier and cheaper for fraudsters to create fake, yet convincing, copies of people's voices. These AI voice clones are trained on existing audio clips of human speech and can be adjusted to imitate almost anyone. The latest models can even speak several languages. OpenAI, the maker of ChatGPT, recently announced a new text-to-speech model that could further improve voice cloning and make it more widely accessible.
Naturally, criminals are using these AI cloning tools to trick victims into thinking they are speaking to a loved one on the phone, when they are really talking to a computer. Although the threat of AI-powered scams can be frightening, you can stay safe by keeping these expert tips in mind the next time you receive an unexpected, urgent call.
Remember: AI audio is hard to detect
It's not just OpenAI. Many technology startups are working on near-perfect replication of human speech, and recent progress has been rapid. Any security strategy that relies on your ability to detect an AI voice by ear over the phone is outdated and unreliable.
Hang up and call back
Security experts warn that it's very easy for scammers to make a call look like it's coming from a legitimate phone number.
“Oftentimes, fraudsters will spoof the number they're calling you from and make it look like they're calling you from a government agency or bank,” says Michael Jabbara, global head of fraud services at Visa. “You have to be prepared.”
Whether it's from your bank or a family member, whenever you receive a call asking for money or personal information, ask if you can call them back yourself.
You can also try sending them a message via a different, verified line of communication, such as video chat or email.
Create a secret safe word
A common security tip recommended by many experts is to create a safe word that only you and your loved ones know and that you can ask for over the phone.
“You can also set up a word or phrase with your loved ones that they can use to prove who they really are if they're under stress,” says Steve Grobman, chief technology officer at McAfee. While it is best to call back or check in through another means of communication, a safe word can be especially useful for young people or elderly relatives who may otherwise be difficult to reach.
Or just ask what they had for dinner
What if you haven't decided on a safe word and are trying to figure out if the distress call is real? Take a moment and ask a personal question.
“It could be as simple as asking a question that only a loved one knows the answer to,” Grobman says. “It could be: ‘Hey, I want to make sure it's really you. Can you remind me what we had for dinner last night?’” Make sure the question is specific enough that a scammer couldn't guess the answer correctly.
Understand that any voice can be imitated
Deepfake voice clones aren't limited to celebrities and politicians, as with the robocalls in New Hampshire that used AI tools to make it sound like Joe Biden was discouraging people from going to the polls.
“What people don't realize is that with just 5 to 10 seconds of your voice, from a TikTok you may have posted or a YouTube video from your professional life, that content can easily be used to clone you,” says Rahul Sood, chief product officer at Pindrop, a security company that identified the likely source of the AI-generated Biden audio.
With AI tools, even the outgoing voicemail greeting on your smartphone could be enough to clone your voice.
Do not give in to emotional appeals
Whether it's a run-of-the-mill phone scam or an AI-powered call, skilled scammers can gain your trust, create a sense of urgency, and find your weak points.
“Be wary of any interaction in which you feel a heightened sense of emotion, because the best scammers are not necessarily the most skilled technical hackers,” experts warn. “But they have a very good understanding of human behavior.” If you take some time to think the situation over instead of acting impulsively, that pause may be exactly what keeps you from being scammed.
This information comes from WIRED. If you've read this post, please share these tips with your loved ones, especially the elderly.