You may have heard of scam calls that ask for your credit card number or other personal information. Like most Americans, you probably already know not to fall for those traps, but do you know how to avoid AI voice scams? What if the voice on the other end of the line sounded exactly like your family member or friend? Would you believe their request?
Unfortunately, AI voice scams are on the rise thanks to deepfake and AI voice cloning technology so realistic that even the savviest people fall victim. To keep that from happening to you, today we’re covering everything you need to know about AI voice scams.
An AI voice scam is a scheme in which a malicious actor uses generative AI tools to create AI-generated voices that sound remarkably human, then uses them to manipulate others into complying with demands: sending money, purchasing gift cards, divulging sensitive information, downloading malware, and more. If the scammer used voice cloning algorithms, these voices can sound identical to an authority figure or even one of your loved ones.
Since advancements in artificial intelligence tools have made it easier than ever to manipulate images, videos, and voices, more scammers and cybercriminals are using AI technology to carry out their scams. In fact, they’re creating convincing real-time replicas of people’s voices to impersonate loved ones, exploit emotions, and coerce victims into sending money or disclosing sensitive information, and they’re doing it at an alarming rate.
What makes this threat to cybersecurity and personal privacy even scarier is that, because the voices are so realistic, 70% of adults are not confident they could tell a voice clone of a loved one from the real thing, according to recent research by McAfee.
As a result, unsuspecting victims are more susceptible to getting scammed when the voice sounds like a loved one, acquaintance, or trusted authority figure like a government official.
According to the Federal Trade Commission, older Americans reported losing $1.6 billion to fraud in 2022, including $404 million in investment scams, $271 million in business impersonation scams, and $159 million in tech support scams. The FBI also found that $13 million was lost to grandparent and person-in-need scams between January 2020 and June 2021 alone. And those losses are believed to be drastically underreported, and only increasing as scammers put AI technology to criminal use.
To protect yourself and your loved ones from falling victim to these schemes, it’s crucial to understand how they operate.
Imagine this: You receive a call from a random number and answer. It sounds like your daughter, and she’s telling a story about how her phone broke, she’s stranded in Israel, and she needs money to get home. Would you believe it was her if it sounded like her voice? Unfortunately, scenarios like this are often AI voice scams.
Fraudsters use AI or deepfake technology, trained on audio harvested from social media posts and other sources, to generate convincing clips of individuals saying things they never actually said. They then use those cloned voices to run AI voice cloning scams.
For example, imagine hearing your daughter crying and saying she needs money immediately because she’s being held at gunpoint or for ransom, when it was never really her at all.
If those examples sound scary, let’s take a look at a few more common voice scams. After all, the more you know about these scams, the better prepared you will be if someone tries to scam you.
Family scams involve scammers impersonating a family member or loved one and claiming to be in distress. They often claim to have been involved in a car accident or to be facing a legal or health emergency. The imposter chooses these scenarios because they create a sense of urgency, pushing the victim to send money or hand over personal information quickly, without verifying the authenticity of the call.
Some common examples of this type of scam involve a scammer calling a grandparent and posing as a grandchild in distress or even claiming to be a family member who’s been kidnapped and in need of ransom money in exchange for their release.
In vishing (voice phishing) scams, scammers make unsolicited phone calls using AI voices to impersonate trusted individuals, such as tech support representatives, government officials, or financial advisors. They may claim there’s a problem with your account and request sensitive information, such as passwords or Social Security numbers, or demand payment to resolve the issue.
For example, the scammer may claim to be from Microsoft or Apple, tricking individuals into believing their account has been compromised and leading them to disclose login credentials or financial details. There’s also a common variant where fraudsters pretend to be calling from the IRS.
Another AI voice scam involves fake customer service lines. Scammers set up phony support numbers, chatbots, or voice assistants with AI-generated voices to trick victims into believing they are speaking with legitimate representatives of reputable companies. They then manipulate victims into providing account details or transferring funds under false pretenses.
While detecting a deepfake AI voice call can be challenging, there are several key signs that can help you determine whether you’re hearing from a fraudster or a friend: unnatural pauses or pacing, a flat or robotic tone, odd pronunciations, distorted or missing background noise, and high-pressure demands for money or information.
It’s important to note that some deepfake voice scams are so realistic that they lack these tell-tale red flags; just because audio sounds real doesn’t mean it is.
Knowledge is power, so the first step to protecting yourself against voice scams is to stay informed about the latest AI voice scam tactics; there are many more out there than the ones we’ve covered here. Here are a few other ways to protect yourself against voice scams:
When you learn about new AI voice scams, share the news with your family, especially older adults who may be more vulnerable to scams. This helps them understand AI voice cloning technology and what to watch out for. Encourage them to be cautious when receiving unexpected calls from family members requesting money or assistance, and advise them to verify the identity of the caller.
Always verify the identity of the caller, especially if they request sensitive information or financial transactions. Ask for their name, position, and contact details, then verify their legitimacy through official channels. If someone calls and claims to be a family member or friend from a number you don’t recognize, always hang up and call their number directly. Never give money until you can get in touch with them by other means. If you’re still skeptical, you can ask them personal questions only the real person would know or even ask to do a video call from their real phone number.
Treat unsolicited calls with caution, especially if they pressure you to act quickly or disclose personal information. Legitimate organizations typically do not request sensitive data or payments over the phone without prior arrangement. If you find yourself in a situation like this, hang up immediately and contact the company directly through official channels.
Utilize call-blocking cybersecurity features on your phone to filter out potential scam calls from unknown numbers. Many smartphones offer built-in features or downloadable apps that can help identify and block suspicious numbers.
Pay attention to caller ID information, especially for unknown numbers or calls from unfamiliar area codes. However, do not trust caller ID alone, as scammers can spoof phone numbers to appear legitimate. Additionally, avoid calling back numbers left in unsolicited voicemails. Instead, always look the number up to confirm it really belongs to the person or company the voicemail claims.
When seeking assistance from customer service representatives, always use official channels provided by reputable companies. Refrain from sharing sensitive information or making payments through unfamiliar platforms, and be vigilant for any signs of impersonation or fraudulent activity.
Establish a secret code word or phrase with your family that only you would know. If someone claiming to be a family member calls asking for financial assistance or in an emergency situation, ask for the code or phrase to verify their identity. This extra layer of authentication can help you detect fraudulent callers attempting to impersonate your loved ones using AI voice cloning technology.
Exercise caution when asked to make immediate payments or transfers, especially via unconventional methods like gift cards or wire transfers. Legitimate authorities or family members typically do not demand payment through such channels.
Be cautious about sharing personal information about yourself or your family, such as birth dates, addresses, or financial details, over the phone with unfamiliar callers. Remind family members to be mindful of the information they share, as scammers can use seemingly innocuous details to construct convincing AI voice scams or commit identity theft.
While AI voice technology can be used for scams like the ones we’ve mentioned, it can also be used for good. Many companies are actively researching and developing AI voice tools for positive purposes and promoting responsible use and transparency in their applications. For instance, ethical uses of AI voices include accessibility tools for people with vision or speech impairments, audiobook and video narration, language dubbing, and voice assistants.
If you’ve fallen victim to an AI voice scam or suspect fraudulent activity, it’s crucial to report it to the Federal Trade Commission (FTC), which specializes in consumer protection and combating deceptive practices.
To report an AI voice scam to the FTC, you can visit their website or call their toll-free hotline to file a complaint. Providing as much detail as possible about the scam, including the caller’s phone number, the nature of the scam, and any financial losses incurred, can help the FTC investigate the matter and take appropriate action against the perpetrators.
Since we understand that advancements in AI can lead to an increase in scams, PlayHT is dedicated to upholding ethical AI standards, placing an emphasis on responsible voice usage. We are committed to ensuring that our users engage with our AI and voice cloning technology in a manner that aligns with ethical principles. With PlayHT, users have the power to create captivating voice overs while adhering to ethical guidelines. Try PlayHT’s responsible voice AI today to level up your next project.
To prevent scammers from hacking into your social media, always use unique passwords and two-factor authentication methods to secure your accounts.
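If you’re wondering what that second factor actually adds, here is a minimal sketch of how time-based one-time passwords (TOTP) work, using Python’s pyotp library. The flow below is illustrative only; real services handle the secret exchange for you when you scan a 2FA QR code.

```python
# A minimal sketch of how time-based 2FA codes (TOTP, RFC 6238) work,
# using the pyotp library. Install with: pip install pyotp
import pyotp

# When you enable 2FA, the service generates a shared secret and your
# authenticator app stores a copy (usually scanned from a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Your app derives a short-lived code from the secret plus the current
# time, so a stolen password alone is not enough to log in.
code = totp.now()
print("Current code:", code)

# The server runs the same computation to check the code you enter.
print("Code accepted:", totp.verify(code))
```

Because the code changes every 30 seconds and never travels with your password, a scammer who tricks you into revealing the password still can’t get into the account.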
Voice cloning can sometimes be detected by analyzing speech patterns, inconsistencies in audio quality, and unnaturally uniform tone, pacing, or breathing, though the most reliable detectors are machine learning models trained on known synthetic audio.
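To make that concrete, here is a minimal, illustrative Python sketch of the kind of signal analysis involved, using the librosa audio library. The feature choices and the filename are assumptions for illustration only; production deepfake detectors feed features like these into trained classifiers rather than reading them off by hand.

```python
# Illustrative only: real deepfake detectors are trained models, not
# fixed heuristics. Requires: pip install librosa numpy
import librosa
import numpy as np

def clone_risk_signals(audio_path: str) -> dict:
    """Compute two simple signals that detection research often inspects:
    frame-to-frame timbre variation and spectral flatness. Synthetic
    speech can be unnaturally smooth on both."""
    y, sr = librosa.load(audio_path, sr=16000)

    # MFCCs summarize vocal timbre frame by frame; unusually low variance
    # across frames can indicate overly uniform, machine-generated speech.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    timbre_variance = float(np.mean(np.var(mfcc, axis=1)))

    # Spectral flatness runs from ~0 (tonal) to 1 (noise-like); generated
    # voices sometimes fall outside the range typical of live phone audio.
    flatness = float(np.mean(librosa.feature.spectral_flatness(y=y)))

    return {"timbre_variance": timbre_variance, "spectral_flatness": flatness}

# "call_recording.wav" is a placeholder filename for this example.
print(clone_risk_signals("call_recording.wav"))
```

In practice, you’d compare these numbers against features extracted from known real and synthetic recordings rather than judging a single call in isolation.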
A scammer can exploit your voice for fraudulent activities such as impersonating you to deceive others, manipulating recordings to fabricate false evidence, or creating convincing fake audio messages for extortion purposes.
To prevent falling victim to AI voice impersonation scams, safeguard personal information, verify the identity of callers or message senders through trusted channels, and stay informed about emerging AI technologies and their potential misuse.
Company Name | Wins (Total Votes) | Win Percentage
---|---|---
PlayHT | 539 (678) | 79.50% |
ElevenLabs | 98 (191) | 51.31% |
Speechgen | 24 (177) | 13.56% |
TTSMaker | 65 (174) | 37.36% |
Listnr AI | 58 (168) | 34.52% |
Uberduck | 86 (166) | 51.81% |
Resemble AI | 74 (156) | 47.44% |
Narakeet | 65 (155) | 41.94% |
Speechify | 68 (153) | 44.44% |
Typecast | 43 (140) | 30.71% |
NaturalReader | 15 (49) | 30.61% |
Murf AI | 10 (42) | 23.81% |
WellSaid Labs | 10 (40) | 25.00% |
Wavel AI | 9 (39) | 23.08% |