That emergency phone call from a loved one could actually be scammers using AI — how to stay safe
AI tools like ChatGPT, Microsoft Copilot and Google Bard can be quite useful, but there's also a darker side to this emerging technology that you now have to worry about.
While both businesses and ordinary people have started using AI to make things easier, so too have scammers. Besides writing more convincing phishing emails, scammers are now using AI to create deepfakes of loved ones' voices for use in their attacks.
In addition to regular scam calls, you now need to be on the lookout for calls that appear to come from a friend or family member. Though questioning the legitimacy of a phone call in an emergency might seem rude, doing so could prevent you from falling victim to a new round of “I’ve been in an accident” scams that are becoming increasingly popular due to advancements in AI.
Better AI makes for more convincing fakes
In a new blog post, Malwarebytes digs deeper into a San Francisco Chronicle report on a family that was almost taken to the cleaners by one of these AI-powered scams.
The family got a phone call that appeared to come from their son, in which he said he had been in a car accident and hurt a pregnant woman. As Malwarebytes points out, these types of scams are becoming more common, and instead of a car accident, the familiar voice on the other end of the phone might claim to have unexpectedly ended up in the hospital or suffered some other kind of tragedy.
Just like with online scams, this strategy is used to create a sense of urgency so that potential victims act quickly, before they have a chance to think too much about what is actually happening. Years ago, these types of scams were easy to spot, but that's no longer the case as AI has quickly become much more advanced.
Now, it’s quite easy for scammers to take a clip from a video on social media and convincingly fake the voice of a loved one. To make matters worse, the AI tools used to fake voices “are available either in the public domain for free, or at a very low cost” according to FBI special agent Robert Tripp who spoke with the San Francisco Chronicle.
After this initial call about the accident, there was a second call in which someone posing as a legal representative for the son in question asked for bail money. Fortunately, though, the family got suspicious when this so-called lawyer told them he’d send a courier to pick up the bail money.
How to stay safe from phone scams
AI-powered scam calls aren't going anywhere anytime soon. As such, you need to learn how to recognize them, and also make it harder for scammers to impersonate you so that your own friends and family don't fall for these scams.
For starters, you shouldn’t answer telephone calls from numbers you don’t recognize or calls from private numbers. If you’re expecting an important call and do answer a call from such a number, take what the person on the other end says with a grain of salt. Likewise, you should never provide personal or financial information to anyone you don’t know over the phone.
If you do end up getting one of these calls, it's a good idea to reach out to the family member or loved one in question to verify that they really are in an emergency. If you can't reach them directly, try calling someone else who might know where they are.
To prevent others from falling victim to these types of scams, you’re also going to want to notify the police immediately. For those in the U.S., you can reach out to the FBI directly to report suspicious activities at the bureau’s Internet Crime Complaint Center (IC3). Here, you can file a complaint and look at the FBI’s FAQs for more information on all of the different types of scams that are currently out there.
Like other tools, AI can be used for both good and bad. Hopefully, though, businesses and governments will figure out a way to use their own AI tools to take on the cybercriminals behind these scams.