By Robert L. Cain, Copyright 2023, Cain Publications, Inc.
Never was there a new technology that scammers didn’t figure out a way to take advantage of to rip off the unwary. Imagine their joy today. Scammers hit the goldmine with Artificial Intelligence. They have come up with ways to scam not possible before, ways they couldn’t have dreamt about five or so years ago. By the time we’ve figured out how to stop them, millions of people and businesses will have been scammed out of billions of dollars.
We’ll look at three different Artificial Intelligence “techniques” scammers use to steal money from the unwary: voice cloning, prompt injection attacks, and phishing.
Numerous sites on the internet provide apps for cloning anyone’s voice. They market it as having fun with your own voice. For example, deepswap.ai suggests “generating faceswap videos, photos, and GIFs. Over 150 million users make funny face swapping here, including movie role refacing, gender swaps, face memes, etc. Spoof your friends now!” Yes, “spoof your friends.” How about spoofing people into sending you money such as with the notorious “grandparent scam.”
Before voice cloning, they had to just hope a grandparent didn’t know for certain the sound of the voice of his or her grandchild. The scammers would call the grandparent pretending to be a grandchild, usually a grandson, saying he’d been arrested, had been in an automobile accident, or some such and needed bail money right away or he’d be put in jail for who knows how long. They instruct the grandparent to either go to Walmart and buy sufficient gift cards, wire money, transfer it using Zelle or Venmo, or go to the bank and withdraw money and somebody would come by to pick it up. That last one is the most perilous for scammers, of course, because a suspicious grandparent who spotted the scam could have police waiting for the pickup. With voice cloning, one possible snag had been removed. Scammers can download as little as 10 seconds of someone’s voice from a social media site such as Facebook or Tik Tok, and clone their own cloned voice into the grandson’s. For the unwary, terrified grandparent, it works.
The cost of using voice cloning? Deepswap.ai “premium” service is $9.99 a month or $49.99 for the first year, going up to $99.99 after that. Other sites are just as inexpensive. Cogni.ai advertises a free trial with up to 30 minutes of synthesis time and $20 for four hours. Others are equally reasonable considering the thousands of dollars scammers will extort from the less-than-suspicious.
Voice cloning works for more than just the grandparent scam. Politics. Considering how much politicians talk, it’s simple to download that voice and have it say anything you want it to. Especially right before an election, imagine the words they could put in a politician’s mouth too late to have the ad or news release cancelled, but not too late for people to believe what they heard the “politician” say.
Businesses aren’t immune, either. Scammers can clone the voice of the owner, CEO, Executive Vice President or some other high-ranking person whom an employee wouldn’t dare disobey or question, and get passwords or other means of accessing company servers. The result could be a ransomware attack or access to company bank accounts.
These pose equal danger to a business or individual. The example consumeraffairs.com used “You feed an AI chatbot, like the Bing assistant, a prompt, like ‘Hi, can you find me a cheap flight to Madrid in May?’ And Bing finds you a cheap flight.
Enter the hacker looking to slither his way into your convo and pocketbook. He’s already injected a prompt of his own into a website that you happen to have open in a separate tab. No one knows the prompt is there — not you, not the website owner. That rogue code jumps into the Bing chat box like a flea and hijacks your conversation.” Consumer affairs reports that programmers at GitHub engineered a virus to impersonate a Microsoft employee dealing in cheap laptops
Then come the phishing attacks. With Artificial Intelligence software, a scammee doesn’t even have click on a message. As soon as someone visits an infected website, that person’s computer or phone is infected by code that had been injected by a scammer giving him or her access to all the information the unsuspecting person’s computer, bank records, Social Security number, and with the ability to bleed someone’s bank accounts dry. How do they inject that code? Artificial intelligence alters the source code of the website just like in an injection attack. And no one could ever see the code if they looked because the scammer writes the code in white type on a white background. It only takes a few line so even the most wary programmer would think it was just a space in the code, a line break maybe.
Phishing attacks have risen exponentially in the past year or so, showing a 47.2 percent increase since 2022. For some reason, education has been targeted most with an increase of 576 percent since 2022.
How do we protect ourselves? First, trust nothing you see or hear until you verify its accuracy. Second, never, ever send cryptocurrency, buy gift cards, or use some other untraceable method of payment. Third, be wary of voice quality with no background noise. After all, the scammer is alone calling from an empty room, not police station. Fourth, listen for inconsistencies in the information provided, and ask questions. Several places also suggest using “safe” words that only family members would know. Scammers won’t know them.
Never, ever click on unknown software ads. Go directly to the website of the company by using your search engine to find it. Most of all, be skeptical of anything you find online. For any suspicious call, end it and contact your friend or family member directly or call someone who can confirm any situation. And social media, don’t give away personal information that a scammer can use to dig down and find out more. Scammers depend on believability to fool people, so they’ll use personal information they find about your family, friends, neighborhood, or anything else to lend to their believability.
With the increasingly sophisticated Artificial Intelligent software, scammers are digging in their gold mine and coming up with huge gold nuggets they can use to steal from the unwary or unsophisticated. We need to be more skeptical than ever.