“Just transfer the money by noon or we will lose the deal” came an instruction from your boss during a phone call. Who would stop and think “no, I won’t send the money, it might not be my boss; so we lose that big deal, so what, it’s only my job on the line”?
The above scenario, or at least something like it, played out recently when the CEO of a British firm transferred around £200,000 to a fraudster. The CEO had been fooled into thinking the person he was talking to on a phone call was his boss, the head of the company’s parent organisation. He was fooled with the help of a “deepfake”: technology that uses Artificial Intelligence (AI), in the form of deep learning, to create a fake voice or video.
You may think this won’t happen to you, but watch out, because deepfake phishing scams are coming to a phone or video near you…
What is a Deepfake?
Deepfakes did not start out as a malicious tool for phishing and CEO fraud. Instead, the AI-based technology began as more of a plaything, used to make even more ‘funtastic’ cat videos on the internet or, more often, to ‘put words into a politician’s or celebrity’s mouth’ in a YouTube video. One infamous deepfake was of Mark Zuckerberg, in a video posted to Instagram, where fake Mark said:
“Imagine this for a second: One man, with total control of billions of people’s stolen data, all their secrets, their lives, their futures…I owe it all to Spectre. Spectre showed me that whoever controls the data, controls the future.”
The video is very realistic and scarily ominous.
If you want to see just how real deepfakes can be, check out this sample from Nova PBS.
Deepfakes Go Phishing
As expected, cybercriminals have already started to take advantage of the possibilities that deepfakes offer, our introduction story being a prime example. Scams like ‘whaling’ and phishing use social engineering tricks that rely on natural human behaviours such as trust, fear of missing out (FOMO), and worry. Basically, anything that can initiate a knee-jerk reaction as instinct kicks in.
Deepfakes take phishing up a notch by exploiting the trust we feel when we recognise voices or faces. If you hear a voice you know or see a face you recognise, you are more likely to do as the person asks: give them your data, hand over credit card details, or move money to an account.
This was the case in the “Grandparent Scam”. In this scam, grandparents were called by fraudsters who pretended to be their grandchild calling from abroad needing Grandma’s help. You may think you’d never fall for that sort of scam, but imposter scams are very lucrative. In the U.S., imposter scams accounted for 13 percent of 3 million criminal complaints last year.
Imagine, then, if a cybercriminal added a deepfake dimension to imposter scams like the Grandparent Scam. The fraudsters carefully choose their target, then download videos or audio files of the target’s child or grandchild from social media. They then use deepfake technology to manipulate the video or voice to say something like:
“Grandma, I really need help, I’m about to lose my flat if I don’t pay the 3-months’ rent I owe, I’ve got until noon to pay up or I’m out – please transfer the money now! I Love you!”
You can see that the same type of scam the CEO in the introduction fell for could just as easily be modified for a more general audience.
Deeply Easy Scams
Although not a deepfake, this example shows what can be done with some effort. A group of cybercriminals used a silicone mask to impersonate a French minister. The group also mocked up the minister’s office in their flat to make the backdrop more convincing before using Skype to call their targets. This scam tricked a number of business people out of 80 million euros. Not bad for a bit of effort.
Deepfake technology makes the above scenario easier to carry out.
You can imagine what type of malicious acts could be carried out more effectively and efficiently using deepfake technology. Fake news is an obvious one, but the sextortion scam is a great example of where deepfakes could take these threats to new levels of success.
The current sextortion scam relies on shame and fear of the unknown: it scares people into thinking they have been caught on video in a compromising position. If the scammer could take your photo from a social media site and use deepfake AI technology to make their claims seem real, the sextortion scam could take on a new level of threat. Even if you know the video isn’t real, you might still pay the ransom to prevent it from hitting your mother’s email account.
A Deep Dark Future of Phishing and Deepfake Scams
Deepfakes are now crossing the chasm of expertise into affordable options that anyone can use. A report by Data & Society identifies the growing use of “cheap fakes”. The report argues that technology that was once high-cost and usable only by experts is now available to anyone, with the resulting fakes published to social media platforms. These cheap fakes are not as slick as their expert counterparts, but they may well be ‘good enough’ to scam the general public.
This means the cyber-war has just made a quantum leap. In other words, just as malware has become ubiquitous thanks to rentable options on the darknet, so too are deepfake kits likely to become part of “Phishing-as-a-Service” offerings. This will make deepfake-based attacks more common.