Email security firm GreatHorn discovered a new phishing attack this week that raises the bar for criminal ingenuity, not to mention outright cheek. Hackers designed a fake email that looks to be from another well-known email security brand, right down to its technical ‘signature’.
Targeting Office 365 accounts in an attempt to steal credentials, the attackers spoofed the email security company’s legitimate ‘Return-Path’ and ‘Received’ headers, crafting them to make it look as though the emails had passed through the company’s own email servers.
This was done so convincingly that it reached nearly a quarter of GreatHorn customers across numerous sectors and company sizes, and was seen in both Office 365 and G Suite environments. The emails got past standard Office 365 email protections in many instances. Microsoft arguably compounded the deceit by automatically appending legitimate received header details to the spoofed headers, making the phishing emails appear that much more legitimate.
The email body was designed to look like a Microsoft alert about Quarantine, SharePoint, and OneDrive. The sending address was a direct spoof of the recipient’s: if the intended victim’s email address was firstname.lastname@example.org, the sender’s address would also be firstname.lastname@example.org.
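One simple defensive check this pattern suggests is flagging messages whose sender address exactly matches the recipient’s own address. The sketch below, using Python’s standard email library, is illustrative only (the function name and sample headers are assumptions, not anything from the attack itself):

```python
import email
from email.utils import parseaddr

def flags_self_spoof(raw_message: str) -> bool:
    """Return True when the From address matches the To address,
    the self-spoof pattern described above."""
    msg = email.message_from_string(raw_message)
    _, from_addr = parseaddr(msg.get("From", ""))
    _, to_addr = parseaddr(msg.get("To", ""))
    return bool(from_addr) and from_addr.lower() == to_addr.lower()

raw = (
    "From: firstname.lastname@example.org\n"
    "To: firstname.lastname@example.org\n"
    "Subject: Quarantine alert\n"
    "\n"
    "Message body\n"
)
print(flags_self_spoof(raw))  # True
```

A real gateway rule would also need to account for legitimate self-addressed mail (calendar invites, ‘send to self’ habits), so a check like this is best used as one signal among many rather than a hard block.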
Raising their game, again
Phishing attacks often try to mimic well-known brands by spoofing display names or using look-alike domains to add credibility to their attempts.
This attack displays a much higher level of sophistication than we typically see. It’s designed not only to trick the unsuspecting, but also to get around security awareness training that teaches users to check header information before interacting with an email.
While the spoofed company was an email security vendor this time, the strategy could be applied to other well-known brands as a way to establish trust and trip up even more switched-on end users.
It just goes to show how exploitable online services continue to be, and why organisations need to continually update security awareness programmes to keep pace with the latest exploits, malware, and phishing techniques.
Regardless of the level of sophistication, phishing attacks are on the rise. The latest numbers show that 83 percent of cybersecurity teams experienced phishing attacks last year, up from 76 percent in 2017.
If employees can be trained to spot phishing attempts as they happen, and to sustain that level of awareness, the risk of a security breach diminishes. The report noted that the nearly 60 percent of companies with a security awareness training programme saw an increase in detection once staff knew how to recognise potential attacks.
As we explained in a previous post, training is essential to minimising the insider threat, and simulations are a highly effective way to create teachable moments that increase retention and sensitise people to the latest risks. The pace of innovation in malware and phishing techniques is more evidence that training needs to be updated regularly.
And the manner of training is important. Forget about classrooms and textbooks. With detection resting on the ability to catch ever more nuanced trust-building techniques, simulations that put the end user directly in front of a new phishing attack are the best way to test their real-world reactions.
As they get better, so can you
Real-time phishing simulations have been shown to double cybersecurity awareness retention rates compared with more traditional training tactics.
Empowering your employees won’t happen overnight, however. Simulations need to be part of a broader programme of security awareness training where the focus is on showing instead of telling. Done right, they are a great way to strengthen a security-aware culture, and they provide employees with tangible, real-life scenarios to better understand their own security instincts when a new phishing email lands in their inbox.
Want to learn more about empowering your employees’ security defences? Why not sign up for a free demo and find out how we’re already helping organisations just like yours.