Steps to Mitigate Spear Phishing

Why Technical Solutions May Work Better Than Training
Eric Johnson

While many organizations rely on employee training to help mitigate the risks of spear phishing, such efforts are generally ineffective, says Eric Johnson of Vanderbilt University, who explains why a technical approach might work better.


Vanderbilt conducted a study that concluded "training exercises are not going to move the needle a lot," Johnson says in an interview with Information Security Media Group (transcript below).

"That really puts the ball back into the technical court," he says. "That is, how can we protect users from ever being able to make a decision on these things either by ensuring they never receive the e-mail to begin with, warning them appropriately in the e-mail, or by catching them as they click and preventing that connection from occurring?

In the interview, Johnson:

  • Describes the results of the new study on spear-phishing mitigation training;
  • Explains why embedded training isn't always effective; and
  • Discusses possible ways to get employees to avoid clicking on links in spear-phishing e-mails.

Before joining the Vanderbilt faculty last summer, Johnson served as associate dean for the MBA program and faculty director of the Glassmeyer/McNamee Center for Digital Strategies at Dartmouth College's Tuck School of Business. His teaching and research focus on the impact of IT on the extended enterprise. Through federal grants, Johnson studies how IT improves process execution, but also how security failures create friction throughout the extended enterprise. He also focuses on the role of IT in improving healthcare quality and reducing cost.

Why Training Doesn't Mitigate Phishing

ERIC CHABROW: Your research paper outlines the relative costs and benefits of spear-phishing campaigns. Tell us about that.

ERIC JOHNSON: One of the things we've learned over the years is that the weakest link in security is often the human being in the loop. Anytime you can somehow deceive a human being, it's often much easier to do that than to hack into something, and you can make a lot of money. ... If you just think about the crazy spam that we all get in our inboxes, most of that stuff does not generate lots and lots of revenue, but very targeted spear phishing can be highly effective. What makes it so effective is that it is so personal, and that creates much higher open rates, click-through rates and willingness to share sensitive information that can then be used and monetized. That's what makes spear phishing so lucrative.

CHABROW: Can you explain how some phishing attacks could net a profit of $150,000?

JOHNSON: It's really an order of magnitude more effective, and it really is only limited by the imagination and creativity of the attackers because, again, their research and creativity in building very, very focused attacks is what makes this so effective and potentially far more lucrative.

Security Professionals Phished

CHABROW: Can you talk about certain spear-phishing attacks where even some security companies have been hacked?

JOHNSON: That's the thing about this; even security professionals will click on links they shouldn't click on. They may not give up sensitive information, but they'll click on a link. There is something about clicking. My good friend, John Stewart at Cisco, said, "All links want to be clicked." There is just something in there, even for the most astute security folks, when you get a link that looks like it is real, looks like it came from a friend, has a compelling message, it's very hard to pull the finger back from the mouse.

Embedded Training

CHABROW: Why don't you first define what you mean by embedded training? And how widespread is its use in reducing the effectiveness of spear phishing?

JOHNSON: Spear phishing is just so effective, and there are, of course, technical ways we can address that. We try to strip links out of e-mails and automatically put warnings on links, these kinds of things. But the other side of that, of course, is trying to improve the human firewall - that is, train the users to better recognize suspicious situations or links. Embedded training is really a focused effort on that.

The idea is quite simple. Most of us are in companies where we do regular security training. Once a year, you've got to take the online test or you don't get your paycheck - something like that. While it makes everybody feel better that we've checked the box and done it, it doesn't prove to be all that effective. Embedded training is an idea that says, "Let's try to take the training to the point of need - that is, at the moment when a user is about to make a mistake or is making a mistake, can we provide some training?" Maybe they will pay a little more attention, maybe they'll really learn from it, and that is the whole idea of embedded training.

CHABROW: Is it widespread?

JOHNSON: It is widespread. I think what happened maybe four or five years ago, as companies began to see spear phishing rise, is that many of them started their own campaigns internally to try to warn against spear phishing. Sometimes ... that involves phishing their own employees. Then, if they click, they're taken to a site where they're kind of shamed or told that they made a mistake. Those kinds of interventions have been quite popular, though many argue it is not very effective simply to shame people. What we're looking at is the next step past that, which is, in that moment when they click on something inappropriate, taking them to a site that actually provides some training about why or how they should have been able to discern that this particular link was going to be a problem.

CHABROW: Do they get the training at that point?

JOHNSON: That's really how to interpret the embedded part: at the time the user makes a mistake, give them a little training. In this case, send them a phishing e-mail that is really designed to deceive them, and if they click on it, then show them why or how they should have been able to identify it as suspicious.
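
As a concrete illustration of that flow, the following is a minimal Python sketch (standard library only) of what an embedded-training landing service might look like: each simulated phishing e-mail carries a per-recipient link, and a click is logged and answered with point-of-need training rather than a shaming page. The host name, the "uid" token and the page copy are hypothetical examples, not the tooling used in the study.

    # Minimal sketch of an embedded-training flow. Assumes the simulated
    # phishing e-mails link to this server with a per-recipient token, e.g.
    # http://training.example.internal:8080/t?uid=alice (hypothetical).
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs
    from datetime import datetime

    TRAINING_PAGE = b"""<html><body>
    <h1>This was a simulated phishing e-mail</h1>
    <p>Clues you could have used: the sender's domain did not match the
    display name, the link text hid a different destination, and the message
    pressed you to act urgently.</p>
    </body></html>"""

    class EmbeddedTrainingHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            query = parse_qs(urlparse(self.path).query)
            user = query.get("uid", ["unknown"])[0]
            # Log the click so the campaign can track repeat "clickers".
            print(f"{datetime.utcnow().isoformat()} click by {user}")
            # Serve point-of-need training instead of a shaming page.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(TRAINING_PAGE)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), EmbeddedTrainingHandler).serve_forever()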

Is Training Effective?

CHABROW: And this is not always effective according to your research?

JOHNSON: Well that is what we find now. We ... did a large study inside of a company, involving 1,500 users over a period of time with multiple campaigns running. What we really find is that it's not that effective. It's one of these things that in many ways is shocking to us. We were thinking when we started out that we were going to find that embedded training worked, but sadly it doesn't seem to be all that effective.

CHABROW: Why do you suspect that?

JOHNSON: There are many reasons we've been hypothesizing about, and this is, of course, the fascinating part of the research. It seems like there are groups of people, particularly inside a corporate firewall, who just click on everything, and training doesn't seem to slow them down one iota. We certainly saw that in the research. We called them the "Clickers," and it didn't matter how much training you did, these people just kept clicking. There were other folks who were naturally, or maybe through their own learning, much more cautious, and they weren't clicking. That group really doesn't benefit so much from the training because they are already not clicking. It's hard to really understand why; I think it is just human curiosity at play. It's very hard to get folks, particularly when the deception is pretty good, to really step back for thirty seconds, look at it and say, "Is this something I should be clicking on?"

Protecting Users

CHABROW: Are there solutions to this?

JOHNSON: Our conclusion is that these types of training exercises, while maybe they're not completely useless, are not going to move the needle a lot. That really puts the ball back into the technical court, that is, how can we protect users from ever being able to make a decision on these things either by ensuring they never receive the e-mail to begin with, warning them appropriately in the e-mail, or by catching them as they click and preventing that connection from occurring.
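
For illustration, the following is a minimal Python sketch of the kind of controls Johnson describes: rewrite every link in inbound mail so the click routes through a checker that can warn the user or refuse the connection at click time. The redirector host, the blocklist and the function names are assumptions made for the example, not any specific product or the setup used in the study.

    # Minimal sketch: rewrite links at delivery time, then decide at click time
    # whether to allow, warn or block. Hosts and domains are hypothetical.
    import re
    from urllib.parse import quote, urlparse

    REDIRECTOR = "https://linkcheck.example.internal/redirect?url="
    BLOCKED_DOMAINS = {"evil-payroll-update.example.com"}
    URL_RE = re.compile(r"https?://[^\s\"'<>]+")

    def rewrite_links(email_body: str) -> str:
        """Wrap every URL in the message so the click hits the redirector first."""
        return URL_RE.sub(lambda m: REDIRECTOR + quote(m.group(0), safe=""), email_body)

    def check_click(original_url: str) -> str:
        """Decision the redirector makes when the user actually clicks."""
        host = urlparse(original_url).hostname or ""
        if host in BLOCKED_DOMAINS:
            return "block"   # prevent the connection from occurring
        if not host.endswith(".example.internal"):
            return "warn"    # interstitial: external link, proceed with care
        return "allow"

    if __name__ == "__main__":
        body = "Please review your bonus: http://evil-payroll-update.example.com/login"
        print(rewrite_links(body))
        print(check_click("http://evil-payroll-update.example.com/login"))  # -> block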

CHABROW: So the solution is a technical solution?

JOHNSON: It certainly feels that way. The hope that we could somehow train users to be smart about phishing attacks may be a false hope.

New Training Processes

CHABROW: Can you explain how you are going to combat that false hope?

JOHNSON: One of the things that is really interesting in our research ... is we've been looking at different ways to go about the training process itself - different messages we can provide users. There are still glimmers of hope in that. One of the things that we've learned is that depending on how you frame the message and the training to the user, it does have some small impact. That is, we find that users seem to be more motivated by protecting themselves against personal losses. In the training, pointing out to users that clicking on this link may have a bad impact on the company might not have that big an effect. But if you point out to users that they may themselves experience negative outcomes from clicking - something personal and at a cost to them - that does seem to have some impact on users and their memory of what is a dangerous link to click on.

CHABROW: Any final thoughts?

JOHNSON: As we look at embedded training, I think we'll really look at studying that particular piece more closely. How can you frame the message and make it more memorable and vivid to the individual user, to try to move the needle on their clicking activity?


About the Author

Megan Goldschmidt


Associate Editor

Goldschmidt is the former Associate Editor for ISMG. A recent graduate of Ithaca College, she has worked for multiple publications in NJ and NY, including the Trentonian and the Rochester Business Journal, which instilled a passion for writing, editing and social media.



