If you think technology is the answer to your organization’s cyber security threats, consider the following:
- Sending phishing emails to just 10 employees gets hackers inside corporate gates 90 percent of the time, according to Verizon Communications’ 2015 Data Breach Investigations Report. For the last two years, more than two-thirds of cyber-espionage incidents have featured phishing, in which attackers establish a foothold on user devices and then infiltrate the network.
- In the first half of 2015, malicious attachments became the go-to method for hackers to gain access, and a new wave of phishing attacks targeting businesses also emerged, according to a threat report from Proofpoint.
- The percentage of cyber-attacks targeted at employees jumped from 4 percent in 2007 to 20 percent in 2010, according to a KPMG study. More recent research estimates that 40 percent of cyber-breaches could have been avoided had employees been aware of the risks and taken appropriate action.
“The best technology on the market won’t help you if the bad guys get to your people,” says Mark Stone, CIO of the Texas A&M University System. Danny Miller, Texas A&M University System’s CISO, echoes that. “Even if you have the latest and best technology installed, one misstep by a user can throw it all out of the window.”
As more attacks are directed at employees, many organizations find themselves unprepared to react. They are also surprised by how difficult it is to change the behaviors of the people on the front lines.
Here are five ways to transform your people from cyber security liabilities to cyber safety assets.
- Realize it’s a change problem. Behavioral change is tough, and we cannot achieve a marked and long-lasting change with traditional communication, standard training, or mild promises of a better future. One reason is that these blanket approaches promote a diffusion of responsibility – people don’t think the problem lies with them. Think of sexual harassment and inclusiveness training; people attend courses because they must, but they think the training is targeted at someone else.
One chief information security officer confided that his biggest issue is that each department believes its own people are fine, and that the risk truly sits with “the knuckleheads in other departments.” His question: how do we get people to understand that cyber risk is not a reflection on a team’s integrity, but an individual mindset of vigilance against constant, undefined risk?
We can use behavioral science to help us. For example, research shows that we stay in a system until it no longer works for us personally. Psychologists Daniel Kahneman and Amos Tversky found that we feel the pain of loss more acutely than the pleasure of gain, so we unconsciously take greater risks and make bigger changes when confronted with painful situations. It’s why we stay in dead-end relationships and mind-numbing jobs: until the pain is too great, we do nothing.
So how do we get people to feel enough “pain” to change? How do we get our colleagues, employees, and affiliates to understand that every one of us is a weak link and our current behavior is dangerous?
“Scaring people is a tactic that I certainly use to get their attention,” says Chris Walter, CIO of Central Garden & Pet. “Getting permission from employees who have been targeted and then using those real examples… That really resonates.” Some companies use internal phishing exercises. Through real-life stories and simulations, employees understand viscerally that it can happen to them.
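For organizations that want to try such an exercise in-house, the mechanics are straightforward. Below is a minimal sketch in Python of how a simulated lure with per-recipient tracking might be built; the addresses, tracking endpoint, and SMTP setup are hypothetical, and most programs in practice use a dedicated phishing-simulation platform rather than hand-rolled scripts.

```python
# Minimal sketch of an internal phishing exercise: each employee receives a
# simulated lure carrying a unique tracking token, so clicks can be attributed
# and turned into a teachable moment rather than a reprimand.
# All names, addresses, and the tracking endpoint are hypothetical.
import uuid
import smtplib
from email.message import EmailMessage

TRACKING_URL = "https://phish-test.example.internal/landing"  # hypothetical landing page

def build_lure(recipient: str) -> tuple[EmailMessage, str]:
    """Create a simulated phishing email with a per-recipient tracking token."""
    token = uuid.uuid4().hex
    msg = EmailMessage()
    msg["From"] = "it-helpdesk@example.internal"  # internal-looking sender
    msg["To"] = recipient
    msg["Subject"] = "Action required: your password expires today"
    msg.set_content(
        "Your password expires in 4 hours. Verify your account here:\n"
        f"{TRACKING_URL}?t={token}\n"
    )
    return msg, token

def send_campaign(recipients: list[str], smtp_host: str = "localhost") -> dict[str, str]:
    """Send lures and return a token-to-recipient map for later click attribution."""
    tokens: dict[str, str] = {}
    with smtplib.SMTP(smtp_host) as smtp:
        for recipient in recipients:
            msg, token = build_lure(recipient)
            smtp.send_message(msg)
            tokens[token] = recipient
    return tokens

if __name__ == "__main__":
    # Dry run: build the lures without sending anything, just to inspect them.
    for person in ["alice@example.internal", "bob@example.internal"]:
        message, _ = build_lure(person)
        print(message["To"], "->", message["Subject"])
```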
Jeff Dalton, Information Security Officer for the Bank of Marin, recommends making it personal. “You wouldn’t want your personal information out on the web, would you? Be prudent when you surf the web or click on something. Relate the experience to that individual level.”
Walter says he helps leadership feel the urgency in a number of ways. “I tell the executives that the network we have is just as much a corporate asset as your plant. Imagine if your plant were hit by a tornado.”
- Link it to culture. Culture is made up of the unspoken rules by which decisions get made. MIT professor Edgar Schein says that when a group of people engage in a behavior and are successful, they repeat it. That constant repetition becomes culture. We might also think of culture as a collection of organizational habits.
It’s a powerful force, perpetuated by the company’s brand. People choose to work for Google or Coke because something about that brand resonates with them personally. And it’s tenacious; changing a culture is like changing the course of a river, and that takes dynamite. It’s easier not to swim upstream but to harness the power of the existing culture to change behaviors. We must link the new behaviors we want to what the culture already supports.
Behruz Nassre, VP Technical Operations, Security & Compliance for TubeMogul, an advertising software company, directly links security efforts to the company’s culture. “There are two to three things that are important here. First, we train people that if they say they are going to do something, they do it. Second, we do things fast and do not reprimand failure. We send that message with our internal hacking attempts. If you fail, it’s OK so long as we learn from it and move on.” He also links security efforts to the coding sprints their developers already do. “We encourage our programmers to look at security output as a reflection of quality rather than risk. In the same way there’s a bug and a fix, security bugs found through static code analysis or vulnerability scanning are a quality issue to address.”
- Make it familiar, controlled, and successful. Familiar: We evaluate all new situations by comparing them to what we already know or have experienced. If something is familiar and we judge it safe, we are more likely to do it. As we try to change risky cyber behavior, we should consider what our constituents might compare this effort to, and create links that make sense to them. For example, in a health environment, caregivers vigilantly wash their hands; asking people to pause before clicking is the electronic equivalent.
When Patrick Wilson, Chief Information Security Officer and Associate Director of Clinical Applications at Contra Costa Health in California, implemented a mandated password format change, he said, “One metaphor I used was comparing the complexity of an eight-character password to walking across the Golden Gate Bridge. Changing it to 12 characters is like walking from the Golden Gate to the Statue of Liberty – it’s that much more difficult to breach.”
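The metaphor maps onto simple arithmetic. Assuming an attacker must brute-force across the full set of 95 printable keyboard characters (an illustrative assumption, not a figure from Wilson), a quick calculation shows what the four extra characters buy:

```python
# Back-of-the-envelope brute-force math behind the 8-versus-12-character comparison.
# Assumes the full set of 95 printable ASCII characters; real policies vary.
ALPHABET = 95

eight = ALPHABET ** 8    # possible 8-character passwords (~6.6e15)
twelve = ALPHABET ** 12  # possible 12-character passwords (~5.4e23)

print(f"8 characters:  {eight:.2e} combinations")
print(f"12 characters: {twelve:.2e} combinations")
print(f"Going from 8 to 12 multiplies the search space by {twelve // eight:,}x")
```

Each added character multiplies the search space by 95, so moving from 8 to 12 characters makes a brute-force attack roughly 81 million times more expensive.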
Controlled: In a chaotic world, we seek structure and predictability. We can handle devastating news, even a cancer diagnosis, if we know what to expect and have specific actions to manage our situation. We must design cyber programs with this in mind. In many ways, it’s like planning a typical IT deployment but for the employee’s experience, so you and they know what to expect and feel in control.
In the case of cyber risk, this starts with clearly defining the behaviors we need from our employees – what we want them to do (behavior), when they should do it (trigger) and the confirmation that it worked (reinforcement). And we must schedule these activities so they layer systematically, establishing new habits.
A program conditioning people to recognize undefined, potential hazard might look like this:
- Month 1: Recognize cyber risk generated by others.
- Month 2: Recognize the cyber risks I create.
- Month 3: Recognize risks inherent in my environment.
Each week would focus on one simple behavior, with the associated trigger and acknowledgement. Like so:
Month 1: Recognize cyber risk generated by others.
- Week 1: Pause before clicking attachments.
- Week 2: Pause before opening external email.
- Week 3: Call IT if suspicious.
- Week 4: Tell the requestor you’ll call them back.
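One way to keep that layering systematic is to write each week down as behavior, trigger, and reinforcement in a single place that a reminder tool or awareness calendar can read. Here is a minimal sketch using the Month 1 items above; the field names and week-selection logic are illustrative assumptions, not part of any prescribed program:

```python
# Sketch: the Month 1 schedule above, encoded as behavior / trigger / reinforcement
# so a simple reminder job or awareness calendar can surface one focus per week.
# Field names and the week-selection logic are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class WeeklyFocus:
    week: int
    behavior: str        # what we want people to do
    trigger: str         # the cue that should prompt the behavior
    reinforcement: str   # the confirmation that it worked

MONTH_1 = [
    WeeklyFocus(1, "Pause before clicking attachments",
                "An email arrives with an attachment",
                "Attachment confirmed as expected, or reported"),
    WeeklyFocus(2, "Pause before opening external email",
                "The sender is outside the organization",
                "Message verified or deleted"),
    WeeklyFocus(3, "Call IT if suspicious",
                "Anything about a message feels off",
                "IT acknowledges the report"),
    WeeklyFocus(4, "Tell the requestor you'll call them back",
                "An unexpected request for data or credentials",
                "Request verified over a known phone number"),
]

def current_focus(program_start: date, today: date) -> WeeklyFocus:
    """Return this week's focus, capped at the last defined week."""
    weeks_elapsed = max((today - program_start).days // 7, 0)
    return MONTH_1[min(weeks_elapsed, len(MONTH_1) - 1)]

print(current_focus(date(2015, 9, 7), date(2015, 9, 21)).behavior)  # -> "Call IT if suspicious"
```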
Successful: Finally, people adopt new behaviors if they believe they will be successful. In the research, this is described as “outcome primacy”: our first experience has a “substantial and lasting effect on subsequent behavior” (Journal of Experimental Psychology). For example, if someone starts a diet and loses weight in the first week, he or she will continue the diet. We must engineer success for users as they practice and start to use the right cyber behaviors: make sure they understand when they have done it right, then repeat that experience over and over so the behavior is naturally reinforced.
- Don’t communicate; focus attention. Many organizations create extensive communication programs, but each wave of communication is competing for individuals’ attention. We filter out “noise” and pay attention to what is clear and relevant to us. Our challenge then is to focus attention so that our information about safe cyber behavior gets through our people’s selective filters.
One of the most effective ways is to get the organization “on-message” about the program, much like a political campaign. If team members can describe the effort passionately, without a PowerPoint, using their own examples, we win.
Think of the message as a square anchored by four words: one for the current situation, one for the solution, one for the method of getting there, and one for the result. For example, these words might be:
- Current: Vulnerable – Our current way of working is broken and we are at risk.
- Solution: Discerning – Employees should easily decide what’s nefarious.
- Method: Questioning – Employees should think about whether each action is risky.
- Results: Nimble – Our organization responds quickly and appropriately to relentless threats.
It’s imperative that key people agree on those words; the debate itself will help internalize them. The square shape helps too: visualization makes the message memorable. Examples supporting the words must come from the team working on the message. And as long as the words remain constant, the team will be able to describe what they’re doing consistently in every conversation, from the coffee shop to the boardroom.
- Measure and benchmark behaviors. It always comes down to accountability, at both the organizational and individual levels.
Nassre uses a variety of tactics. “We put out a monthly security report to our execs that includes a product and IT perspective, and a physical security perspective. These are the number of machines hit by viruses, how they were hacked, the campaigns we ran, the total number of people who clicked when they shouldn’t have, the number of bad passwords we found. We’re looking at trends.”
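A roll-up like the one Nassre describes can start as nothing more than monthly counts with month-over-month deltas. A minimal sketch, with hypothetical metric names and figures:

```python
# Sketch of the monthly roll-up Nassre describes: raw counts per month plus
# month-over-month deltas so executives can see the trend, not just a snapshot.
# Metric names and figures are hypothetical.
from collections import OrderedDict

monthly_metrics = OrderedDict([
    ("2015-06", {"infected_machines": 14, "phish_clicks": 37, "weak_passwords": 52}),
    ("2015-07", {"infected_machines": 9,  "phish_clicks": 28, "weak_passwords": 41}),
])

def trend_report(metrics):
    """Print each month's counts and the change from the previous month."""
    previous = None
    for month, counts in metrics.items():
        parts = [month]
        for name, value in counts.items():
            delta = "" if previous is None else f" ({value - previous[name]:+d})"
            parts.append(f"{name}={value}{delta}")
        print("  ".join(parts))
        previous = counts

trend_report(monthly_metrics)
```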
The approach taken by Wilson is similar. “We discuss the number of infections, number of inquiries regarding investigations, the number of computers needing to be rebuilt due to malware.” He added, “We do a lot of our own assessments internally. There’s a physical audit where we go to a new site each month and act as normal patients and talk with the local onsite management to discuss what they did extremely well and what they didn’t. Many organizations forget about the physical site.”
There are traditional benchmark sources like (ISC)2, SANS Institute and Brian Krebs. But, as Wilson observed, many organizations are overlooked by the larger benchmarking firms, and take inspiration from peer groups. “We meet with other facilities of the same size and revenue range.”
The importance of the right cybersecurity behaviors cannot be overstated. And pulling the right behavioral levers will make all the difference to your company’s security.
Trish Emerson is the co-author of three books: The Change Book, The Learning & Development Book, and The Technology Change Book. She runs Emerson Human Capital Consulting, which has the largest full-time US workforce focused exclusively on changing and sustaining behavior. www.emersonhc.com