Adding the human element to cybersecurity strategy

Sept. 20, 2019
A self-aware workforce, combined with advanced technology, is key to a secure environment

The Myers-Briggs Company and ESET recently conducted a study showing that 42% of businesses focus on delivering compliance training as part of their cybersecurity protocol, while over 63% use passwords as a gatekeeper of their systems. These are certainly indispensable elements of cybersecurity, but they don’t account for the fact that many breaches are largely attributable to human error. Cybersecurity is typically thought of as a job best left to the experts, yet many breaches could be avoided if organizations took a more integrated approach.

One key component that most organizations are missing from their cybersecurity strategy: self-awareness. This goes beyond a general awareness of the problem to include a deep understanding of how individuals may be vulnerable to cybercrime. And while cybersecurity may be thought of as an IT concern, it affects the bottom line--the conversation needs to take place at all levels of the business, and it requires an integrative human/machine approach.

The Nature of Cybercrime

Cybercrime is inherently amorphous and unpredictable. Cybercriminals can be anyone, anywhere, and their motives aren’t always clear. Furthermore, the range of strategies they may pursue is almost limitless. Nevertheless, today’s threats can be broadly classified by a few trends:

●     Formjacking--injecting malicious code into online forms to skim data, for example, credit card details as they’re entered by customers of online retail sites.

●     PowerShell attacks--malicious scripts that run through the legitimate PowerShell shell, often spreading via compromised supply chains. The script disguises itself within a ‘safe’ process (the ‘shell’) and phishes for data and/or intelligence.

●     IoT Attacks--targeting smart, connected devices, which are often overlooked as ‘points of entry’.

In an alarming cycle, we see that the next advance in cybercrime often hits us before we’ve even had a chance to fully understand the last one. Unconstrained by the restrictions which govern legitimate software developers and white-hat hackers, cybercriminals can implement changes at unrivaled speed, enabled by such rapidly evolving technologies as artificial intelligence. For example, before security experts could get a handle on ransomware, it was supplanted by more direct methods of skimming cash or stealing data. In a business environment where ‘agility’ is the rallying cry, cybercriminals put even the most rapidly innovating companies to shame.

While it may be tempting to throw our hands in the air in resignation, the fact is that most successful cyber attacks have something important in common: they rely on a degree of human error and/or ignorance. For example, cybercriminals can push phishing attacks onto systems through smart speakers such as Alexa because people don’t recognize these devices as points of entry for attackers.

Or consider the lingering view of cybercrime as something done for fun by malcontents. In fact, in recent, soon-to-be-published research by The Myers-Briggs Company, almost 20% of respondents did not agree that a data breach would have serious consequences for their organization.

This mindset allows people to let their guard down, viewing a potential virus as little more than an annoyance. In reality, of course, the motives of cybercriminals may be insidious, and the associated losses can be astronomical, with far-reaching consequences such as direct financial loss (through theft, downtime, etc.), brand damage and fines.

A New, Holistic Approach

Clearly, companies need a holistic approach that factors in not just the strengths and weaknesses of their technology systems, but of human team members as well. In order to understand the ‘human vulnerabilities’, it is necessary to understand humans, and one of the best ways to do this involves psychometric assessment. Among other things, this can inform how training can be better tailored to the needs of the various individuals on the team. For example, The Myers-Briggs Type Indicator model identifies four dimensions of personality:

●     Where you focus your attention (Extraversion or Introversion)

●     What sort of information you prefer (Sensing or Intuition)

●     How you prefer to make decisions (Thinking or Feeling)

●     How structured you like your life to be (Judging or Perceiving)

Our study shows that these dimensions of personality type influence both how we may be vulnerable to different kinds of cybersecurity errors, and how we may in other instances react effectively to threats:

●     People with more Extraverted personality types (who focus on the outer world of people and activity) tend to be more vulnerable to ‘social engineering’ attacks that leverage manipulation, deceit, and persuasion. Being highly tuned towards external communication has advantages though, and people in this group tend to more quickly pick up on external attacks than others. 

●     People with a preference for Sensing (who tend to observe and remember details) are more likely to spot phishing attacks than their bigger picture-oriented Intuitive counterparts. However, they are also more likely to take cybersecurity risks--this is particularly true of those who have combined preferences for Perceiving (and tend to be more flexible and casual) and/or Extraversion.

●     People with a preference for Feeling (those guided by personal values) and people with a preference for Judging (those who are systematic or structured) are more likely to fall victim to social engineering attacks than more analytically minded folks with a preference for Thinking. However, people who prefer Thinking may overestimate their own competence, which makes them vulnerable to mistakes. On the other hand, those with preferences for Judging and/or Feeling tend to be more cautious when following cybersecurity policies.

Cybercriminals Understand Our Psychology: We Need to as Well

Make no mistake: cybercriminals are very good at reading the blind spots of those they prey on. “This sort of analysis is second nature to a new wave of threat actors who are well aware that techniques such as these create a far wider reach than old-fashioned linear phishing emails,” said Jake Moore, Cybersecurity Specialist, ESET. “The more we are aware of such threats and how they work, the more equipped we are to thwart such attacks and better protect our assets.”

Understanding that all personality types have different strengths and blind spots that can make them vulnerable to attacks can help organizations develop more effective, integrated and coherent security protocols. For example, for employees with a preference for Intuition (the opposite of Sensing), training may benefit from emphasizing the need to look for specific, detailed cues, such as odd email addresses. Additionally, training programs can be tailored to maintain the interest of different personality types; discussions and experiential activities might appeal more to those with a personality preference for Extraversion, for example, while those who prefer Introversion may respond positively to written exercises and require more time for reflection.

All these tactics boil down to one thing--self-awareness. The more aware people are of their own blind spots, the more they’re empowered to take appropriate action and responsibility for their own role in the organization’s cybersecurity. A self-aware workforce combined with a solid technology approach will make it much, much more difficult for cybercriminals to penetrate your defenses.

About the author: John Hackston, head of thought leadership at The Myers-Briggs Company, is a chartered psychologist with more than 30 years’ experience in helping clients to use psychometric tests and questionnaires in a wide range of contexts including selection, leadership development, performance management and team building.