As security consultants charged with improving program effectiveness, one of the most frequent questions we hear is whether a security program is mature. Rather than fostering discussion around what maturity actually means, this often descends into questions of benchmarking. In psychology this is called Social Comparison Theory – in short, evaluating yourself based on what the other guy is doing. But treated as a standalone business case for security investments, benchmarking is a fraught way of looking at the world, for several important reasons.
Check Your Metrics
One of the critical pitfalls of benchmarking is evaluating the wrong metrics. In the security space, this often means falling back on the lowest common denominator: headcount and budgets. It is tempting to argue that because a peer program spends $30M a year on security and staffs more than 70 full-time employees, you should too. But these are hardly helpful metrics; they are tied to arbitrary cost-centric numbers rather than specific outcomes or goals. They are unlikely to resonate with executive sponsors, and they encapsulate why professionals often say that bad or misleading data is worse than no data at all.
SOP Doesn’t Always Equal Success
Likewise, just because a large firm in a comparable industry does something a certain way does not mean it is “right.” Security teams across the Fortune 500 often fail at the basics: manufacturing plants lack business continuity plans, security leadership is buried within the organization chart, and crisis management teams do not properly tier and escalate incidents. Reporting relationships and staffing models can be dysfunctional, and just because a practice exists does not mean it should be emulated. Yet it is precisely this kind of data that often finds its way into benchmarking exercises. Benchmarking against large but less mature security organizations is not a useful exercise and can create the wrong narrative for leadership.
Apples to Oranges
Perhaps most importantly, different organizational charts, company cultures, business models, and supply chains render most security programs apples-to-oranges comparisons, despite similar mandates. Size, geographic location, unique value streams, and asset prioritization all link with broader business objectives to frame how a company tolerates risk and then builds, implements, and continuously assesses a security program. These elements are rarely the same between two companies, and the categorical differences ultimately render many of the standard takeaways from benchmarking exercises irrelevant. The Federal Bureau of Investigation makes a similar recommendation in its annual Uniform Crime Reporting data release, discouraging users from drawing comparisons between jurisdictions because of the myriad factors that shape the underlying data.
Security KPIs: Okay for Business, but Different for Security
This is not to say that benchmarking has no value. When used to answer nuanced and confined problem sets, security leaders can learn what has led to successful outcomes in other programs and apply best practices where relevant. But security leaders would do well to remember that the nature of security strategy and operations differs from that of other functional areas they interact with daily. Corporations measure themselves in terms of quality, time, and cost – key inputs to profitability that are relatively easy to benchmark against. Security teams operate under a different set of Key Performance Indicators, serving as a business enabler rather than a direct, easily quantifiable driver of bottom-line profitability.
Benchmarking for Success
What can be done? Cybersecurity programs have for years benefitted from various maturity models, from the National Institute of Standards and Technology (NIST) Cybersecurity Framework to the Department of Defense’s new Cybersecurity Maturity Model Certification (CMMC). Physical security industry groups and standards organizations should follow suit, a process that is already well underway.
Later this year, for example, ASIS International will unveil its Enterprise Security Risk Management (ESRM) Maturity Model Self-Assessment. While designed with simplicity in mind, this is one step toward properly evaluating security maturity. Various niche functional areas also benefit from maturity frameworks. Insider threat, for example, has multiple models in place, including those developed by Carnegie Mellon and the National Insider Threat Task Force (NITTF). Third-party consultancies, for their part, can also play an important independent role in assessing organizational models from the outside and communicating best practices.
Together, this means security programs should reevaluate how they think about effectiveness. Key priorities, investments, and overall improvement should increasingly be framed through the lens of accepted maturity models and less by what peer companies are doing. This approach is far more likely to generate useful insights that guide a risk-based deployment of people, processes, and technology across all types of security organizations.
About the author: Brogan Ingstad is a Vice President with Teneo Security Risk Advisory. He supports Fortune 500 clients with security and intelligence program development, business resiliency frameworks, and risk mitigation. Brogan brings a decade of experience conducting CSO organizational reviews, site-level physical security surveys, and threat and vulnerability assessments of global security programs.
Previously, he served as a strategic security and risk management consultant at The Chertoff Group in Washington, D.C., addressing the varied security, risk, and business continuity challenges affecting international companies with dispersed people and assets. Prior to Chertoff Group, Brogan was head of research at global business intelligence and country risk firm Oxford Business Group, based in the Middle East, Africa, and Latin America. Brogan has additionally supported public-private sector collaboration through the United States Department of Homeland Security’s Analytic Exchange Program (AEP), focusing on the nation’s critical cargo and port security challenges.
Brogan received his bachelor’s degree in International Business from the McCombs School of Business at the University of Texas at Austin and a Master of International Public Policy (MIPP) at Johns Hopkins University’s School of Advanced International Studies (SAIS).
Brogan is a Certified Protection Professional (CPP) with ASIS International and a Certified Risk Management Professional (CRMP) with DRI International.