Debunking vulnerability assessment myths: Part 2

Aug. 13, 2013
Characteristics to look for in vulnerability assessors and what to include in a VA report

Editor's note: This is part two in a two-part series on clearing up misconceptions about vulnerability assessments (VAs). Part one examines common myths associated with VAs. Part two looks at the traits good vulnerability assessors have and what should be included in a VA report.

The old adage that "it takes a thief to catch a thief" has some merit for vulnerability assessments (VAs). This isn’t to say you should hire a bunch of felons to look at your security. What it does mean is that the vulnerability assessors need the right mindset. Design engineers and people with lots of brains and security experience aren’t automatically good at doing VAs. After all, if you are thinking like all other security professionals instead of thinking like the bad guys, you’re unlikely to be able to predict what they might do. It can be surprisingly hard for engineers and security professionals to think like the bad guys when they have spent their lives and careers desperately wanting security to work.

Vulnerability assessors should be psychologically predisposed to finding problems and suggesting solutions, and ideally have a history of doing so. In our experience, the best assessors have a hacker mentality and tend to be highly creative, narcissistic, skeptical/cynical, questioners of authority, loophole finders, hands-on types, smart alecks/wise guys, as well as people skilled with their hands (e.g., artists, artisans, craftspeople) who are interested in how things work.

Another old adage also applies well to VAs: "A prophet is never honored in his own land." As we can personally attest, there is a lot of "shoot the messenger" syndrome associated with identifying security problems. While vulnerability assessors are sometimes called "red teamers" (a term from the Cold War era) or "black hatters" (from cowboy westerns), they are also often called worse things that can’t be repeated in polite company.

Doing a VA for your own organization can be a threat to your career, or at least place real or perceived pressure on the assessors not to find vulnerabilities. This is one of the reasons that vulnerability assessors should ideally be chosen from outside the organization. Wherever they come from, however, they must be independent and free to report whatever they discover. There can be no conflicts of interest. They cannot be advocates for the security product or program under study, nor benefit from its implementation.

The VA Report

The difficult part of any VA isn’t finding vulnerabilities and suggesting countermeasures; it’s getting security managers and organizations to do something about them. In physical security, unlike cyber security, making changes is sometimes unhelpfully viewed as admitting to past negligence.

The good things should be praised at the start of the VA report, both because we want them to continue (they may have happened by accident) and because praise prepares the reader psychologically to hear about problems. It is also important to at least suggest possible countermeasures in the report; security managers and organizations will be reluctant to confront security problems if there aren’t at least some preliminary fixes available. (Often, however, security managers can start from the assessors’ suggestions and devise more practical countermeasures than the vulnerability assessors themselves.) Findings should be reported to the highest appropriate level without editing, interpretation, or censorship by middle managers or others fearful of what the report may say.

The written VA report should also include all of the following (one way to organize these elements is sketched after the list):

  • Identity and experience of the vulnerability assessors
  • Any conflicts of interest
  • Any a priori constraints on the VA
  • Time and resources used
  • Details, samples, demonstrations, and videos of attacks
  • Time, expertise, and resources required by an adversary to execute the attacks
  • Possible countermeasures
  • A sanitized, non-sensitive summary of the findings if the sponsor wishes to take public credit for the VA; statistics are helpful.
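
For teams that track assessments electronically, the checklist above maps naturally onto a simple structured record. Below is a minimal sketch in Python of one possible way to capture those elements; the VAReport and Attack classes, their field names, and the completeness check are our own illustration, not a standard or required format.

    from dataclasses import dataclass, field

    @dataclass
    class Attack:
        """One demonstrated attack, plus the adversary resources it requires."""
        description: str
        evidence: list[str] = field(default_factory=list)  # samples, demos, videos
        adversary_time: str = "unknown"        # e.g., "minutes with hand tools"
        adversary_expertise: str = "unknown"   # e.g., "low-tech, no insider help"
        adversary_resources: str = "unknown"   # cost, tools, personnel needed

    @dataclass
    class VAReport:
        """Checklist-style container for the elements a written VA report needs."""
        assessors: list[str]                # identity and experience of the assessors
        conflicts_of_interest: list[str]    # an empty list means "none declared"
        a_priori_constraints: list[str]     # limits imposed on the VA in advance
        time_and_resources_used: str        # what the assessment itself consumed
        attacks: list[Attack]               # details of each demonstrated attack
        countermeasures: list[str]          # possible fixes, even preliminary ones
        public_summary: str = ""            # sanitized summary, if credit is sought

        def missing_elements(self) -> list[str]:
            """Flag required sections that are still empty before sign-off."""
            required = {
                "assessors": self.assessors,
                "time and resources used": self.time_and_resources_used,
                "attacks": self.attacks,
                "countermeasures": self.countermeasures,
            }
            return [name for name, value in required.items() if not value]

A check like missing_elements only guards against omitting a section; the substance of each entry still has to come from the assessment itself, and the sanitized public summary need only be written if the sponsor actually wants public credit.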

Other Misunderstandings About VAs

There are other VA problems and mistakes that should be avoided. One of the most common is sham rigor: the belief that the VA process can be done in a rigorous, formalistic, linear, reproducible, and/or quantitative manner. In fact, effective VAs are creative, right-brain exercises in thinking like somebody you’re not (the bad guys). The VA process is difficult to characterize formalistically, reproduce, or automate (see the Vulnerability Pyramid).

Another common VA mistake is to focus on high-tech attacks. In our experience, relatively low-tech attacks work just fine, even against high-tech devices, systems, and programs. It is also a big mistake to let the good guys and the existing security infrastructure and strategies define the problem; the bad guys get to do that. We must likewise be careful not to let envisioned attack methods solely define the vulnerabilities; it ultimately has to work the other way around, with the vulnerabilities suggesting which attacks to worry about.

Placing arbitrary constraints on the VA in terms of scope, time, effort, modules, or components is also a common mistake. Often, software experts are brought in to look at the software, mechanical engineers to look at the physical design, electronics experts to examine the electronics, etc. While there is nothing wrong with using experts, the fact is that many attacks occur at the interface between modules or between disciplines. An effective VA needs to employ a holistic approach and people who can think holistically.

There is nothing wrong with testing and certifying security devices, systems, and programs, assuming the tests and certifications are relevant, meaningful, and well thought through. But testing and certifying are quite different from undertaking a vulnerability assessment. Be sure you understand what a vulnerability assessment is and is not, how it should be done and by whom, and why it is important to do it.

About the Authors: Roger G. Johnston, Ph.D., CPP, is leader of the Vulnerability Assessment Team at Argonne National Laboratory. He was founder and head of the Vulnerability Assessment Team at Los Alamos National Laboratory from 1992 to 2007. Roger graduated from Carleton College (1977), and received M.S. and Ph.D. degrees in physics from the University of Colorado (1983). He has authored over 170 technical papers and 90 invited talks (including six keynote addresses), holds 10 U.S. patents, and serves as editor of the Journal of Physical Security.

Jon S. Warner, Ph.D., is a systems engineer with the Vulnerability Assessment Team at Argonne National Laboratory. From 2002 to 2007, he served as a Technical Staff Member with the Vulnerability Assessment Team at Los Alamos National Laboratory. His research interests include vulnerability assessments, microprocessor and wireless applications, nuclear safeguards, and developing novel security devices. Warner received B.S. degrees in Physics and Business Management from Southern Oregon University (1994), and M.S. and Ph.D. degrees in physics from Portland State University (1998 & 2002).