Editor's note: This is part two in a two-part series on clearing up misconceptions about vulnerability assessments (VAs). Part one examines common myths associated with VAs. Part two looks at the traits good vulnerability assessors have and what should be included in a VA report.
The old adage that "it takes a thief to catch a thief" has some merit for vulnerability assessments (VAs). This isn't to say you should hire a bunch of felons to look at your security. What it does mean is that vulnerability assessors need the right mindset. Design engineers and people with lots of brains and security experience aren't automatically good at doing VAs. After all, if you are thinking like every other security professional instead of thinking like the bad guys, you're unlikely to predict what they might do. It can be surprisingly hard for engineers and security professionals to think like the bad guys when they have spent their lives and careers desperately wanting security to work.
Vulnerability assessors should be psychologically predisposed to finding problems and suggesting solutions, and ideally have a history of doing so. In our experience, the best assessors have a hacker mentality and tend to be highly creative, narcissistic, skeptical/cynical, questioners of authority, loophole finders, hands-on types, smart alecks/wise guys, as well as people skilled with their hands (e.g., artists, artisans, craftspeople) who are interested in how things work.
Another old adage also applies well to VAs: "A prophet is never honored in his own land." As we can personally attest, there is a lot of "shoot the messenger" syndrome associated with identifying security problems. While vulnerability assessors are sometimes called "red teamers" (from the Cold War era) or "black hatters" (from cowboy westerns), they are also often called worse things that can't be repeated in polite company.
Doing a VA for your own organization can be a threat to your career, or at least place real or perceived pressure on the assessors not to find vulnerabilities. This is one of the reasons that vulnerability assessors should ideally be chosen from outside the organization. Wherever they come from, however, they must be independent and free to report whatever they discover. There can be no conflicts of interest: they cannot be advocates for the security product or program under study, nor benefit from its implementation.
The VA Report
The difficult part of any VA isn't finding vulnerabilities and suggesting countermeasures; it's getting security managers and organizations to do something about them. In physical security, unlike cyber security, making changes is sometimes unhelpfully viewed as admitting to past negligence.
The VA report should open by praising what is being done well, both because we want those good practices to continue (they might be accidental) and to prepare the reader psychologically to hear about problems. It is also important to at least suggest possible countermeasures in the report; security managers and organizations will be reluctant to deal with security problems if no preliminary fixes are on offer. (Often, however, security managers can devise more practical countermeasures than the vulnerability assessors, starting from their suggestions.) Findings should be reported to the highest appropriate level without editing, interpretation, or censorship by middle managers or others fearful of what the report may say.
The written VA report should also include all the following:
- Identity and experience of the vulnerability assessors
- Any conflicts of interest
- Any a priori constraints on the VA
- Time and resources used
- Details, samples, demonstrations, and videos of attacks
- Time, expertise, and resources required by an adversary to execute the attacks
- Possible countermeasures
- A sanitized, non-sensitive summary of the findings (statistics are helpful) if the sponsor wishes to take public credit for the VA