It was time for another foray into the intractable Washington, DC, traffic scene. I had finished a long day of meetings, e-mails, cell phone calls and writing projects. I was tired. The day was winding toward evening, but for me it wasn’t yet time to rest. I had an evening graduate class to teach at the university. I jumped in my car at the office and took off down the thoroughfare toward the urban campus.
Fortunately, I was driving against traffic. As motionless lines of frustrated commuters sat idling in the lanes headed for the suburbs, I made decent progress into the city with fellow motorists who were at least moving, if not at highway speeds. I finally wheeled into the parking garage with 10 minutes to spare.
As I grabbed my briefcase, laptop and umbrella out of the trunk, one of my students who had also just parked called out my name. I bumped my head on the trunk lid.
“Hey, professor! Sorry for the surprise. I brought in a paper for our discussion topic tonight!” He beamed as he waved a thick stack of printed slides.
“Great,” I said, rubbing my temple. “You found something interesting on information security methodologies?”
“Sure,” he said with a mischievous grin. “What’s cool about this methodology is that this guy says people like you don’t know what you’re doing.”
“Did he mention me by name?” I asked.
“No, but he basically says structured analyses and formal risk assessment processes don’t work—that they are snake oil,” he said, obviously trying to get a rise out of me.
“It sounds like you couldn’t wait to get to class tonight with this,” I laughed. “Let me take a look at it while you take the quiz I promised last week.”
He gleefully dropped the materials into my arms and went inside to find a seat. In class, I handed out the quiz sheets and exam booklets, then settled down to look over the materials he had left with me. I was amazed at what I was reading, but my jaw bounced off the table when I came to these sentences at the end of the abstract:
“We must eliminate risk from our information security concepts if we are to put our security art on a firm footing. We must replace the objective of reducing intangible risk with a new, more positive one of achieving due diligence and good practice by applying safeguards consistent with our new concept of information security as a business enabling function to meet the requirements of increasing legislation and standards.”
I flipped back to the cover to see what security heretic could have written such seditious material. I was taken aback when I found the writer was one of the eminent academics and researchers of the Golden Age of computer security. I read everything he had published back when I was a young researcher working in the federal government. And here he was, saying security is an art and not a science, advocating abandonment of empirical risk analysis in favor of “good practices” (he claims “best practices” do not exist), regulatory compliance, and security practitioners’ best efforts.
I wondered what could have possessed him to make such a claim. I read on, hoping to find the answer.
“Researchers are attracted to (risk analysis) because it is ‘mathematical,’ only partially developed, and unproved. Using due diligence is uninteresting and routine, and it is completely developed. There is no more research or development to be done other than collecting, standardizing, and documenting the hundreds of already known safeguards. It leads directly to selection of safeguards based on what others are already doing. It is not leading edge or experimental. It is quick, low cost, and not particularly dramatic.”
I was now flabbergasted. Because the use of mathematics in evaluating risk was only partially developed, we should no longer practice it?
Physics isn’t a completely developed field either; does that mean physicists should throw up their hands and go back to alchemy?
However, the real surprise was his recommendation to apply safeguards based on what others are already doing. As I read his defense of this practice, I kept hearing my mother’s voice asking me if I would be willing to jump off a bridge because all my friends were doing it. I then returned to my reading to determine what metrics—“good practices” in his parlance—he would use to determine what safeguards to employ. I found his answer buried in the latter part of the text. Here is what he said:
“For example, an organization may discover that all four of its competitors have installed firewalls in circumstances similar to its own, the product providers have sold 50,000 firewalls of the type it is evaluating, and two independent trade journal evaluators have judged the product to be the best available. No risk assessment is necessary to support a ‘good practice’ conclusion after considering many other factors, such as available funds and acceptance by operations staff, that we should install the firewall along with effective system administration.”
As I understood it, he was advocating a wait-and-see security posture—wait for a significant number of your business competitors to make the security commitment, for the vendor to sell a large number of its products, and then follow everyone else. That would be considered a “good practice” in his approach.
Ironically, he does not address what would happen if every security practitioner followed this advice. I suppose we’d all be sitting around waiting for the other guy to make a move. That would not be effective proactive security management.
The author also calls on security practitioners to be keenly attuned to regulatory compliance. This approach ensures your organization is following laws related to security and privacy. Apparently, lawyers and politicians know what is best for your organizational security.
The author states:
“New legislation and regulations requiring specific safeguarding are proliferating such as the Sarbanes-Oxley Act, California SB1386 Privacy Violation Reporting Act, Gramm-Leach-Bliley Act (GLBA) that applies to the U.S. financial industry, and the HIPAA that applies to the health industry are recent examples. (Some of these sources specify risk assessments; however, due diligence may be easily substituted.)”
In this pronouncement, he seems to imply that you should rely heavily on legal requirements, except when they require a risk assessment. In that case, he wants you to ignore the legal requirement and substitute his process of due diligence and good practice.
When I was learning to drive, my father would chide me for not thinking ahead. I would tell my father he was being unfair, that I was doing everything perfectly legally. He would reply that I could either be right, or I could be dead right. I recall that lesson to this day.
Wait-and-see security may be acceptable for some organizations, but I suspect the decision makers most of you support would not want this approach for their security program. Compliant or not, they would not want to end up dead right. All security programs are about risk management, and risk management means probabilities and the need to develop and justify quantifiable analyses.
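The quantifiable analysis I have in mind is the kind taught in any introductory risk course: the annualized loss expectancy (ALE) calculation. Here is a minimal sketch; the asset values, exposure factors, and occurrence rates below are invented purely for illustration, not drawn from any real assessment.

```python
def annualized_loss_expectancy(asset_value, exposure_factor, annual_rate):
    """Classic quantitative risk formula: ALE = SLE * ARO,
    where SLE (single loss expectancy) = asset value * exposure factor
    and ARO is the annualized rate of occurrence."""
    single_loss_expectancy = asset_value * exposure_factor
    return single_loss_expectancy * annual_rate

# Hypothetical scenario: a $500,000 customer database, a breach that
# destroys 30% of its value, expected once every two years (ARO = 0.5).
ale = annualized_loss_expectancy(500_000, 0.30, 0.5)

# Suppose a $60,000-per-year safeguard halves the occurrence rate.
residual = annualized_loss_expectancy(500_000, 0.30, 0.25)
net_benefit = ale - residual - 60_000

print(f"Baseline ALE:  ${ale:,.0f}")          # Baseline ALE:  $75,000
print(f"Net benefit:   ${net_benefit:,.0f}")  # Net benefit:   $-22,500
```

Note what the numbers do here: they show this particular safeguard costs more than the risk it retires, a conclusion no amount of counting competitors' firewall purchases could reach.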
One of the final statements of the article is, “Again, the knowledge and experience accumulated in more than 30 years must be applied to make prudent safeguard decisions, and none of the factors need involve consideration of security risk.”
I gleaned from that statement that he feels young people or those newly involved in security need not apply for security jobs. As I looked out over the faces of the eager and dedicated young men and women in my graduate course finishing their quizzes, I knew he was wrong.
John McCumber is a security and risk professional. He is the author of Assessing and Managing Security Risk in IT Systems: A Structured Methodology from Auerbach Publications. Mr. McCumber can be reached at firstname.lastname@example.org.