Can Security Evolve?

March 11, 2009

Many years ago, I was in charge of the operation of four computer centers for the U.S. Air Force. One center was the first at this location to have a dial-up modem installed so the programmers could access the VAX 11/780 remotely. Before long, some computer geeks had repeatedly attempted to log in from a university research center, and thus my fascination with security was born. When I decided to make the jump from computer management to computer security, many of my friends, coworkers and colleagues advised against it. My favorite piece of advice (circa 1985): “Don’t get into computer security, dude. No one except a couple of three-letter agencies in Washington, D.C., is ever going to worry about THAT!”

Back before the now-ubiquitous Internet, the science was simply labeled computer security. Today, we have the hazy distinctions of cyber-security, information assurance, information security, network security and information systems security. I have been intimately involved with the evolution of this digital folk art we currently practice, and I continue to marvel at how far it has come.

The genesis of much of my profession lies in attempts to build a “secure” computer that organizations concerned about data protection could buy. Since the majority of computers in the mid-to-late 1970s populated barn-sized rooms, access to the equipment itself was managed by stringent physical controls. With the hardware presumably safe, the focus was on ensuring the programmable processes enforced the desired security policies. Millions of government dollars were poured into the development of “secure” operating systems with mathematically provable policy enforcement mechanisms.

While this daunting effort was under way, government researchers, scientists and graduate students were busy building the underlying tools and protocols that would enable computers to talk to one another and share data. As this fascinating functionality began to grow in the late 1980s, the focus of computer security rapidly shifted from the “secure” computer pipe dream to the “trusted” enclave. If you could not enforce the security policy within a single room, you had to define a perimeter and build the tools to protect it.

The key perimeter technology to arrive on this scene in the early 1990s was the network firewall. There are several types of firewalls, and they have changed considerably over the years, but their basic function has remained the same: to separate networks by creating an internal, trusted environment distinct from the external digital world. Standards and high-level policies of the period dictated network architectures that would create separate trusted network enclaves, usually associated with organizational boundaries.
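
To make the idea concrete, here is a minimal sketch in Python of the packet-filtering logic at the heart of an early firewall. The network ranges, ports and rules are hypothetical, chosen only to illustrate the principle: traffic originating inside the trusted enclave flows out freely, while traffic from the outside is denied by default unless it targets an explicitly published service.

    # Minimal packet-filter sketch. All addresses, ports and rules are
    # hypothetical; real firewalls add state tracking, logging and more.
    from dataclasses import dataclass
    from ipaddress import ip_address, ip_network

    TRUSTED_NET = ip_network("10.0.0.0/8")  # the internal, trusted enclave
    ALLOWED_INBOUND_PORTS = {25, 80}        # published services, e.g. mail, web

    @dataclass
    class Packet:
        src: str       # source IP address
        dst: str       # destination IP address
        dst_port: int  # destination TCP port

    def permit(pkt: Packet) -> bool:
        src_inside = ip_address(pkt.src) in TRUSTED_NET
        dst_inside = ip_address(pkt.dst) in TRUSTED_NET
        if src_inside and not dst_inside:
            return True                    # outbound: trusted by default
        if not src_inside and dst_inside:  # inbound: default deny
            return pkt.dst_port in ALLOWED_INBOUND_PORTS
        return src_inside and dst_inside   # internal traffic flows freely

    # An outside host probing an internal telnet port is dropped;
    # the same host reaching the public web service is allowed.
    print(permit(Packet("203.0.113.9", "10.1.2.3", 23)))  # False
    print(permit(Packet("203.0.113.9", "10.1.2.3", 80)))  # True

The essential design choice is the default-deny posture: anything not explicitly allowed across the perimeter is blocked.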

Government and industry standards bodies then rushed to define a “secure” network architecture based on these defined enclaves of information technology components. The goal was to identify the appropriate technologies (especially firewalls) to deploy and how to configure them to be the most “secure.” If it was impossible to build and manage a “secure computer,” perhaps it would be possible to define a “secure information technology architecture” by drawing the boundaries of the security enclave.

The last decade, however, has shown the futility of that approach as well. Rapid changes in technology have yet to abate. Organizations have found their information technology infrastructure perimeters to be a moving target. New portable end-point devices easily transport sensitive information outside corporate facilities. Thumb drives, CDs and online backup services can store enormous quantities of data outside corporate control. Additionally, sensitive information often needs to be shared with partners, customers and suppliers whose own data management capabilities may not be able to enforce your security policies.

To deal with this dynamic technology environment, the government has relied on legislation to mandate effective security enforcement. Laws have targeted corporations, requiring them to protect the sensitive personal information of their customers. Beyond these federal edicts, the Department of Defense has focused attention on mandating effective security through an expensive and time-consuming accreditation process. Recently, the government has announced initiatives to require preventive security and remedial forensic capabilities.

The evolution thus far has been:

• Build a “secure” computer, and barring that…
• Define a “secure” environment, and barring that…
• Legislate a “secure” information environment.

What’s next?

Although most of these efforts may look ill-fated in hindsight, each has contributed to the robust suite of solutions needed to implement effective and efficient information security programs. The “secure” computer brought us much-needed operating system security functionality, such as role-based access controls and device management. The “secure” enclave brought about the integration of firewalls, intrusion detection systems and incident monitoring. Lastly, legislative mandates have demanded recognition of the value we need to place on sensitive information, especially information held about private citizens.
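
As a small illustration of the role-based access control idea mentioned above, here is a minimal sketch in Python; the roles and permissions are hypothetical, meant only to show the pattern of granting access by role rather than by individual identity.

    # Minimal role-based access control (RBAC) sketch. The roles and
    # permissions are hypothetical, purely for illustration.
    ROLE_PERMISSIONS = {
        "auditor":  {"read_logs"},
        "operator": {"read_logs", "restart_service"},
        "admin":    {"read_logs", "restart_service", "manage_users"},
    }

    def authorized(role: str, action: str) -> bool:
        # Access is decided by the user's role, never the individual.
        return action in ROLE_PERMISSIONS.get(role, set())

    print(authorized("auditor", "restart_service"))  # False
    print(authorized("admin", "manage_users"))       # True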

The lesson of this evolution is clear: there is no silver bullet. Comprehensive security programs require an appropriate mix of technology, procedures and, most importantly, human factors. Security is a journey, not a destination.

John McCumber is a security and risk professional, and is the author of “Assessing and Managing Security Risk in IT Systems: A Structured Methodology,” from Auerbach Publications. If you have a comment or question for him, please e-mail John at: [email protected].