John McCumber is a security and risk professional, and author of “Assessing and Managing Security Risk in IT Systems: A Structured Methodology,” from Auerbach Publications. If you have a comment or question for him, e-mail Cool_as_McCumber@cygnusb2b.com.
As an oldster, I have been privileged to work in the information technology industry for nearly my entire working life. I have watched the seasons of IT change, and have witnessed first-hand the evolution from mainframes, to minicomputers, to personal computers, then networks and edge devices. Lining this convoluted evolutionary path are the littered remains of technologies, companies, concepts and programs discarded along the way. As a teenager, I built a plastic and cardboard DigiComp II, a manually operated machine that gave binary answers once you “coded” your program into its mechanical components. It couldn’t even keep up with the four-function calculator that came along a couple of years later.
As a young service member, I programmed and operated Burroughs and Univac computing systems, and sent messages to remote computers across a 300 baud acoustic modem. After I was commissioned and became responsible for four large data centers, I was once asked to present a seminar to college students on new and exciting “portable” computers. I lugged in a 40-pound Osborne 1 with a 5-inch, 52-column display and two floppy drives. Osborne went bankrupt two years later.
Even after I retired from the military, I was still working with technologies from companies like DEC, Nixdorf, Banyan Systems and Trusted Information Systems. All gone now. Security technologies and solutions have never been immune to these market forces; they have been forced to change right along with the IT systems they protect.
When I was first immersed in IT security, the federal government was still pursuing the vision of a “secure” computer: a glorified adding machine that was mathematically validated to process the ones and zeroes in a “secure” manner. It was still working on it when Bill Gates started shipping Windows NT 3.51, which brought networking to the Department of Defense. Every time the security community thought it was close to nailing down the ultimate security solution, visionaries were changing the entire IT landscape.
When the quest for an integrated hardware/operating system solution was abandoned, teams of researchers decided to assume a standardized, if insecure, hardware platform and sought instead to mandate secure software. There are still conferences, working groups and task forces producing standards and guidelines in pursuit of this estimable goal.
Many years ago, the Department of Defense realized the seminal element of the IT security problem wasn’t the hardware or software — it was the information itself. It officially recognized this fact by folding the practice of communications security (COMSEC, in military parlance) into computer security (COMPUSEC), to create the discipline of INFOSEC. The lessons that spawned that initiative way back in the early 1990s must be learned again by a new generation of programmers, architects, designers, analysts and engineers.
Is it any wonder that the heyday of the big hardware manufacturers has passed? Amdahl, Burroughs and many others are no longer around because their business models could not adapt in the long run. Alongside them stood PeachTree Software, Santa Cruz Operation, Data General and Computervision, giants of their day that are now extinct. In an era of virtual desktops, software-as-a-service and mobile apps, they found themselves handicapped by legacy code and bloated infrastructure.
When you look at the landscape today, are you surprised to see companies like Google, Amazon, and Terremark at the top of the IT pyramid? What do they have in common? The answer is information — lots and lots of information, tons of it. And these monstrous IT providers are positioning themselves to be public cloud providers.
Cloud computing services present us with an entirely new world of security issues. The old security models are simply not adequate to deal with the vulnerabilities and threats of cloud environments. Building solid boundaries and partitioning off sensitive data? Forget about it. Defense in depth? It may be deep, but it’s not wide enough. Encrypt it all? Good luck managing those millions of keys.
Who’s responsible for your data when you don’t control it on a day-to-day basis? Are you going to rely on service-level agreements? How do you oversee your data security when your data are minnows swimming in an ocean managed by a mammoth global company?
There are smart people feverishly working to answer these questions. Once they get their arms around the problem, they will be presenting their security solution sets to help ensure the confidentiality, integrity, and availability of your corporate data assets. Will they be ready before the next big IT advance hits? Will they even be in business by then?