John McCumber is a security and risk professional, and author of “Assessing and Managing Security Risk in IT Systems: A Structured Methodology,” from Auerbach Publications. If you have a comment or question for him, e-mail Cool_as_McCumber@cygnusb2b.com.
I started my IT career in a mainframe data center designed to meet an entire Air Force base’s computing needs. During the day, our warehouse-sized computing facility hosted online sessions for the numerous dumb terminals scattered around the base. In that huge facility, we had row upon row of core memory machines, refrigerator-sized magnetic tape writers/readers, core processors, printers, card punches, card readers and data entry stations.
These computing wonders had to be moved with forklifts. We also had two other “computer systems” — a separate Univac 1050 for the supply folks, and a remote station for the communications squadron.
At night, we ran batch processing to consolidate the day’s input, and produced mountains of printouts interleaved with carbon paper. We offered departments the option of one-ply to six-ply printouts, and kept up a nightly process of loading a seemingly endless stream of 30-pound boxes of paper into several freezer-sized printers. In the morning, personnel arrived to pick up Hollerith cards and 30-pound boxes of printouts. In the afternoon, many returned with updated cards and instructions for the mainframe computer.
We were the center of all data processing functions, and the precursor of what would become known simply as “IT.” The web of communication we maintained (at 300 baud) with other military installations was about as far as the Internet had progressed in those days. What we didn’t have, however, was good marketing.
Those old Burroughs processors were notoriously sensitive. The slightest electrical anomaly, and the system would crash, requiring a 30-minute reboot. When a storm got to within three miles, we had to begin a well-documented, tedious shut-down procedure. When the lightning moved away, the boot-up process was re-started.
Many times, our technical problems even stumped on-site company engineers. Large system crashes and mangled disk drives called for reinforcements. When things went down, they were normally down for a long time. Once, we even had to ship a box of magnetic tape reels and several boxes of Hollerith cards along with two system operators to another base via helicopter in order to get a critical program run at their site.
When I saw that helicopter — laden with all my data — lift off from our runway and chopper into the clouds, it never occurred to me that “data in the cloud” was going to become a big hit one day.
How was I to know that the big data center would be back with a hip new name? I suppose I really didn’t work in the base data processing center — I was implementing a cloud architecture where my customers could transmit, store, retrieve and process all their critical data in a virtual environment.
These days, I look back and realize my job wasn’t just inserting Hollerith cards and pumping out printouts — I was really providing software as a service! Those twisted copper pairs that were hardwired to those dumb terminals were really part of an ingenious first-generation virtual private network. The classic “dumb” terminals simply needed a modern PR spin. In the 21st century, they would cleverly adopt far less pejorative monikers like “thin clients” and “virtual machines.”
The lesson is simple: everything old is new again. When the personal computer arrived on the corporate scene in a big way in the 1980s, IT people quickly abandoned the old mainframe model to build applications for standalone PCs. People now had the computing power of the entire mainframe right on their desk. The only problem was, the technology was advancing faster than the training.
When Bill Gates introduced Windows NT 3.51, the race to build networks began. If standalone PCs were available to everyone, then networked PCs could do wonders. Soon, new architectures and standards were emerging to interconnect everyone in the new network economy. Again, workforce development lagged.
The networked world of PCs and microcomputers ran into some predictable problems, as these systems lacked effective configuration control, compatibility, distributed management, security and information management. Firewalls needed to be built to separate networks and control access to sensitive data. Configuration control became a nightmare, and nobody was able to identify where critical information was being created, transmitted, used and stored.
In order to deal with these problems, many of the old solutions are once again the new ones — of course, with better marketing. Now we have data in the cloud, SaaS, big data and virtual machines. Of course, new problems are arising from these advances. Search “cloud security” and just see how many hits you get. Are these all really new problems, or simply the old problems returning with the evolution of technology?