Lost in the Cloud

March 15, 2011

Even if, like me, you only enjoy a few hours a week in front of your television, you have undoubtedly seen the advertisements. It’s the latest technology buzzword: cloud computing. Cloud computing as a technology meme began as a way to describe an IT infrastructure that relies on shared services, virtualization and outsourcing, without regard to the resources’ physical location. However, that bandwagon is now rolling down Main Street, USA, and everyone is hopping on. Cloud computing is now being brought to the average consumer!
Before you get too jazzed up over this amazing new technology development, it is probably a good idea to take a close look at what is being offered, and determine what you want the cloud to do for you. For our purposes, we will take a special look at the security ramifications as well. Before we begin, I need to point out that this “cloud” concept has been around for more than 100 years.
At the tail end of the 19th century, scientists like Lord Kelvin pioneered long-distance undersea cables to carry digital signals such as Morse code. It was a visionary new technology, but it ended up being hijacked by an analog guy named Alexander Graham Bell. The advent of the telephone network transformed our communication systems, and along with radio, the world went analog. The “cloud” on a diagram represented the analog telephone network.
After several decades of analog detours, we returned to Kelvin’s original vision of digital communication. During World War II, Alan Turing started building the predecessors of modern digital computers to decrypt Axis messages. Once these and other standalone computing systems were developed, it was time to let these devices communicate among themselves. The networks were soon to follow.
Back in the 1970s (yes, kids, way back then), when I began working with coal- and steam-powered computers and telecommunications equipment, the term “cloud” was still being used on overhead “foils” to represent the telephone network. Later in my career, I was working for the Defense Information Systems Agency at an exciting time — the early 1990s. The inchoate Internet phenomenon was blossoming, and new computing models and architectures were all the rage. Almost every meeting I conducted, attended or passed by featured PowerPoint slides (which had by now replaced the overhead foils) portraying a neat little clip-art cloud in the center. Everyone understood it as a metaphor for the Internet. The slides became ubiquitous, and “cloud charts” popped up in everyone’s presentations.
As we have seen, the cloud metaphor isn’t all that new. It has been used for a century to picture a network of interconnected components. What is new are the many robust capabilities that these points in the network provide. As new endpoint devices and consumer electronics emerge, you will no longer have to maintain complex applications and software tools on them, or even store and back up your own data yourself. All of these functions can be performed by service providers who specialize in them. By leveraging the ubiquitous network, you are now in a position to let an expert provide software and data management at a lower cost, and usually with better service, than you — the end-user — can manage on your own.
The basic economics of the network model are what make cloud computing so attractive not only to consumers, but to private enterprises and government agencies as well. Organizations large and small are seeking out service providers to handle these functions off-site. In fact, entire state governments and large federal agencies are looking to turn over all their data, applications and software tools to the professionals, and simply reach out over the network for their applications and data. They no longer have to manage data centers, the buildings they occupy or all the specialized IT personnel. It is an enticing proposition on the demonstrable cost savings alone.
But how does an organization looking at cloud computing deal with the elements of security? If your critical information is no longer maintained on your own servers and mainframes, and even the software to manipulate that data is off-site, how can you possibly ensure the confidentiality, integrity and availability of the data? Security managers can delegate many of the functions and protections to an off-site provider, but they can never offload the ultimate responsibility for data they maintain for their organization. If the data is corrupted, stolen, exposed or destroyed, we all know who is going to be held accountable.
This new “old cloud computing model” holds the promise of more efficient data management and dramatically reduced costs associated with operating data centers and managing specialized IT personnel. For consumers, it’s going to be an exciting new world of sleeker, multi-functional endpoint devices such as Apple’s iPad. However, it is going to be a new challenge for security practitioners, as we can no longer rely on organizational boundary protection as a primary defensive tool. We will need to think more creatively to ensure we can control the relationship between users and the data we protect on their behalf.

John McCumber is a security and risk professional, and author of “Assessing and Managing Security Risk in IT Systems: A Structured Methodology,” from Auerbach Publications. If you have a comment or question for him, e-mail [email protected].
