According to Brivo President and CEO Steve Van Till, being able to discern the characteristics of a real cloud-based security offering is important to end users for a number of reasons.
Steve Van Till is president and CEO of Brivo Systems.
The cloud computing market is exploding. According to technology research firm Gartner, the market was worth over $100 billion in 2012 and is expected to double within the next four years. With a growing range of "Security-as-a-Service" offerings from access control to video surveillance, recurring revenue business models have grown far beyond their origins in alarm monitoring and now pervade every aspect of our industry.
Vendors responded to this gold rush with product offerings designed to put integrators into the RMR business. They’re also promising to spare end users big up-front expenses while reducing total cost of ownership.
Unfortunately, with every gold rush there are opportunists selling false, over-hyped claims. This is especially worrisome for the security industry because inaccurate cloud claims endanger the safety of our customers and their property.
Real Data Center or an Integrator’s Network Closet
You would think that anything advertised as a "cloud" solution would be hosted at a real data center. Today, a number of security systems vying for cloud legitimacy are actually server-based products that integrators install at their own facilities and operate on behalf of customers.
This sounds fine in theory, except that few security integrators have the IT chops to pull it off. Many a "data center" in our industry turns out to be nothing more than an unsecured network closet in an industrial park office with electrical backup from a five-horsepower generator. Contrast this with a real data center that has a hardened perimeter, riot glass, 24/7 guards and staffing, biometric access control, redundant ISP connections, redundant power supplies and cooling equipment, fire suppression, earthquake proofing, network monitoring, and enough batteries and diesel generators to keep the whole thing running for days—if not weeks—in the event of a major emergency.
Nothing less than a real data center can provide the high availability and data security required for security services; anything else places end users at high risk.
Substandard Hosting: Just a Fancier Network Closet
Substandard hosting is a close cousin to the network closet. It has become so easy to spin up a virtual machine with an Infrastructure-as-a-Service company that everyone is doing it and declaring the hosting requirements of a security application met. Unfortunately, many end users suffer brand-name blindness in this scenario and can’t see past vendor assurances that their data is hosted at Amazon, Azure, or some other Big Company. Don’t get me wrong. These are great companies with great services, but the quality of an application service has as much to do with the way the software is written and managed as it does with where it is hosted.
For example, it does no good to be hosted at a big company unless you’re diversified across at least two of their data centers. A lot of Amazon Web Services customers learned that lesson the hard way during recent outages in one of their primary availability zones. Ditto for Microsoft’s European cloud customers.
Taking it a step further, it does no good to be diversified across multiple data centers unless you have real-time database replication built into your product architecture. And it does no good to have replicated data unless you also have global traffic management to immediately switch between facilities. The lesson here is that it’s the whole solution that matters. A brand-name hosting service doesn’t guarantee quality unless the application provider has taken all the proper architectural and operational steps.
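Stripped to its essence, the global traffic management described above is a probe-and-route loop: check each replicated facility's health, then send traffic to the first live one. A minimal Python sketch, with purely illustrative region names and a simulated health probe (real deployments do this at the DNS or load-balancer layer):

```python
def pick_live_region(regions, is_healthy):
    """Return the first region whose health probe succeeds."""
    for region in regions:
        if is_healthy(region):
            return region
    raise RuntimeError("no healthy region available")

# Simulated probe results: pretend the primary data center is down.
# Replication (not shown) is what makes the secondary a usable target.
status = {"us-east": False, "eu-west": True}
live = pick_live_region(["us-east", "eu-west"], lambda r: status[r])
```

The design point is that failover is only safe because the secondary region already holds a replicated copy of the data; routing without replication just redirects users to an empty system.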
High availability and top-tier hosting matters because no one knows when security violations will occur, and you cannot afford to be down when they do.
What, No Multi-tenancy?
Software multi-tenancy is the fundamental design approach that allows SaaS systems to operate securely and efficiently. Yet many of the systems touted as "cloud" solutions don’t have it. They are simply the same single-tenant designs as before, with a web browser tacked onto the front end. Yes, the web browser is a welcome improvement over thick clients, but customers and vendors should care about what’s behind the browser.
So, what is multi-tenancy? To quote Salesforce.com, a leading authority on the subject, "Whereas a traditional single-tenant application requires a dedicated set of resources to fulfill the needs of just one organization, a multi-tenant application can satisfy the needs of multiple tenants … using the hardware resources and staff needed to manage just a single software instance."
Multi-tenancy matters to security integrators and end users for two reasons: economics and security.
In terms of economics, multi-tenancy allows major SaaS providers like Salesforce, Google, NetSuite, and many other familiar names to operate their services at massive scale and low cost. It does this by using a software design that enables thousands, even millions, of unrelated customers to safely share the same underlying hardware resources. While the cost savings on hardware are obvious, there are equally considerable savings in energy, maintenance, and staffing expenses. Without multi-tenancy, the expense of running applications is virtually the same as traditional IT, and the whole cost-benefit argument for cloud services collapses.
By the same token, supporting millions of customers on a single, highly scalable instance can only be accomplished if the security provisions were designed into the software from the start. Here a real estate analogy is illustrative. Ever lived in a single-family home that was subdivided to support multiple renters? It doesn’t work well; not nearly as well as an apartment building designed from the outset to support multiple tenants. The same holds true for software.
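To make the apartment-building idea concrete, here is a minimal sketch of row-level tenant isolation, one common multi-tenant pattern (illustrative only, not any particular vendor's schema), using an in-memory SQLite database:

```python
import sqlite3

# One shared schema serves every customer; a tenant_id column plus
# tenant-scoped queries keep one customer's data invisible to another.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (tenant_id TEXT, door TEXT, action TEXT)")
db.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("acme", "lobby", "unlock"),
     ("acme", "lab", "denied"),
     ("globex", "vault", "unlock")],
)

def events_for(tenant):
    """Every read path is scoped to one tenant; there is no unscoped path."""
    rows = db.execute(
        "SELECT door, action FROM events WHERE tenant_id = ?", (tenant,)
    )
    return rows.fetchall()
```

The security provision is baked into the design: because every query carries the tenant key, "acme" can never see "globex" rows, yet both share the same instance and hardware.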
Customers must look for multi-tenancy if they expect to achieve the promised cloud savings over the long term. Repurposed legacy applications that lack it also lack an adequate data security model.
The "Private Cloud Ready" Deception
Read a stack of recent sales literature and you’ll come across the terms "private cloud ready" or "suitable for private cloud deployment." Vendors often apply such terms to security appliances and server architectures, both real and virtualized. Sounds good, but what does it actually mean? Not much.
According to the oft-cited NIST definition, a private cloud is an architecture where "the infrastructure is provisioned for exclusive use by a single organization." This means dedicated servers, storage, network connections, and staff to take care of the whole thing. Sound familiar? It should: it’s exactly the same as the traditional security software delivery model. It might have been moved offsite to another data center, and it might have been virtualized for a little hardware efficiency, but at its core it offers the same features as the dedicated client-server model of the past several decades.
That’s why I favor the more descriptive definition that a private cloud "is a marketing term for a proprietary computing architecture that provides hosted services to a limited number of people behind a firewall." That’s probably not what you thought you were buying when the vendor told you it’s a "private cloud ready" product.
Just when security buyers are trying to free themselves of the hassles of dedicated server equipment, the single-tenant "private cloud" fiction leaves them right where they’ve always been.
The "Hide the Server" Deception
The private cloud claim is closely related to a practice we call "hide the server." This amounts to taking end users’ current security applications and moving them offsite, so customers no longer see them running on their own computers.
Does moving an old application architecture to a new server 1,000 miles away give it any of the characteristics of cloud computing? Of course not. It won’t magically support thousands of end user organizations or suddenly be any faster for new users to provision. The truth is that the service provider will need to move every new customer’s computers to a new location, just like they did with yours. What do you think that will cost them? What will it cost you?
As a technology that promises to lower total cost of ownership, real cloud computing must deliver savings. Hide the server can never do that.
The "Cloud-Based Protocols" Deception
Many old-line software systems vendors are desperate to shoehorn "cloud" into their marketing literature. You can’t blame them. If it were 100 years ago and I had to sell wagons against automobiles, you can be sure I’d find a way to use the term "horseless carriage" in my pitch.
In one of the most egregious abuses of the term, there are systems vendors who are covered by the media as cloud companies because they claim to use "cloud-based protocols." You might as well claim to be an electric company because your products use electricity.
I applaud their PR agency for working "cloud" into their press release, but it turns out this is just a case of old-fashioned remote access.
Citing "cloud-based protocols" leads users to a situation that sums up everything we’ve outlined so far: single-tenant applications that are usually hidden remotely as "private clouds" in a data center that has not been qualified or audited.
How to Recognize a Real Cloud
So, how do you recognize the real thing? Let’s go back to the impartial definitions NIST wrote several years ago:
- On-Demand Self-Service. You can obtain services, or expand existing services, without talking to a human and going through a big provisioning process. This is a good litmus test for determining if an application uses multi-tenancy and a real data center. On-demand self-service is usually only possible with multi-tenant applications designed to serve large populations efficiently.
- Measured Service. You pay only for what you are using; say, per camera, per door, or per alarm point in the case of traditional physical security. This functions as good protection against "hide the server" and "private cloud ready" claims because there’s no way to buy services from those architectures "by the drink." Instead, you’ll see charges for the server, or a virtual machine, or storage, or a license—not something you can directly relate to the actual business of security.
- Resource Pooling. Sharing a common infrastructure across all customers for maximum economic and computing efficiency. You’ll be able to recognize resource pooling if you are logging into the same system (web address) as everyone else who uses the service. This indicates the vendor is using a true multi-tenant architecture and that you’ll get the benefits of a real cloud design.
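To see what measured service means in practice, here is a toy metering calculation with hypothetical per-unit rates (real tariffs vary by vendor and contract); the point is that the bill scales with doors, cameras, and alarm points rather than with servers or licenses:

```python
# Hypothetical monthly unit prices for illustration only.
RATES = {"door": 5.00, "camera": 8.00, "alarm_point": 1.50}

def monthly_bill(usage):
    """Price a customer 'by the drink': pay only for metered resources."""
    return sum(RATES[item] * count for item, count in usage.items())

# 4 doors + 2 cameras + 10 alarm points, nothing billed for hardware.
bill = monthly_bill({"door": 4, "camera": 2, "alarm_point": 10})
```

A "hide the server" offering cannot produce a bill like this; it must invoice for the server, the VM, or the license, because that is the unit its architecture actually consumes.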
The cloud is here to stay, and it offers security buyers numerous advantages over traditional solutions, but only when it’s the real cloud.