Building Your Monitoring Operation

May 8, 2014
How to lay the groundwork for your central station with solid video walls, power supplies, cooling options and more

Video walls, massive monitoring consoles and other high-tech tools will wow anyone who walks into a central station monitoring center; however, it also takes a lot of technology behind the walls and under the floor to keep that station up and running.

Experts agree that mundane considerations such as cooling, cabling and power grooming are the keys to keeping all the technology running effectively no matter what is happening inside or outside the center.

The Wall

There is no doubt that the “ah factor” in any central station is the video wall. Even the most technology-illiterate among us cannot help but be wowed by the screens that dominate the far wall in most monitoring centers; however, even the most technology-literate can stumble when designing a wall.

“The first common mistake central stations make when installing video walls is the use of PC-based architectures,” says Vincent Scarpelli, manager of product strategy for video solutions at Tyco Integrated Security. “This will usually require a replacement with purpose-built equipment,” he warns.

While there are rules of thumb, for the most part any properly designed video wall has to be tailored to a unique set of needs. The number of operators, the screens each application requires and the number of feeds each operator must watch are good guides for beginning the process.

Video wall design, especially when it comes to physical dimensions, requires a handle on room size and the number of operator seats before the engineer can size the wall itself. “It is important to understand what the requirements are,” Scarpelli says, which requires good vision by the owner.

The central station needs to decide whether staff will monitor live feeds, recorded feeds, surveillance camera feeds, news outlet feeds, or data from other systems — perhaps a SCADA system. Besides monitoring events and consolidating data from multiple sources, it may be important that the room function as a communications hub or have the ability to record and document actions. “Once the requirements have been decided, the challenge customers often face is with the computer hardware, processors and graphics cards,” Scarpelli says.

Many are tempted to assume that PC-based architectures will suffice. “While on the surface it may seem logical, when introducing multiple streams of graphics into traditional non-real time systems, bandwidth limitations are quickly reached,” Scarpelli says.

Add live video streaming to the mix and the demand on processing resources climbs further. For instance, video streams typically run at 30 frames per second. “If this is not calculated correctly, you can easily end up with poor image quality and dropped or missed frames,” he says. For a security firm, that is a disaster.
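
To see why those limits arrive so quickly, consider a rough back-of-the-envelope sketch of the decoded-video throughput a wall processor must sustain. The stream counts, 1080p resolution, 30 fps rate and 3-byte pixel depth below are illustrative assumptions, not figures from Scarpelli; they simply show how fast multiple live streams add up.

```python
# Rough estimate of the pixel throughput a video wall processor must sustain.
# Resolution, frame rate, bit depth and stream counts are assumptions for
# illustration, not specifications from the article.

def pixel_rate(width, height, fps):
    """Pixels per second for a single video stream."""
    return width * height * fps

def wall_load_gbps(streams, width=1920, height=1080, fps=30, bytes_per_pixel=3):
    """Total decoded-video throughput in gigabytes per second."""
    total_pixels = streams * pixel_rate(width, height, fps)
    return total_pixels * bytes_per_pixel / 1e9

if __name__ == "__main__":
    for n in (4, 16, 64):
        print(f"{n:3d} x 1080p30 streams ~ {wall_load_gbps(n):.2f} GB/s of decoded video")
```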

“Today’s technologies for video walls lean toward purpose-built products,” Scarpelli says. This means using computer hardware built specifically for these applications, with a properly sized processor and the correct video cards. The right products, Scarpelli says, are designed for 24/7 operation and support the highest-quality processing for video streams.

“The monitors themselves have faster response time, are brighter from LED technology and use less power so they run cooler,” Scarpelli says.

“The budget is also a necessary question, as the amount allowed for video walls affects the choices of equipment,” Scarpelli says, adding that there is a great deal of other equipment required to keep it running.

Keeping it Running 24/7

Survivability of the central station is the overriding concern in all cases. That requires work. First, define which systems are “essential” and which are “critical.” The former are must-haves but can be down for a second or two; emergency exit door lights or backup servers might fall into this category. The latter absolutely cannot blink even for a second, or may require reboot times that are unacceptable. Systems requiring essential power can wait for a backup generator to kick in after 20 or 30 seconds, but critical systems cannot.

“Coordinate your generator and UPS start-up times,” recommends Ed Spears, power quality marketing manager for Eaton Corp. “It is tempting to put everything on one giant system but it is very inefficient. For a critical application like central stations, I would suggest 30 minutes of battery backup time as a minimum,” Spears continues.

While a typical battery backup for a normal data center might be just five minutes, central stations need to be up 24/7/365 no matter what happens. “What if the generator does not start after 30 seconds?” he asks.

The UPS battery system bridges the gap between a power failure and the takeover by a diesel generator. “Make sure you have enough battery with your power grooming equipment,” Spears says.

While batteries are big, heavy and expensive, no central station should buy cheap. “Three to five minutes of backup time simply is not enough for central station monitoring arrangements,” Spears says. “Remember, the central station has service-level agreements that must be met.”
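
As a rough feel for what those minutes translate to, the sketch below converts a runtime target into usable battery energy. The 30 kW load and 90 percent inverter efficiency are illustrative assumptions; actual sizing should come from the UPS vendor.

```python
# Minimal sketch of the UPS battery energy needed to hit a runtime target.
# The load and inverter-efficiency figures are assumptions for illustration.

def battery_kwh(load_kw, runtime_min, inverter_efficiency=0.90):
    """Usable battery energy (kWh) required to carry the load for runtime_min."""
    return load_kw * (runtime_min / 60) / inverter_efficiency

if __name__ == "__main__":
    load = 30  # kW of protected load (assumed)
    for minutes in (5, 30):
        print(f"{minutes:2d} min at {load} kW ~ {battery_kwh(load, minutes):.1f} kWh of battery")
```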

Before installing either a UPS or a generator, get the vendors of each system to communicate. Having a 50 kW UPS and a 50 kW generator, for example, is a recipe for failure. “A 50 kW UPS will draw more power than the generator will generate,” Spears says. “In a crisis, it’s a bad time to find out that you need a bigger generator.”

Every good vendor has a staff that will calculate the power requirements for a central station. “Plan for growth,” Spears says. “Don’t undersize the equipment. Plan for the UPS to be loaded 80 percent or less – that gives you 20 percent headroom.”
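
A minimal sketch of that sizing math, assuming a hypothetical 40 kW measured load and a 1.25 generator-to-UPS margin; both figures are illustrative, not vendor guidance, and the vendors should confirm the actual ratio for the chosen models.

```python
# Sketch of the "80 percent loaded" rule of thumb plus a generator margin so
# the generator is not undersized relative to the UPS. The load and margin
# values are illustrative assumptions.

def ups_size_kw(measured_load_kw, max_loading=0.80):
    """Smallest UPS rating that keeps loading at or below max_loading."""
    return measured_load_kw / max_loading

def generator_size_kw(ups_kw, margin=1.25):
    """Generator rating sized above the UPS so it can carry the UPS plus losses."""
    return ups_kw * margin

if __name__ == "__main__":
    load = 40  # kW of measured critical load (assumed)
    ups = ups_size_kw(load)
    print(f"UPS:       at least {ups:.0f} kW (load held to 80 percent of rating)")
    print(f"Generator: at least {generator_size_kw(ups):.0f} kW (assumed 1.25x margin)")
```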

Know the site wiring details — whether it has 480-, 240-, 120- or 100-volt requirements. Add up how much power is needed. It could be as much as 200 kW in some instances, although a 30 to 50 kW UPS is typical for the average central station. Also keep in mind that, while a central station provides security for customers, it has its own security and survivability needs, too. It is standard to have at least 72 hours of fuel on site for the generator at all times. In hurricane-prone areas, it is vital to have an arrangement with a diesel provider who will put the central station on a priority list in case of a long-term outage. The fuel company will charge a premium year-round, but it is worth it.
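
How much fuel 72 hours actually means depends on the generator and its load. The sketch below uses a generic full-load approximation of about 0.07 gallons of diesel per kWh, which is an assumption for illustration; the manufacturer's fuel curve is the real reference for planning.

```python
# Rough estimate of the diesel needed to keep a generator running for 72 hours.
# The 0.07 gal/kWh consumption figure is a generic assumption, not a spec.

def fuel_gallons(load_kw, hours, gal_per_kwh=0.07):
    """Approximate diesel consumption for a steady load over the given hours."""
    return load_kw * hours * gal_per_kwh

if __name__ == "__main__":
    for load in (30, 50):
        print(f"{load} kW for 72 h ~ {fuel_gallons(load, 72):.0f} gallons on site")
```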

The three major trends in power are efficiency, footprint and user interface. Spears concedes that battery backup is “incredibly expensive” and requires a huge footprint. Most central station operators want to devote square footage to revenue-producing functions. For that reason, most UPS companies focus on minimizing the space required; in fact, industry-wide, the most successful products are the smallest.

A close second is high-efficiency systems. A traditional UPS takes AC power from a generator, converts it to DC, grooms it and rebuilds the AC power. A lot of heat is generated in the process — heat that must be removed and heat that represents wasted energy in an inefficient UPS. “There is a huge emphasis on efficiency,” Spears says.

Be aware that load makes a difference. Some units offer 93 percent efficiency at 100 percent load; however, most UPS systems are loaded only to about 50 percent, where efficiency drops to roughly 88 percent, so 12 percent of the power is wasted. Today’s high-efficiency units run at 98 to 99 percent.
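
To put those percentages in kilowatts, here is a quick comparison assuming a hypothetical 50 kW protected load; the load figure is an assumption for illustration only.

```python
# Worked comparison of the waste behind the efficiency figures above.
# The 50 kW protected load is an illustrative assumption.

def wasted_kw(load_kw, efficiency):
    """Power drawn from the source that never reaches the load (lost as heat)."""
    input_kw = load_kw / efficiency
    return input_kw - load_kw

if __name__ == "__main__":
    load = 50  # kW of protected load (assumed)
    for eff in (0.88, 0.93, 0.98):
        print(f"{eff:.0%} efficient: about {wasted_kw(load, eff):.1f} kW lost as heat")
```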

Another consideration with power systems is user interface. “The reality is that people buy the ‘Christmas tree effect’,” Spears says. As a result, UPS vendors put a lot of thought into providing a usable, user-friendly interface. The days of a simple flashing red failure alarm light are gone. Those who have seen the battery interface in a Prius have an idea of how simple and explicit today’s industrial UPS interfaces can be.

Spears says that it likely will not be the power expert who is on hand when power crashes — it will be a security guard or maintenance person who has to figure out what it means when red or yellow warning lights go off. A good, intuitive user interface means that person can make the right decisions for the central station to stay alive.

“Do not underestimate the amount of heat the equipment generates,” Scarpelli says. The remedy is to separate the equipment from the operators by putting it in an adjacent room: “Only the monitors and workstation PCs should be in the room with the operators,” he says.
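
Nearly every watt that racked equipment draws ends up as heat the cooling system must remove. The conversion below uses the standard figure of 3,412 BTU/hr per kW; the 10 kW equipment load is an illustrative assumption.

```python
# Quick conversion from equipment power draw to cooling load.
# The 10 kW rack-room load is an assumption for illustration.

def cooling_load(equipment_kw):
    """Return (BTU per hour, tons of refrigeration) for a given electrical load."""
    btu_per_hr = equipment_kw * 3412
    tons = btu_per_hr / 12000  # one ton of cooling = 12,000 BTU/hr
    return btu_per_hr, tons

if __name__ == "__main__":
    btu, tons = cooling_load(10)  # assumed 10 kW of racked equipment
    print(f"10 kW of equipment ~ {btu:,.0f} BTU/hr ~ {tons:.1f} tons of cooling")
```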

Cooling Off

“Cooling in the security control room is becoming a diminished question,” says Wayne Cook, vice president of Winsted Corp. While it was once a major question, how assets are deployed within the room has changed dramatically over the past few years, and the trend is accelerating, he says, adding that this is his own opinion based on personal observations.

“In past control rooms, most major equipment was located within the confines of the security room,” Cook explains. That close proximity to the operator was necessitated by the older technology; time-lapse recorders, for example, required the operator to change tapes, and most other equipment required operators to change settings by physically touching it. “In today’s control rooms, the NVRs and most other key components are being located in the server room,” Cook says. Operators access them through a GUI.

“In the very near term, virtually all assets will be located away from the operator,” Cook predicts. That will reduce the cooling burden on the main room and shift it to the rack room, where it can be handled. Lighting, CPUs and CRTs have historically been huge contributors to heat within the security control room, but new lighting technology, flat screens and more efficient CPUs have greatly reduced the heat load within that space, Cook says.

“While heat is still a concern, it is radically different than those concerns were only a few years ago,” Cook says. KVMs and blade servers will continue this change and in the future, the control room will function with very little additional cooling required, Cook adds.

Other Considerations

Today’s market offers a variety of modular, scalable products for central station infrastructure that reduce the guesswork about what technology will be needed tomorrow, along with providing wiggle room for estimating future growth requirements.

Experts agree that a central station operator should look for a solutions vendor or team of vendors, one that provides hardware, software and consulting services — not what Spears terms “a solution in a box.” That team will be back on site when grounding problems arise or a new set of systems needs to be installed.

Curt Harler is a technology writer and regular contributor to SD&I magazine. You can reach him at [email protected].