Video walls, massive monitoring consoles and other high-tech tools will wow anyone who walks into a central station monitoring center; however, it also takes a lot of technology behind the walls and under the floor to keep that station up and running.
Experts agree that mundane considerations such as cooling, cabling and power grooming are the keys to keeping all the technology running effectively no matter what is happening inside or outside the center.
There is no doubt that the “ah factor” in any central station is the video wall. Even the most technology-illiterate among us cannot help but be wowed by the screens that dominate the far wall in most monitoring centers; however, even the most technology-literate can stumble when designing a wall.
“The first common mistake central stations make when installing video walls is the use of PC-based architectures,” says Vincent Scarpelli, manager of product strategy for video solutions at Tyco Integrated Security. “This will usually require a replacement with purpose-built equipment,” he warns.
While there are rules of thumb, for the most part any properly designed video wall has to be tailored to a unique set of needs. The number of operators, the screens each application requires and the number of feeds each operator must watch can all serve as guides to begin the process.
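As a rough illustration of that starting point, the feed and operator counts can be turned into a display estimate. This is a minimal sketch with assumed, hypothetical figures, not a design rule from the article:

```python
# Illustrative sizing sketch -- all figures are assumptions, not vendor guidance.
# Estimate how many wall displays a center needs from operators and feeds.

def estimate_wall_displays(operators, feeds_per_operator, feeds_per_display):
    """Round the total feed count up to whole displays (ceiling division)."""
    total_feeds = operators * feeds_per_operator
    return -(-total_feeds // feeds_per_display)

# Hypothetical example: 6 operators, 8 feeds each, tiled 4-up per display.
print(estimate_wall_displays(6, 8, 4))  # 12
```

In practice the room dimensions and seating layout then constrain how large each of those displays can be, which is why the physical survey comes first.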
Looking at video wall design, especially its physical dimensions, requires a handle on room size and the number of seats before the engineer determines the size of the video wall. “It is important to understand what the requirements are,” Scarpelli says, which requires a clear vision on the owner’s part.
The central station needs to decide whether staff will monitor live feeds, recorded feeds, surveillance camera feeds, news outlet feeds, or data from other systems — perhaps a SCADA system. Besides monitoring events and consolidating data from multiple sources, it may be important that the room function as a communications hub or have the ability to record and document actions. “Once the requirements have been decided, the challenge customers often face is with the computer hardware, processors and graphics cards,” Scarpelli says.
Many are tempted to assume that PC-based architectures will suffice. “While on the surface it may seem logical, when introducing multiple streams of graphics into traditional non-real time systems, bandwidth limitations are quickly reached,” Scarpelli says.
Add live video streaming to the mix and demand for processing resources is stressed. For instance, video streams are typically 30 frames per second. “If this is not calculated correctly, you can easily end up with poor image quality and dropped or missed frames,” he says. For a security firm, that is a disaster.
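To see how quickly that per-frame work adds up, the load can be estimated from the stream count. The bitrate figure below is an assumed, illustrative value, not one given by Scarpelli:

```python
# Back-of-the-envelope decode-load sketch -- the 4 Mbps/stream figure is an
# assumption (a common ballpark for 1080p H.264), used purely for illustration.

def decode_load(streams, fps=30, mbps_per_stream=4.0):
    """Return (frames per second to decode, aggregate Mbps to ingest)."""
    return streams * fps, streams * mbps_per_stream

frames, mbps = decode_load(48)  # e.g., 48 live camera feeds on the wall
print(frames, mbps)             # 1440 frames/s to decode, 192.0 Mbps sustained
```

Undersizing against numbers like these is exactly how a general-purpose PC ends up dropping frames.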
“Today’s technologies for video walls lean toward purpose-built products,” Scarpelli says. This means using computer hardware built specifically for these applications, with a properly sized processor and the correct video cards. The right products, Scarpelli says, are designed for 24/7 operation and support the highest-quality processing for video streams.
“The monitors themselves have faster response time, are brighter from LED technology and use less power so they run cooler,” Scarpelli says.
“The budget is also a necessary question, as the amount allowed for video walls affects the choices of equipment,” Scarpelli says, adding that there is a great deal of other equipment required to keep it running.
Keeping it Running 24/7
Survivability of the central station is the overriding concern in all cases, and that requires work. First, define which systems are “essential” and which are “critical.” The former are must-haves but can tolerate being down for a second or two; emergency exit door lights or backup servers might fall into this category. The latter absolutely cannot blink even for a second, or may have reboot times that are unacceptable. Systems requiring essential power can wait the 20 or 30 seconds it takes a backup generator to kick in; critical systems cannot.
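That distinction drives UPS sizing: critical loads must ride on battery until the generator comes online. The capacities and loads below are hypothetical, chosen only to show the arithmetic:

```python
# Hypothetical power-planning sketch -- all capacities, loads and timings here
# are assumptions for illustration, not figures from the article.

def ups_runtime_minutes(ups_capacity_wh, critical_load_w):
    """Minutes the UPS battery can carry the critical load by itself."""
    return ups_capacity_wh / critical_load_w * 60

def bridges_generator_start(ups_capacity_wh, critical_load_w, gen_start_s=30):
    """True if UPS runtime covers the generator's start-up delay."""
    runtime_s = ups_runtime_minutes(ups_capacity_wh, critical_load_w) * 60
    return runtime_s >= gen_start_s

# Example: a 5,000 Wh UPS carrying 10 kW of critical load.
print(ups_runtime_minutes(5000, 10000))        # 30.0 (minutes)
print(bridges_generator_start(5000, 10000))    # True -- easily covers 30 s
```

A real design would also derate battery capacity for age and temperature, but the principle is the same: only loads that can survive the generator gap belong on the essential bus.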