Operational Testing of Technology

Oct. 27, 2008
Field tests can prove the effectiveness of security solutions

Operational testing is defined as the field testing, under realistic conditions, of an item, component or system to determine its effectiveness and suitability. By “operational testing” we do not mean the formal process known as Operational Test and Evaluation (OT&E) that is employed by the military services for testing of military systems and equipment — such as radar systems and helicopters. The subject of this article is a much less formal approach to operational testing that is appropriate for commercial off-the-shelf security technology.
Operational testing is different from acceptance testing, which is performed after purchase and installation of systems and equipment. Operational testing occurs prior to purchase, and is performed to ensure that the technology is suitable for the purpose and the environment intended. Manufacturers often provide equipment at no charge for such testing, unless the testing will result in wear and tear on (or partial destruction of) the equipment. Depending on the level of effort required to install the technology for testing, it may be appropriate to contract with a systems integrator to install and configure the technology for testing, and to develop a test plan if that is beyond the capabilities of in-house personnel.

The Full Spectrum of Testing
Operational testing is just one type of testing in the full spectrum of tests that should be applied to security technology. Currently, few systems integrators and security practitioners incorporate a phased acceptance test plan in the procurement of security systems. Years ago, when security technology was simpler and less integrated, acceptance testing basically consisted of a short demonstration, for which little preparation was required.
The complexity of today’s security products and systems warrants the full application of acceptance testing at appropriate points throughout the procurement cycle.

The Reason for Operational Testing
In recent years, information technology advancements have provided valuable features whose benefits call for immediate adoption. The traditional practice of waiting until a technology has been installed and field-proven for a few years postpones security benefits and is no longer warranted. Many, if not most, security industry technology advances are developments that have come from other fields, such as medical applications (vein recognition biometrics) or military applications (command-and-control software). In that sense, the technologies themselves are not new; only their commercial security applications are new.
However, that does not mean that every technology will necessarily work in every desired security application. For example, video analytics for detection of objects on water may work well in one harbor but may not work in another, due to differences in wave characteristics.
Not all new security technology will require operational testing. For example, card reader technology is mature enough that a simple demonstration can usually provide evidence of its workability. However, a reader that is intended for outdoor deployment under harsh weather conditions should receive operational testing if it has not yet been deployed in such environments.

Field Testing
There are two aspects of field testing. The first is functional testing to ensure that the equipment (or system) can function as expected in the intended environment. The second is performance testing to ensure that the equipment will continue to function at a satisfactory level of performance throughout the expected range of field conditions (such as warm weather and winter weather) and system conditions (light loads and full loads).
If the equipment can be observed in an installation that is, for all practical purposes, identical to the intended environment, then “field testing” using someone else’s installation is a valid and worthwhile approach. That can be especially helpful where, for example, outdoor equipment has already been through one or more weather extremes; otherwise, new field testing would have to be scheduled for whatever duration is required to expose the equipment to the full range of weather conditions.

Functional Testing
Current security technology includes video-enabled mobile devices, high-bandwidth wireless networks, interoperable Web-based system capabilities, on-board processing (such as video camera on-board analytics), and other components that must interact with other systems. Wireless devices and systems must also be tested to ensure that they do not interact with or interfere with other types of wireless systems already in use. For example, one company deployed card readers using wireless transmitters in a manufacturing plant, only to discover that its portable telephone system would not work when the readers were powered up. At one multi-building campus, a line-of-sight networking technology worked fine initially, but after a few weeks was found to operate intermittently. No cause could be found, and the technology continued to work as intended at other similar locations.
Sometimes it is not the technology that is problematic, but the environment itself. An air humidification system in a printing plant caused condensation to form on and inside newly deployed video cameras, as well as on the lenses. The cameras had to be replaced and installed inside protective domes.
How the technology recovers from adverse conditions may require testing. For example, after a power loss, some DVRs automatically recover and resume recording, while others require a manual re-start procedure.
Sometimes there are human factors involved. For example, when changing from one type of proximity card reader to another, the way the card is presented (waved vs. held still, for example) may affect the ability of the reader to read the card. Cardholders may have to be educated on the newly required presentation technique, in order for the card readers to function properly.
Here are a few of the technologies that can require operational testing:

• wireless networks;
• wireless devices (mobile and fixed);
• video analytics (on-board the camera or server-based);
• long-range card readers;
• biometrics;
• turnstiles;
• weapons detection technology;
• special sensor technology; and
• night-vision and other special condition technology.

Performance Testing
Performance testing is important for extreme environmental conditions and extreme load conditions. Testing for extreme load conditions should be performed for all aspects of the systems that are subject to variable loads. For example, sensors and other endpoint devices, including cameras with on-board analytics, require testing across the full range of expected operating conditions. Additionally, systems must be tested with a high number of operators performing processing-intensive or data-intensive operations.
Sensors and sensor systems require field testing to determine their false alarm and nuisance alarm rates. Most often the equipment or systems can be adjusted for field conditions to eliminate false alarms and provide an acceptable nuisance alarm rate, but that is not always the case.
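As an illustration, one simple way to quantify those rates during a field test is to classify every alarm logged over the test period and compute per-day rates. The sketch below is a hypothetical example only; the log format, classification labels and 30-day test window are assumptions, not part of any standard procedure.

```python
# Minimal sketch (assumptions, not a vendor procedure): tallying false and
# nuisance alarm rates from a field-test alarm log.
from collections import Counter
from datetime import date

# Each log entry: (date, sensor_id, classification), where the test team
# assigns the classification during alarm review:
#   "valid"    - caused by a real or test intrusion
#   "nuisance" - caused by a legitimate stimulus (animals, weather, traffic)
#   "false"    - no identifiable stimulus (equipment or configuration fault)
alarm_log = [
    (date(2008, 9, 1), "fence-zone-3", "nuisance"),
    (date(2008, 9, 1), "fence-zone-3", "false"),
    (date(2008, 9, 2), "fence-zone-7", "valid"),
    # ... remaining entries from the field-test period
]

TEST_DAYS = 30  # assumed length of the field-test period

counts = Counter(cls for _, _, cls in alarm_log)
far = counts["false"] / TEST_DAYS      # false alarms per day
nar = counts["nuisance"] / TEST_DAYS   # nuisance alarms per day

print(f"False alarm rate:    {far:.2f} per day")
print(f"Nuisance alarm rate: {nar:.2f} per day")
```

The resulting numbers give the test team a concrete baseline to compare against the acceptable rates defined in the test plan.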
Load conditions can also be an issue for system integration points. What is an acceptable load condition for one system may not be for another. Where event data is exchanged between systems, the integration must be tested using a higher data rate than is expected to occur under any condition. If an integrated system accepts data from multiple systems, testing should be performed with high data rates simultaneously on all data connections.
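One way to exercise such an integration point is to generate synthetic event traffic at a rate deliberately above the worst case expected in production. The following sketch assumes a hypothetical receiving system that accepts newline-delimited JSON events over TCP; the host, port, rate and message format are placeholders to be replaced with the actual interface under test.

```python
# Load-test sketch under stated assumptions: drive an integration point with
# synthetic event traffic at a rate higher than any expected production load.
import json
import socket
import time

TARGET_HOST = "192.168.1.50"   # hypothetical event-receiving system
TARGET_PORT = 9100             # hypothetical listener port
EVENTS_PER_SECOND = 200        # set above the highest expected production rate
DURATION_SECONDS = 600

def synthetic_event(seq: int) -> bytes:
    """Build a simple newline-delimited JSON event for the test feed."""
    return (json.dumps({
        "seq": seq,
        "type": "door_forced",
        "source": f"panel-{seq % 16:02d}",
        "timestamp": time.time(),
    }) + "\n").encode()

def run_load_test() -> None:
    # Approximate pacing via sleep; adequate for a sketch, not a benchmark.
    interval = 1.0 / EVENTS_PER_SECOND
    deadline = time.time() + DURATION_SECONDS
    seq = 0
    with socket.create_connection((TARGET_HOST, TARGET_PORT)) as conn:
        while time.time() < deadline:
            conn.sendall(synthetic_event(seq))
            seq += 1
            time.sleep(interval)
    print(f"Sent {seq} events; verify receipt and processing on the target system.")

if __name__ == "__main__":
    run_load_test()
```

While the feed runs, the receiving system should be monitored for dropped events, processing delays and operator console responsiveness.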
One reason that extreme load testing is important is that high load conditions often accompany security incidents — the exact situation under which a security system must not fail. A simple example is a construction site accident, where a large number of security, safety, supervisory and management personnel may all try to view both live and recorded video at the same time. If there is not sufficient network bandwidth to support the operator activity, the video system will have failed its purpose.
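A quick back-of-the-envelope calculation can show whether such a scenario is even feasible before testing begins. The figures below are assumptions chosen purely for illustration; the actual viewer count, stream bit rates and link capacity should come from the use case scenario and the network design.

```python
# Back-of-the-envelope sketch (all numbers are illustrative assumptions):
# estimate aggregate bandwidth when many operators view live and recorded
# video at once during an incident.
VIEWERS = 15                 # security, safety, supervisory and management staff
STREAMS_PER_VIEWER = 2       # one live view plus one recorded playback
MBPS_PER_STREAM = 4.0        # assumed bit rate for the camera settings in use

required_mbps = VIEWERS * STREAMS_PER_VIEWER * MBPS_PER_STREAM
available_mbps = 100.0       # assumed usable capacity of the network link

print(f"Required: {required_mbps:.0f} Mbps, available: {available_mbps:.0f} Mbps")
if required_mbps > available_mbps:
    print("Incident-level viewing load exceeds the link; the test would fail.")
```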
The human element also needs to be considered in performance testing. For example, one company installed additional card readers to support a new access control system’s muster feature. The intention was to generate a list of personnel still inside the facility, in support of an evacuation or rescue effort. When a small fire occurred, there were not enough readers to support the swift exit of all personnel from the building. As a result, the muster feature was useless during that incident.

Use Case Scenarios
Both functional testing and performance testing require use case scenarios to be defined. The best time to define use case scenarios is prior to beginning the process of technology selection — not immediately prior to testing. While a description of requirements is an important starting point, it generally takes one or more use cases to provide the context in which the requirements must be fulfilled.
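There is no single required format for a use case scenario, but capturing each one in a consistent, structured form makes it easier to trace functional and performance tests back to the scenario they support. The sketch below shows one possible structure; the fields and the evacuation-muster example are illustrative assumptions, not a prescribed template.

```python
# Illustrative sketch only: one way to record a use case scenario so that
# test cases can be traced back to it. Fields and example content are assumed.
from dataclasses import dataclass, field

@dataclass
class UseCaseScenario:
    name: str
    actors: list[str]
    preconditions: list[str]
    steps: list[str]
    expected_results: list[str]
    test_conditions: list[str] = field(default_factory=list)  # environment/load ranges

evacuation_muster = UseCaseScenario(
    name="Evacuation muster report",
    actors=["cardholders", "security operator", "incident commander"],
    preconditions=["all exit readers online", "access control database current"],
    steps=[
        "Alarm sounds; all personnel badge out at exit readers",
        "Operator runs the muster report for the affected building",
    ],
    expected_results=[
        "Muster report lists only personnel not yet badged out",
        "Report completes within 60 seconds of the request",
    ],
    test_conditions=["full-shift occupancy", "all exits in simultaneous use"],
)

print(f"Scenario '{evacuation_muster.name}' has {len(evacuation_muster.steps)} steps")
```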

Sharing the Knowledge
It benefits the security industry (manufacturers and service providers) and the security profession (security practitioners) when the results of operational testing are published. Some operational testing takes the form of selection testing, sometimes called a “shoot-out,” in which several technologies are field-tested identically in the same environment. The results are announced in a press release and are often published as a case study that focuses on the selected technology.
Some practitioners have been reluctant to publish such case study material, out of concern for possible criticism of the testing methodology or the manner in which the results are reported. When the overall plan includes announcement or publication of successful results at the end of the test period, vendors can provide assistance (and even bring in a consultant to help) with test planning, results documentation and publication.
As the pace of technology development continues to accelerate, and the scope of technology capabilities continues to grow, the importance of testing increases. New security technology has a lot to offer. Testing helps ensure that the promise of technology actually becomes reality for your facility.

Editor’s note: For more information on testing, refer to Mr. Bernard’s ST&D series of articles on security system testing: “Advanced Technology Requires Advanced Test Planning,” March 2007; “Security Systems Integration Testing,” April 2007; and “Testing Today’s Technology,” May 2007.

Ray Bernard, PSP, CHS-III is the principal consultant for Ray Bernard Consulting Services (RBCS), a firm that provides security consulting services for public and private facilities (www.go-rbcs.com). Mr. Bernard has also provided pivotal strategic and technical advice in the security and building automation industries for more than 20 years. He is founder and publisher of “The Security Minute” 60-second newsletter (www.TheSecurityMinute.com).

James Connor is founder and principal of N2N Secure (www.N2Nsecure.com), a security consulting firm specializing in converged physical and logical security solutions. Mr. Connor is the former senior manager of Global Security Systems for Symantec Corp., where he designed and implemented global strategies for technical security systems.