Video Surveillance: The Pros and Cons of UL 2802

The new UL 2802 safety standard is titled “Performance Testing of Camera Image Quality.” According to Underwriters Laboratories, it is the first of five related standards that together will evaluate an entire digital video system; the remaining four will cover transmission, storage, video analytics and displays.

Last year, UL posted a white paper about UL 2802 on its Facebook page. The paper describes each test and includes example images for some of them.

There are a few serious concerns about the potential negative effects of the standard for security video end users, systems integrators, and design and specifying consultants. The concerns — which I believe are valid — center on misuse of the standard to create false impressions about camera products.

How extensive the problems will be depends on how accurately manufacturers, security industry media, distributors and sales reps convey what the standard is and what it is not.

 

What the Standard Is

UL 2802 is a way for manufacturers to obtain independently certified laboratory test ratings for a standard set of factors that impact video image quality: image resolution; distortion; relative illumination; dynamic range; maximum frame rate; gray level; sensitivity; bad pixel count; and veiling glare. Thus, the standard provides something valuable that is long overdue — a way to make apples-to-apples comparisons across a selection of products using basic technical information about a subset of the factors that affect camera image quality.

In many industries, manufacturer data sheets are not considered sufficiently reliable to be taken at face value. The hope is that UL 2802 test results will be an improvement on data sheet information, at least insofar as its scope of camera testing is concerned.

Testing is the backbone of UL’s business, so it should be no surprise that UL has developed a smart and technically sound approach to rating the testable aspects of camera image quality. UL knows that test results will vary depending on how a camera is configured; thus, the standard states that changes to any of the camera’s critical components (which include the lens, operating system and software) may require reconducting the performance tests for each camera configuration. The test lab must record the camera settings for each test.

However, the standard cannot serve as the sole basis for camera product comparison and evaluation. There are many critical aspects of camera performance that the standard does not cover, which is understandable, as many of these performance factors cannot be feasibly tested in a laboratory.

That does not in any way lessen the value of UL 2802’s test ratings. The performance factors that the standard covers are critical to image quality. That’s why the standard is an important advance in the area of camera testing and evaluation. It provides an indisputable starting point for identifying candidate cameras for a security video surveillance application. But it is only a starting point — and that’s why there can be problems.

 

Why There Are Problems

A close review of promotional UL documents, and statements from security industry companies, shows that no real emphasis is placed on the fact that the standard is just a starting point. That fact is glossed over, partly from sheer enthusiasm over what the standard is, and partly from lack of knowledge of the standard itself.

However, the standard document itself takes pains to acknowledge the limitations of its testing. According to Section 1 — titled “Scope” — “The suitability of the camera for a specific use case is not determined by this standard. The resulting test scores are intended to provide objective information that will be useful when determining camera use applications (i.e. a camera that performs well in sensitivity and grey level tests may be a good choice for low light use cases).”

At this time, such clarifying language does not carry over into materials promoting the standard. For example, the Conclusion section of UL’s white paper states, “Video cameras tested and certified to UL 2802 ease the process of identifying video technology that meets the requirements of specific applications, enabling more effective comparisons of price and performance.”

While that sentence is not inaccurate, a non-expert reader won’t catch the critical significance of qualifying words like “ease” and “more.” Many marketers and writers will interpret the last phrase as “providing effective comparisons of price and performance” — making it seem like the test ratings are all that’s needed to make a camera selection.

In last year’s third quarter issue, the UL newsletter, “The Fire & Security Authority,” made the following statement: “The result of a UL 2802 test program is an objective set of performance scores for a camera’s image quality attributes.” Many readers will infer from those words that a camera that scores higher than another camera will provide a better image in all conditions; yet, the standard itself — which end-users will not see — refutes that inference.

Statements like these, and worse, are already being made by companies that are enthusiastically embracing the standard. As Frank Luntz states in his book Words That Work, “It’s not what you say, it’s what people hear.” That’s why there will be some problems.

 

What the Standard Is Not

The standard is definitely not a substitute for the hard-won product experience of integrators, who understand the difference between the controlled environmental conditions of laboratory tests and real-world deployment environments. It’s not a substitute for the knowledge and experience of security design consultants who invest significant time to obtain in-depth product performance knowledge.

The UL standard does not address camera performance factors such as: CPU performance; multiple video stream support; video compression; strong operator authentication; basic network protocol support; secure (HTTPS) connections; SNMP and syslog protocols; video motion-detection analytics and other on-board analytics; auto-recovery from network error conditions; performance under various load conditions; camera diagnostic information; and mutually exclusive camera functionality.

Problems involving some of these factors can result in video being lost altogether, or can prevent the use of high-image-quality camera configurations. Mutually exclusive functionality is rarely identified in camera data sheets; for example, some cameras cannot support certain features while low-light mode is activated. Just because a camera communicates over a network doesn’t mean its network protocol support is complete: some network cameras crash if anyone performs a standard Nmap or Nessus network scan. Limited CPU processing capability can mean that a camera configured for multiple video streams at high resolutions and frame rates may have trouble performing motion detection. An underperforming camera can mysteriously crash, resulting in no video at all and an obvious score of “0” for image quality.
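The scan-crash failure mode described above is easy to check for in the field. The sketch below is a minimal availability probe, not part of the standard or any vendor tool: after running a scan against a camera, it verifies that the camera still accepts connections on its HTTP and RTSP ports. The camera IP address and port list are hypothetical assumptions for illustration.

```python
import socket


def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def camera_still_responding(host: str, ports=(80, 554)) -> bool:
    """Check that a camera still answers on its service ports after a scan.

    A camera that has crashed or wedged under an Nmap/Nessus scan will
    typically stop accepting connections on its HTTP (80) and/or
    RTSP (554) ports.
    """
    return all(is_port_open(host, p) for p in ports)


if __name__ == "__main__":
    CAMERA_IP = "192.168.1.64"  # hypothetical camera address
    print("camera responding:", camera_still_responding(CAMERA_IP))
```

Run the probe before and after a scan; a camera that passes before and fails after has demonstrated exactly the kind of network-robustness gap that no image-quality score will reveal.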

 

Image Quality

Another problem relating to the standard is the unavoidable use of the term “image quality.” There is more than one meaning for the term, which is why it would be more appropriate to refer to UL 2802 as testing “technical image quality” — what the camera is capable of producing under laboratory test conditions.

Camera end-users, on the other hand, are interested in how well cameras perform in areas of interest for security, safety and business reasons. We could call that “functional image quality” — an entirely different subject. Lab conditions are optimized to support high image quality. From that perspective, most field conditions are compromised.  

How will the camera perform, given the job it has to do in the location where it will be installed? The answer lies outside the scope of UL 2802 testing. What’s needed here is guidance rather than testing, plus the expertise to take into account the many varying factors that affect video quality.

 

Video Image Quality Guidance

An excellent guidance document is available that provides a design process to develop performance requirements for real-life situations. The 2013 edition of the Digital Video Quality Handbook, published by the U.S. Department of Homeland Security Science and Technology Directorate, can be downloaded at http://bit.ly/digital-video-quality-handbook-2013. This excellent work received significant contributions from the Security Industry Association (SIA) and the Video Quality in Public Safety Working Group of the Homeland Security Office for Interoperability and Compatibility.

Using this kind of sound design process to develop detailed requirements provides a needed context for evaluating UL 2802 product test results, as well as for evaluating all of the other camera performance factors that can make or break a security video deployment.

 

Ray Bernard, PSP, CHS-III is the principal consultant for Ray Bernard Consulting Services (RBCS) at www.go-rbcs.com, a firm that provides security management and technology support for public and private facilities. For more information about Mr. Bernard and RBCS, go to www.go-rbcs.com or call 949-831-6788. Mr. Bernard is also a member of the Content Expert Faculty of the Security Executive Council (www.SecurityExecutiveCouncil.com). Follow Ray on Twitter: @RayBernardRBCS
