The Metrics of Video Sensor Effectiveness

Innovations move helter-skelter in the field of security video software. The ascendance of video tools, particularly behavior analysis software, over the past few years has been astonishing. As always, the gap between expectations and reality in...


Where Is the Value?
How practical is a system with these kinds of error rates? Is an organization comfortable knowing, for example, that every time the system labels an actor "in violation," it will be wrong 9 times out of 10, or even, as in the previous example, 2 times out of 10? If the number of real violations stays the same, any drop in accuracy raises the false alarm rate significantly, however it is computed. To achieve a low false alarm rate, a system detecting a common event, such as a car entering a garage, need not be as accurate as one detecting a rare event, such as a policy violation at a facility with a high rate of compliance. Put differently, when "bad" behaviors occur infrequently, the system's overall error rate must be significantly lower than the frequency of those behaviors, or the false alarm ratio will be high. In practical terms, users' first priority is to reduce actual policy violations, and as "bad" events become less frequent, the system's accuracy must rise for it to retain value.
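The base-rate effect described above can be sketched with a short calculation. The function and all numbers below are illustrative assumptions, not figures from the article: a hypothetical detector with a fixed true positive rate and false positive rate is applied to a common event and then to a rare one, and the fraction of alerts that are false alarms follows from Bayes' rule.

```python
def false_alarm_ratio(base_rate, tpr, fpr):
    """Fraction of alerts that are false alarms.

    base_rate: frequency of real violations among observed events
    tpr: true positive rate (sensitivity) of the detector
    fpr: false positive rate of the detector
    All parameter values used here are hypothetical.
    """
    true_alerts = base_rate * tpr          # real violations that trigger an alert
    false_alerts = (1.0 - base_rate) * fpr # benign events that trigger an alert
    return false_alerts / (true_alerts + false_alerts)

# Common event (1 observation in 10 is a real positive), modest detector:
common = false_alarm_ratio(base_rate=0.10, tpr=0.90, fpr=0.10)

# Rare event (1 in 1,000), same detector accuracy:
rare = false_alarm_ratio(base_rate=0.001, tpr=0.90, fpr=0.10)

print(f"common event: {common:.0%} of alerts are false alarms")  # 50%
print(f"rare event:   {rare:.0%} of alerts are false alarms")    # 99%
```

With identical detector accuracy, the rare-event system produces almost entirely false alarms, which is why the error rate must fall well below the event frequency before alerts become trustworthy.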

In short, there is practical value in understanding how these metrics interact. They provide another data point for fashioning a security system that matches your risk posture to the array of currently available tools and technologies.

Jim Helman, PhD, was chief architect at SmartCatch, and Nicholas Imparato, PhD, is a research fellow at the Hoover Institution, Stanford University, and professor of management and marketing at the University of San Francisco. This article was prepared while both were consultants to NEC Laboratories and advisors to SmartCatch.