LPR Installation Best Practices

April 17, 2023
Though often overlooked, deployments of this mature analytic technology require integrators to weigh a number of different factors

This article originally appeared in the April 2023 issue of Security Business magazine.

The introduction of artificial intelligence (AI) and deep learning-powered algorithms into the security industry has proven to be game-changing in how businesses think about and use video surveillance technology. With all the buzz and excitement generated by these new capabilities, it is easy to forget some of the innovations that paved the way for the current crop of analytics. Among these tried-and-true solutions is license plate recognition (LPR), also known as automatic number plate recognition (ANPR) in some parts of the world.

One of the benefits of a mature technology is the wealth of industry experience in deploying it, with best practices that can be universally applied in the field. The lessons learned covered in this article include environmental factors that must be accounted for, technical aspects to consider when choosing hardware and software solutions, and how new innovations are delivering enhanced capabilities to the market.

Strong Demand Persists

Given the pervasiveness of AI today, it is easy to forget that LPR technology has been available in some shape, form or fashion for nearly two decades and continues to see strong global demand. Just like nearly every other video analytic, LPR solutions have undergone significant improvements in the years since they were first introduced. Greater availability of high-resolution cameras coupled with advanced neural network training means that today’s offerings can recognize a much wider array of data from license plates and vehicles with very high degrees of accuracy.

Whereas in the past the technology could identify the license plates of only a handful of countries or even U.S. states, today there are LPR products capable of recognizing not only the plates of every state, but also those of a large number of the world's nations.

Not only has the number of recognizable license plates grown, but the conditions in which the technology can operate have also changed dramatically over the years. Prior to the proliferation of more advanced network cameras, issues such as frame rate limitations meant that LPR could only be performed in optimal conditions and at relatively low vehicle speeds. Today, end-users can deploy this technology in a variety of adverse weather conditions (rain, fog, snow, etc.), with the ability to read plates on cars traveling at up to 150 mph.

However, for the technology to achieve its true potential, integrators must consider a number of different factors that can greatly influence the performance of LPR. Here is a brief overview of some of these lessons learned from the field and how they can be addressed.

Understand Your Environment

The environment in which an LPR system will be deployed must be clearly defined for the implementation to be successful.

The three most common operating environments for LPR are: parking and facility entrances, city intersections, and highways. Each of these environments requires different technical configurations to meet the demands of the respective deployment. For example, a camera and software package tuned for low-speed applications like parking lots would produce poor recognition results if someone tried to adapt it for an interstate highway.

Additionally, attention must also be paid to the times of day that the LPR system will be used. It is possible that everything could work perfectly fine during the day, but a lack of proper illumination, via infrared (IR) or white light, means that it won’t work as intended during nighttime hours.

For most installations where the camera-to-vehicle distance is within 50 feet, built-in camera IR illumination should be sufficient. For longer distances (more than 50 feet), external illuminators should be considered. Check with your camera vendor, as many offer external illuminators as well.
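The rule of thumb above can be expressed as a quick planning check. A minimal sketch, where the 50-foot threshold comes from the guideline in the text; the function name and return strings are illustrative:

```python
# Rule of thumb from the text: built-in IR is usually sufficient within
# 50 ft of camera-to-vehicle distance; beyond that, plan for an external
# illuminator. The threshold is the article's guideline; the function
# name and return strings are illustrative, not from any product.

BUILT_IN_IR_MAX_FT = 50

def illumination_plan(camera_to_vehicle_ft: float) -> str:
    """Suggest an illumination approach for a given camera-to-vehicle distance."""
    if camera_to_vehicle_ft <= BUILT_IN_IR_MAX_FT:
        return "built-in IR should suffice"
    return "consider an external illuminator"
```

For example, `illumination_plan(35)` points to built-in IR, while `illumination_plan(120)` flags the need for an external illuminator.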

This may sound like a very basic consideration, but integrators can and often do make the mistake of not factoring in nighttime use of LPR and what that will entail from a technical perspective.

Details Matter

Before considering different types of LPR offerings, it is paramount that the technical specifications as well as limitations of the environment are meticulously analyzed to determine what solutions will and won’t work. Perhaps the best place to start is in examining existing network and power infrastructure.

The type of power and network resources at a site will determine what type of server is needed and where it will be installed. For example, if there is fiber connectivity between the camera and the location/data center, then the server can be placed there; however, with limited network connectivity, a small edge device/server might be installed in a cabinet next to the camera.

Additionally, depending on the situation, cameras with embedded LPR analytics might also be a good fit. However, if your customer opts to use a camera with embedded analytics or some type of small edge device, it must still meet the requirements outlined below for the deployment to be successful.

Proper camera placement is also critical to the success or failure of an LPR deployment. Factors to consider include:

  • Make sure the angle between the camera and the plate is sufficient. Bad camera placement can limit the performance of the system in the future, even if everything else is configured correctly.
  • Try to keep the vertical angle between the camera and license plate under 45 degrees and the horizontal angle under 25 degrees.
  • For multi-lane coverage, also consider an overhead camera installation. Overhead installs help ensure you will not face issues with vehicle obscurement.
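The angle guidelines above can be checked on paper before any hardware goes up. A minimal sketch, assuming distances are measured in feet, that the vertical angle is derived from mounting height, and that the horizontal angle is derived from the camera's lateral offset to the lane; the function name, thresholds' source values, and example numbers are illustrative:

```python
import math

# Check a proposed camera position against the angle guidelines in the
# text (vertical angle under 45 degrees, horizontal under 25 degrees).
# All distances in feet; names and example values are illustrative.

MAX_VERTICAL_DEG = 45.0
MAX_HORIZONTAL_DEG = 25.0

def placement_ok(mount_height_ft, distance_ft, lateral_offset_ft):
    """Return (vertical_deg, horizontal_deg, ok) for a camera placement."""
    vertical = math.degrees(math.atan2(mount_height_ft, distance_ft))
    horizontal = math.degrees(math.atan2(lateral_offset_ft, distance_ft))
    ok = vertical < MAX_VERTICAL_DEG and horizontal < MAX_HORIZONTAL_DEG
    return vertical, horizontal, ok

# A camera mounted 20 ft high, 60 ft from the capture point, offset 10 ft
# from the lane center, stays well inside both limits:
v, h, ok = placement_ok(20, 60, 10)
```

Running the same check on a camera mounted 50 ft high only 40 ft from the capture point fails the vertical-angle limit, which is exactly the kind of placement error that is expensive to fix after installation.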

A common mistake by integrators is assuming that general surveillance cameras can be used for LPR; in most cases, they cannot. Ensure that the chosen cameras meet all the technical specifications required for the application. For example, if you want to cover one or two lanes of traffic, an image resolution of 1080p will typically suffice; however, to cover three or four lanes of traffic with the same camera, you need one with 5-megapixel resolution or higher.

Other camera specs to consider include:

  • Ensure proper camera frame rates. In stop-and-go environments (up to 20 mph), 5-15 frames per second (fps) is usually sufficient. For faster applications, 30 fps is recommended.
  • Verify sufficient image pixel density. With a low pixel density, the LPR system will not have enough pixel data to detect a license plate with high accuracy. Aim for a pixel density of about 100-120 pixels per foot. In the U.S., this typically translates to the width of the license plate being 100-120 pixels. Be aware of any requirements of over 150 pixels, as these might be from legacy systems.
  • Regardless of what resolution you use for your camera, it is typically recommended to use a camera with a varifocal lens (up to 35mm or 50mm). It is also very helpful if that lens is motorized, meaning it can be zoomed remotely from a web configuration tool, so you don't need to climb up to the camera to make the adjustment.
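The pixel-density guideline above reduces to simple arithmetic that can be run at design time. A minimal sketch, assuming a 1920-pixel-wide 1080p sensor, a 2560-pixel-wide 5 MP sensor, and the 100-pixels-per-foot lower bound from the text; the function names and example field widths are illustrative:

```python
# Pixel-density math from the guidelines above. The 100-120 px/ft target
# comes from the text; sensor widths (1920 px for 1080p, 2560 px for a
# typical 5 MP camera) and field widths are illustrative assumptions.

TARGET_PPF = (100, 120)  # target pixels per foot across the scene

def pixels_per_foot(horizontal_resolution_px, field_width_ft):
    """Pixel density across the camera's horizontal field of view."""
    return horizontal_resolution_px / field_width_ft

def density_sufficient(horizontal_resolution_px, field_width_ft):
    """True if the scene meets the lower bound of the target density."""
    return pixels_per_foot(horizontal_resolution_px, field_width_ft) >= TARGET_PPF[0]

def max_field_width_ft(horizontal_resolution_px, target_ppf=TARGET_PPF[0]):
    """Widest scene a camera can cover while keeping the target density."""
    return horizontal_resolution_px / target_ppf

# A 1080p camera (1920 px wide) can cover about 19 ft at 100 px/ft:
width_1080p = max_field_width_ft(1920)
# A 5 MP camera (2560 px wide) stretches that to about 25.6 ft:
width_5mp = max_field_width_ft(2560)
```

This kind of back-of-the-envelope check makes it easy to see why a higher-resolution camera, a tighter zoom, or both are needed as the number of covered lanes grows.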

Software Considerations

When all of the hardware specifications have been brought into alignment for the application, it is time to decide on a software solution that can deliver the results and high levels of accuracy required. While some vendors offer robust analytics at the edge, the vast majority of deployments, particularly those in high-speed environments, will require a software solution running on the server side to provide the processing power necessary for the job.

Today’s LPR engines should be fully based on AI/neural network training regimens to ensure the highest levels of accuracy. This is not something that can simply be glossed over in the technology selection process, as knowing the difference in how algorithms are trained will help in determining the kind of solution you need. In contrast to traditional machine learning solutions, which typically require engineers to specify what an algorithm should and should not be looking for, neural networks mimic the human brain in that they can learn what is and isn’t relevant data without that manual intervention. This also enables the analytic to operate much more quickly and efficiently, providing the right information to the right people at the right time.

Integrators should evaluate the robustness of the software. For example, does it have trouble recognizing the plates of a particular country or state? Does it need to correlate license plates with other information about the vehicle itself? In many large-scale LPR deployments, it may be necessary to deploy software that can detect vehicle class (truck, car, van, etc.), color, make, and model along with the direction and speed of vehicle travel.

For example, in law enforcement applications, being able to correlate this data simultaneously with LPR is crucial to identifying vehicles that may have been used in the commission of a crime. Many cities have also opted to leverage autonomous traffic enforcement for a variety of moving violations (speeding in school zones, running red lights, etc.), which makes the ability to correlate license plates with vehicle data an absolute necessity. Some end-users may even want the ability to track vehicles traveling in convoy with one another, which is also possible with today’s solutions.

As with all analytics, LPR continues to improve and become more refined with each passing year. End-users need to be able to trust that solutions they are investing in will function reliably, so it is incumbent upon integrators to know what will and will not work in different applications to be able to offer the right products.

Eugene Beytenbrod is Global Director of Engineering at ISS. Learn more about the company at www.securityinfowatch.com/10214716.