Why Edge AI Is Gaining Ground in Physical Security

July 15, 2025
In this executive Q&A, Hailo’s Danya Golan explains how edge-based AI is advancing real-time video analytics, enhancing privacy and reshaping security deployments across sectors like retail and critical infrastructure.

The move to process artificial intelligence (AI) workloads locally is beginning to reshape how security systems are designed and deployed. For video surveillance and analytics, in particular, edge-based AI is emerging as a practical and powerful alternative to cloud-centric architectures. By enabling real-time processing directly on devices, organizations can improve responsiveness, reduce bandwidth costs, and strengthen data privacy protections. 

In this executive Q&A, Danya Golan, vice president of marketing at Israeli AI processor company Hailo, shares her insights on how edge AI is being adopted in the physical security market. She discusses the key drivers behind the trend, real-world use cases in retail and critical infrastructure, and the role that edge computing will play in the convergence of AI, IoT, and physical security over the next several years.

How would you describe the current role of edge-based AI in the security technology ecosystem, particularly for video surveillance and analytics?

Edge AI is quickly becoming a cornerstone of modern security systems, especially in video surveillance and video management systems (VMS). Rather than sending raw footage to the cloud for processing, AI at the edge now enables real-time analysis directly on the device. This can include identifying people and objects, detecting behaviors and anomalies, and understanding situations. This shift is reshaping the ecosystem, allowing security systems to evolve into intelligent agents that can flag hazardous or suspicious events instantly, reducing reliance on centralized infrastructure and enabling faster response times.

Why edge over cloud? Understanding the shift

What trends are driving increased interest in performing AI inference at the edge rather than relying solely on cloud-based solutions? 

Several key trends are driving increased interest in performing AI inference at the edge rather than relying solely on cloud-based solutions. 

First, bandwidth limitations and latency concerns, especially in security and video processing applications, make cloud-based inference impractical for real-time response. Edge AI addresses this by processing data locally, enabling immediate decision-making without the delays associated with cloud round-trips. 

Second, data privacy and operational resilience are increasingly important. Edge processing allows sensitive information to remain on-device, reducing exposure risks and ensuring systems continue functioning even if cloud connectivity is lost. The rise of powerful, compact, and accessible edge AI hardware is also accelerating adoption, making it feasible to deploy intelligent systems at scale. 

Finally, emerging use cases like AI-powered video enhancement and generative AI for scene understanding, summarization, and real-time alerts leveraging vision-language models (VLMs) and large language models (LLMs) require local inference for performance, privacy, and reliability, further accelerating the shift to edge-based AI. 
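The latency point above can be illustrated with a minimal sketch. Everything here is a simulated stand-in, not a real camera or accelerator API: in an actual deployment, `capture_frame` and `run_local_inference` would call a vendor's device SDK. The sketch only shows the structural idea that detection and decision happen in the same local loop, with no network round-trip on the critical path.

```python
import time

# Hypothetical stand-ins: a real deployment would call an on-device
# accelerator runtime (vendor SDK) instead of these functions.
def capture_frame(camera_id):
    """Grab the latest frame from a local camera buffer (simulated)."""
    return {"camera": camera_id, "ts": time.time(), "pixels": b"..."}

def run_local_inference(frame):
    """Run object detection on-device; no network involved (simulated)."""
    return [{"label": "person", "confidence": 0.91}]

def edge_pipeline(camera_id, alert_threshold=0.8):
    frame = capture_frame(camera_id)
    detections = run_local_inference(frame)
    # The decision happens locally, so alert latency is bounded by
    # inference time alone, not by upload bandwidth or cloud availability.
    return [d for d in detections if d["confidence"] >= alert_threshold]

print(edge_pipeline("cam-01"))
```

A cloud-centric design would insert an upload, a remote inference call, and a response download between capture and decision; the edge version collapses those steps into one on-device pass.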

What are some of the operational or security advantages that edge AI can offer integrators and end users — especially in environments with connectivity, bandwidth or latency constraints? 

For integrators and end users, edge AI lowers costs by minimizing the bandwidth and compute power required for large workloads. Edge AI processors offer real-time processing and limit the risk of service interruptions during network outages or connectivity issues. This makes edge AI particularly useful in remote facilities, retail, hospitality, or other environments with unpredictable network conditions. Edge AI also supports simple integration and scaling, allowing intelligence to be added across distributed systems without requiring full infrastructure overhauls. 

Real-world use cases across key sectors

In what ways is edge AI being leveraged to support security-related use cases in sectors like retail, hospitality, and critical infrastructure?

We’re seeing edge AI being leveraged in retail and hospitality in a number of ways, particularly as businesses navigate persistent challenges like theft, shrink and operational inefficiencies. 

To support security applications in retail and hospitality, edge AI can refine theft and shrink prevention, using real-time monitoring and analytics to track customer activity and detect suspicious behavior. AI at the edge can also be used for cashier-less checkout, enabling an autonomous shopping or hospitality experience powered by real-time AI processing. That real-time processing allows these experiences to remain secure without requiring human intervention. 

For example, the Hailo-10H was recently integrated into the HP AI Accelerator M.2 Card, which will be used for these types of applications in retail and hospitality environments, enhancing operational efficiency and minimizing theft. 

What factors should security stakeholders consider when deciding which AI workloads belong at the edge versus the cloud? 

The decision is complex, and balancing edge versus cloud processing can be an initial challenge for many businesses. Most organizations ultimately adopt a hybrid model, with the edge managing time-sensitive tasks such as object detection and motion analytics and the cloud handling more data-heavy tasks such as long-term trend analysis. The right split depends on latency and bandwidth requirements, connectivity availability, and data privacy obligations. 

For generative AI tasks, larger models will need to run in the cloud, while the edge is a good alternative for small-to-medium models for the reasons mentioned above: cost, connectivity, and privacy. 
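The hybrid criteria described above can be sketched as a toy decision helper. The function name, parameters, and all thresholds are illustrative assumptions for the sake of the example, not prescriptions from Hailo or anyone else; real placement decisions would weigh many more factors.

```python
# Toy workload-placement helper reflecting the hybrid criteria above:
# latency sensitivity, privacy, model size, and connectivity.
# All thresholds are illustrative assumptions.

def place_workload(max_latency_ms, data_is_sensitive,
                   model_size_params, reliable_uplink):
    # Hard real-time tasks (object detection, motion analytics)
    # cannot tolerate a cloud round-trip.
    if max_latency_ms < 100:
        return "edge"
    # Privacy or sovereignty rules may forbid transmitting raw footage.
    if data_is_sensitive:
        return "edge"
    # Very large generative models typically exceed edge accelerators.
    if model_size_params > 20e9:
        return "cloud"
    # Unreliable connectivity favors local processing for resilience.
    if not reliable_uplink:
        return "edge"
    # Data-heavy, latency-tolerant tasks (long-term trend analysis)
    # are a natural fit for the cloud.
    return "cloud"

print(place_workload(max_latency_ms=50, data_is_sensitive=True,
                     model_size_params=8e6, reliable_uplink=True))    # edge
print(place_workload(max_latency_ms=60_000, data_is_sensitive=False,
                     model_size_params=70e9, reliable_uplink=True))   # cloud
```

The ordering of the checks encodes the priorities from the answer: latency and privacy constraints rule out the cloud first, and only latency-tolerant, non-sensitive workloads fall through to it.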

How does local AI processing help organizations address data privacy, sovereignty, or compliance concerns? 

It’s critical to balance data privacy with public safety and security, and local AI processing supports that balance. Processing data at the edge means that sensitive content, such as customer information, names, and faces, is never transmitted to the cloud. This limits cybersecurity risk and reduces the chances of bad actors intercepting private data. 

Processing data at the edge minimizes exposure risk, reduces liability, and maintains greater control over where data is handled. 
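The privacy boundary described above can be sketched in a few lines: raw frames and identifying content stay on the device, and only anonymized event metadata would ever cross the network. The field names and the salted-hash scheme are illustrative assumptions, not a description of any specific product.

```python
# Sketch of an on-device privacy boundary: raw pixels and identities
# stay local; only anonymized metadata is prepared for transmission.
# Field names and the hashing scheme are illustrative assumptions.
import hashlib
import time

def to_shareable_event(detection, device_id):
    """Strip identifying content before anything crosses the network."""
    return {
        # A device-scoped hash lets operators correlate events over
        # time without transmitting the underlying identity.
        "track_id": hashlib.sha256(
            f"{device_id}:{detection['track']}".encode()
        ).hexdigest()[:12],
        "event": detection["event"],
        "ts": detection["ts"],
        # Deliberately omitted: raw pixels, face crops, names.
    }

raw = {"track": 42, "event": "loitering", "ts": time.time(),
       "face_crop": b"\x89PNG..."}  # never leaves the device
print(to_shareable_event(raw, "cam-01"))
```

The key design choice is that the shareable record is built by explicitly copying approved fields, so new sensitive fields added to the raw detection can never leak by default.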

What are some of the most common challenges or misconceptions that security leaders face when exploring edge AI deployments? 

Security leaders often face challenges in understanding exactly where they can incorporate edge AI. To date, most AI processing has taken place in the cloud, so it can be difficult to understand how purpose-built edge AI chips can manage their workload, and how edge processing can outperform general-purpose cloud compute for certain tasks. 

Another common misconception is that edge AI hardware is out of reach. In reality, today’s edge AI hardware is accessible and scalable, offering businesses the opportunity to bring powerful security processing to the front lines and deliver next-generation operational efficiency. 

The future of AI at the edge 

How do you see the convergence of AI, IoT and physical security evolving over the next 2-3 years? 

We’re heading towards an era where intelligence at the edge is becoming the new standard for security. Local processing is poised to allow edge-powered AI devices to operate with greater autonomy, identifying potential threats without requiring cloud input. As AI accelerators become more compact and accessible, more systems will shift toward on-device intelligence. These types of applications will make the convergence of AI, IoT, and physical security both practical and cost-effective, and ultimately will transform front- and back-of-house operations.

About the Author

Rodney Bosch | Editor-in-Chief/SecurityInfoWatch.com

Rodney Bosch is the Editor-in-Chief of SecurityInfoWatch.com. He has covered the security industry since 2006 for multiple major security publications. Reach him at [email protected].