Penlink VIA Visual Intelligence Platform

May 29, 2025
VIA offers visual data analysis for public safety and security teams.

Penlink announces the launch of VIA, its next-generation Visual Intelligence Platform.

VIA gives analysts and investigators access to media intelligence by transforming images and videos into actionable insights. The platform enables users to search and analyze digital media from open-source intelligence (OSINT), evidentiary, and forensic sources. Investigators can detect relevant images and frames through contextual and natural language queries, extract geospatial intelligence, identify objects and relationships, interpret scene context, verify media sources, and trace how visual content spreads across networks.

“VIA represents a major step forward in making unstructured visual data searchable, analyzable, and trustworthy,” said Shay Attias, Chief Technology Officer at Penlink. “We’ve engineered it as a multimodal system, leveraging the latest generative AI models to extract and correlate insights across visual, textual, and spatial domains, unlocking layers of intelligence that were previously inaccessible.” 

“Penlink is committed to building solutions that empower our law enforcement, defense, intelligence, and enterprise partners,” said Peter Weber, CEO of Penlink. “With VIA, we are unlocking the potential of visual data at scale—bringing clarity to complex investigations and accelerating the path from raw media to real understanding and better conclusions.” 

VIA is now available to selected partners and will be showcased during upcoming innovation briefings and public safety summits.
