Because it works through walls, WhoFi could alert staff to unauthorized presence in restricted areas, even in darkness or when cameras are obstructed. It could act as a silent alarm that simultaneously detects motion and identifies who is present.
Retailers could use WhoFi to track returning customers, monitor dwell times, or analyze movement, all without cameras. The system’s ability to recognize individuals through obstacles could produce deeper behavioral data than visual tracking alone.
Technical Hurdles
The technology still faces real technical and operational challenges, and everyday deployment will test its robustness. Real environments are noisy, filled with reflections, radio interference, and multiple people moving at once, and each of these factors can distort the signal and degrade accuracy.
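To make the noise problem concrete, here is a minimal sketch in Python. Everything in it is an assumption for illustration: real WhoFi signatures come from a neural encoder, not random vectors, and the 256-dimension size and cosine-similarity matcher are stand-ins. The point is only that as simulated interference grows, a stored signature and a fresh capture of the same person drift apart:

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 256-dimensional Wi-Fi signature for one person.
clean = rng.normal(size=256)

# Reflections, interference, and other movers act roughly like added noise
# on the channel measurements; modeled here as Gaussian noise of growing power.
for noise_scale in (0.1, 0.5, 1.0, 2.0):
    noisy = clean + rng.normal(scale=noise_scale, size=256)
    print(f"noise scale {noise_scale:.1f}: similarity {cosine(clean, noisy):.3f}")
```

As the noise scale approaches the signal's own magnitude, the similarity between the clean and corrupted signatures collapses, which is exactly the failure mode crowded, reflective spaces would induce.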
Enrollment also presents challenges. To re-identify someone, the system must already hold a stored “signature,” which means an initial calibration step. Changes in body shape, posture, or clothing could shift that signature enough to require re-enrollment or model re-training.
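The enrollment-then-match loop can be sketched the same way, again under illustrative assumptions (fixed-length embeddings, cosine similarity, a 0.8 acceptance threshold; none of these values come from the WhoFi system itself):

```python
import numpy as np

rng = np.random.default_rng(1)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

gallery = {}  # person_id -> enrolled signature (the stored "signature")

def enroll(person_id, signature):
    """One-time calibration: capture and store a reference signature."""
    gallery[person_id] = signature

def identify(signature, threshold=0.8):
    """Return the best-matching enrolled identity, or None below threshold."""
    best_id, best_sim = None, threshold
    for person_id, reference in gallery.items():
        similarity = cosine(reference, signature)
        if similarity > best_sim:
            best_id, best_sim = person_id, similarity
    return best_id

# Enrollment step.
alice = rng.normal(size=128)
enroll("alice", alice)

# A later capture drifts slightly (posture, clothing) but still matches;
# a very different signal falls below the threshold and goes unrecognized,
# which in practice would trigger re-enrollment.
print(identify(alice + rng.normal(scale=0.3, size=128)))  # expected: "alice"
print(identify(rng.normal(size=128)))                     # expected: None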
Still, for access control and security, WhoFi represents a potential breakthrough: a step toward ambient, device-free authentication.
Privacy and Legal Concerns
The more powerful WhoFi becomes, the more it challenges privacy and data protection laws. Its ability to detect, locate, and uniquely identify people without their cooperation collides directly with existing privacy frameworks.
Unlike cameras or card readers, WhoFi operates invisibly. People can be tracked simply by moving through a Wi-Fi zone. There’s no obvious indicator of surveillance: no camera lens, no beep, no opt-out. Over time, these systems could generate movement logs and behavioral profiles, linking physical presence across multiple spaces.
Even if names aren’t stored, unique signatures enable persistent tracking. That is the concerning part: no Personally Identifiable Information (PII) may be attached, yet a unique signature can still anchor a profile and a detailed movement history that could later be paired with PII.
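A short, hypothetical sketch shows the mechanism: even with no name on file, a stable signature can be hashed into a pseudonymous key that accumulates a movement log. The quantization, hashing, and log format here illustrate the idea; they do not describe any deployed system:

```python
import hashlib
from collections import defaultdict

import numpy as np

def pseudonym(signature):
    """Quantize an embedding and hash it into a stable pseudonymous key.
    No name is stored, yet the key repeats whenever the same person appears.
    (Exact-quantization hashing is brittle to noise; a real tracker would
    need a noise-tolerant matcher. This is purely illustrative.)"""
    quantized = (np.sign(signature) > 0).astype(np.uint8).tobytes()
    return hashlib.sha256(quantized).hexdigest()[:12]

movement_log = defaultdict(list)  # pseudonym -> [(time, location), ...]

def observe(signature, timestamp, location):
    movement_log[pseudonym(signature)].append((timestamp, location))

rng = np.random.default_rng(2)
signature = rng.normal(size=128)
observe(signature, "09:00", "lobby")
observe(signature, "09:05", "lab-2")   # same key again: a profile accumulates
observe(signature, "17:30", "exit-B")
print(dict(movement_log))
```

Nothing in that log is a name, yet one cross-reference, such as a badge swipe or a purchase, would be enough to attach the entire history to a person.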
Under the EU’s GDPR, biometric data used for unique identification counts as a “special category” of personal data, subject to strict rules. Collecting it requires explicit consent or a clear legal basis. WhoFi would almost certainly qualify.
That raises major compliance problems. How can consent be obtained from people who don’t know they are being scanned? How can individuals exercise their rights to access, correction, or deletion if they can’t even tell a system is collecting their data? If deployed carelessly, WhoFi systems could face lawsuits and bans similar to those that hit facial recognition vendors.
The same traits that make WhoFi attractive for access control also make it ripe for surveillance abuse. Cities or corporations could monitor people’s movements continuously, blurring the line between security and spying. In private homes or workplaces, WhoFi could be misused to follow individuals without their consent. And if access is denied or granted automatically, users may never know why, or how to challenge errors.
To align with privacy law, future deployments would need safeguards: visible disclosure, opt-in consent, encrypted data handling, and limits on range and retention. Systems should store only temporary, anonymized embeddings and destroy raw data after use. Independent audits could enforce accountability.
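What a retention safeguard might look like in code, as a sketch: a store that holds only derived embeddings under opaque keys and destroys them after a fixed time-to-live. The class name, TTL default, and API are all hypothetical:

```python
import time

class EphemeralSignatureStore:
    """Retention-safeguard sketch: keep only derived, anonymized embeddings
    under opaque keys, and expire them after a fixed time-to-live.
    Raw channel data is never retained."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # opaque key -> (embedding, expiry timestamp)

    def put(self, key, embedding):
        self._store[key] = (embedding, time.time() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        embedding, expiry = entry
        if time.time() > expiry:
            del self._store[key]  # expired: honor the retention limit
            return None
        return embedding

store = EphemeralSignatureStore(ttl_seconds=1)
store.put("visit-42", [0.1, 0.9])
time.sleep(1.1)
print(store.get("visit-42"))  # None: the embedding has been destroyed
```

Short, enforced retention windows like this would not eliminate the tracking risk, but they would make the persistent, cross-space profiles described above much harder to assemble.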
WhoFi hints at a future where the walls themselves can recognize you, where Wi-Fi becomes a biometric medium. For security and convenience, the implications are enormous. For privacy and civil liberties, they are equally profound.
The researchers at La Sapienza have opened a new frontier in sensing technology. What happens next depends on how policymakers, engineers, and society choose to govern it. The challenge isn’t just technical; it is ethical.
The question is no longer whether our surroundings can recognize us, but whether they should.