This article originally appeared in the September 2020 issue of Security Business magazine.
Many years ago, when I was a Judge Advocate (JAG) in the United States Air Force, I led an investigation into a young airman who went missing (known in the military as Absent Without Leave, or AWOL) from an airbase in northern New York State.
Hoping to gather clues about the airman’s disappearance, we interviewed his wife. Among other things, she stated that, approximately a week before the airman disappeared, a rock band made up of his high school friends stayed with the airman and his wife at their home. The band was touring – playing at concert venues across the country – and had gained some fame, having appeared on television in the recent past. The wife would not confirm it, but we suspected that her husband had run away to join the band as a roadie on their tour.
Whatever this young airman’s motives, he left his family – including his wife and a young baby – and failed to report for duty, a crime under the Uniform Code of Military Justice. Thus, we had no choice but to find him and bring him back to his military unit. Acting on our suspicion that he was touring with the band, we located the tour schedule and hoped we could apprehend the missing airman at the next concert – scheduled for a venue in central Illinois.
As the military is prohibited under well-established federal law from serving in a law enforcement capacity in civilian settings, we had to send the U.S. Marshals to the concert. We prepared them by sending a picture of the airman (actually, we faxed it!), which they printed and took with them. At the end of the concert (yes, they actually stayed for the concert), the Marshals moved in, held the picture up to the face of the suspect, and confirmed his identity (though he denied it was his picture) before apprehending him. The airman was eventually returned to my base in Massachusetts – where he was criminally charged and, ultimately, involuntarily discharged from military service.
Many lessons can be drawn from this story – don’t leave your family; don’t leave your post; don’t let your rock band buddies publish their concert schedule online. However, it is cited here as a very primitive example of facial recognition. Law enforcement agencies around the world have been using photographs to capture criminal suspects for decades. So, facial recognition is not new – but automated facial recognition on a mass scale is new.
What if our young airman went AWOL today? What if he was wearing a mask at the concert? What if everyone was wearing a mask at the concert?
NIST Study on Masks and Facial Recognition
In my July 2020 column (www.securityinfowatch.com/21143109), I wrote about a landmark study published by the National Institute of Standards and Technology (NIST) alleging bias in facial recognition algorithms. With COVID-19 in full swing, NIST is at it again – recently publishing the first in a series of reports on the performance of facial recognition algorithms on faces partially covered by masks.
The study, known as an Interagency Report (NISTIR 8311), was published in late July 2020 and conducted in collaboration with the Department of Homeland Security’s Science and Technology Directorate, Office of Biometric Identity Management, and Customs and Border Protection. Bluntly, the results were not good. Among other things, this preliminary study found:
Of the 89 commercially available facial recognition algorithms tested, error rates ranged from 5% to 50% – higher than in NIST’s prior study of unmasked images.
Masked images more frequently prevented algorithms from extracting a face’s features well enough to make an effective comparison.
The more of the nose a mask covers, the lower the algorithm’s accuracy.
While false negatives increased, false positives remained stable or modestly declined.
The shape and color of a mask may matter. Algorithm error rates were generally lower with round masks and higher with black masks.
While these results are concerning to the extent that facial recognition technologies are deployed in environments where masked people congregate (as most places should be these days), NIST acknowledged that the study was limited in multiple respects:
1. None of the algorithms tested were specifically designed to handle face masks. That is a substantial caveat to these results – because the software can and will be adjusted to account for masks; in fact, NIST intends to conduct such a study later this year.
2. None of the masks used in the sample images for testing were real. Rather, NIST used digital creations – variants that included differences in shape, color and nose coverage. Again, it is expected that images with real masks will be tested in future NIST rounds of this study.
3. None of the images tested were matched against the images of others. Instead, the NISTIR 8311 study assessed how well each algorithm matched digitally-applied face masks with photos of the same person without a mask. Future study rounds will test one-to-many searches and add other variations designed to further broaden the results.
4. Many leading technologies were not offered for testing. As with NIST’s landmark Dec. 2019 study, technologies such as Amazon’s cloud-based Rekognition technology were not included.
NIST has acknowledged these limitations and expects the technology to continue to improve – particularly with respect to face masks. Users, whether they are private companies or public agencies, should be discerning in their use of this technology – particularly in a COVID-19 environment.
Just like that young airman did not represent all military personnel (most of whom make good choices every day), not all facial recognition technologies are created equal. Those with high error rates – with masks or without – should be exposed and discarded. Those that can harness this fantastic power – ethically, legally and responsibly – should be embraced and put to good use to promote the safety and security of our citizens.
Timothy J. Pastore, Esq., is a Partner in the New York office of Montgomery McCracken Walker & Rhoads LLP (www.mmwr.com), where he is Vice-Chair of the Litigation Department. Before entering private practice, Mr. Pastore was an officer and Judge Advocate General (JAG) in the U.S. Air Force and a Special Assistant U.S. Attorney with the U.S. Department of Justice. Reach him at (212) 551-7707 or by e-mail at firstname.lastname@example.org.