Controversy over facial recognition tech grabs headlines as GSX 2019 opens

Sept. 11, 2019
Despite advancements in the technology and its expanding applications, many ethical and legal questions still surround its use

GSX 2019 began this week with several of the ASIS Councils meeting to discuss objectives, succession planning and the move to "virtualize" interest in the ASIS Council verticals through "communities," ASIS International's version of member-accessed user forums. With this more centralized Council focus, deliverables become more widely accessible to the membership, improving their value.

Day 1 of the conference featured a number of unique education sessions, including one on "Managing Diverse Collections of Visual Data," which examined how video surveillance and acoustic detection continue to present law enforcement with a staggering amount of information. Public/private partnerships leverage industrial IP camera systems so that NGOs can securely share digital multimedia content with agencies.

Recently, the use of facial recognition systems has come under fire in several West Coast cities and has even been the subject of law enforcement bans. In other words, even with child trafficking unfortunately still in the news, law enforcement would not be able to leverage facial recognition systems previously used to protect children.

Several companies, like Blue Line Technology, have deployed facial recognition systems that have yielded measurable crime reduction outside and inside the facilities they protect, such as those recently deployed in convenience stores across St. Louis. Additional exhibitors at GSX, like FaceFirst and AnyVision, also offer comprehensive solutions.

As residential consolidated video/audio/communications/energy devices like Ring and Nest continue to increase market penetration, widely supported by more robust and secure providers like AWS, new AI-based platforms are enjoying rapid growth. The mixed recognition market already has CAGR growth higher than any other electronic security category.

Amazon's Ring line of video doorbells and home surveillance equipment is particularly popular with one key group: police. More than 400 law enforcement agencies around the country have partnered with Ring to use its apps and help market its security cameras to residents in the name of safer neighborhoods. In fact, I have used several "shared" Ring cameras via the "Neighborhoods" public/private app.

"The nature of Ring's products and its partnerships with police departments [Ring Neighborhoods] raise serious privacy and civil liberties concerns," Sen. Ed Markey (D-Mass.) said in a letter addressed to Amazon CEO Jeff Bezos.

Yesterday, CALmatters.org, a nonprofit, nonpartisan media venture covering California policy and politics, reported that, facing police opposition, a bill that originally sought to permanently ban California law enforcement agencies from using facial recognition technology in police body cameras has been drastically watered down as it heads for its final votes in the state capitol.

Assemblyman Phil Ting revised the bill last month to limit the ban to just seven years. He amended it again Friday to last for three years.

“We talked to a number of senators and they had a concern with the length of time, so we decided to shorten the length of time,” said Ting, a San Francisco Democrat who introduced the bill saying the technology amounted to an invasion of privacy that could subject innocent people to police surveillance.

Police have been testing the technology in Oregon, where it’s allowed them to compare suspects’ faces to hundreds of thousands of mugshots and, in some cases, find their Facebook page, visit their home or make an arrest, the Washington Post reported earlier this year. California law enforcement agencies do not currently use the technology, but Ting said his bill is necessary to get ahead of its introduction as facial recognition software becomes more common at airports and schools.

“We’re doing the legislation to be proactive, because we know the software is not accurate and should not be deployed for law enforcement purposes,” Ting said. “We wanted to make sure we didn’t end up in a situation where we were falsely accusing or falsely arresting people.”

Law enforcement groups like the IACP have objected to the measure, arguing it will impede them from using emerging technology that could enhance public safety.

“Huge events such as the annual Coachella Music and Arts Festival, the upcoming Los Angeles Olympics, World Cup Soccer Tournament, Rose Bowl, Disneyland and scores of popular tourist attractions should have access to the best available security-including the use of body cameras and facial recognition technology,” the Riverside Sheriffs’ Association argued in a bill analysis. “By banning this technology, California will be announcing to the nation and world that it doesn’t want our law enforcement officers to have the necessary tools they need to properly protect the public and attendees of these events.”

Ting’s hometown of San Francisco became the first major city in the nation to ban facial recognition in police body cameras when it passed an ordinance in May. Oakland followed suit in July. The statewide bill faces final votes on the Senate and Assembly floors this week.

And if this legislative back-and-forth over the use of facial recognition in the U.S. weren't enough, a new GDPR enforcement tally is out in the EU, and facial recognition drew the biggest fine.

According to the BBC, a watchdog has penalized a local authority in Sweden for trialing facial recognition on high-school students to keep track of attendance.

The Swedish Data Protection Authority (DPA) found that Skelleftea's local authority had unlawfully processed sensitive biometric data and had failed to complete an adequate impact assessment, which would have included consulting the regulator and gaining prior approval before starting the trial. In addition, consent was not "balanced," as students may have felt compelled to submit to facial recognition registration.

The trial involved tracking 22 students over three weeks and detecting when each pupil entered a classroom. This is the first time Sweden has issued a fine under GDPR. The General Data Protection Regulation, which came into force last year, classes facial images and other biometric information as a special category of data, with added restrictions on its use. The DPA indicated that the fine would have been bigger had the trial been longer.

According to technology magazine ComputerSweden, Swedish authorities decided to investigate after reading media reports of Anderstorp's High School's pilot project, which took place in autumn 2018 and had been so successful that the local authority was considering extending it.

Teachers had been spending 17,000 hours a year reporting attendance, and the authority had decided to see whether facial-recognition technology could speed up the process.

About the Author:

Steve Surfaro is Chairman of the Public Safety Working Group for the Security Industry Association (SIA) and has more than 30 years of security industry experience. He is a subject matter expert in smart cities and buildings, cybersecurity, forensic video, data science, command center design and first responder technologies. Follow him on Twitter, @stevesurf.