By Tracey Dowdy
Public schools in Lockport, New York, are planning to add facial recognition software to their security systems to alert staff and security when individuals who have been flagged as a security risk enter school grounds. The new software will check each face against the school’s database of expelled students, sex offenders and others who have posed a security threat.
Rob Glaser of RealNetworks announced last week that the company would offer a free version of its facial recognition system, SAFR, to schools nationwide. On the company's website, SAFR is described as "a highly accurate, AI-based facial recognition platform architected to economically scale at high performance with rapid processing to detect and match millions of faces in real time."
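At its core, the kind of matching such systems perform can be sketched simply: a camera frame is converted to a numeric "embedding" of the face, which is then compared against a watchlist of flagged individuals. The sketch below is purely illustrative and hypothetical; the function names, embedding values, and similarity threshold are assumptions for demonstration, not RealNetworks' actual implementation.

```python
import math

def cosine_similarity(a, b):
    # Measure how alike two face embeddings are (1.0 = identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(embedding, watchlist, threshold=0.9):
    """Return the watchlist entry most similar to the observed face,
    or None if no entry clears the threshold (illustrative values)."""
    best_name, best_score = None, threshold
    for name, reference in watchlist.items():
        score = cosine_similarity(embedding, reference)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical 3-number embeddings; real systems use hundreds of dimensions.
watchlist = {"flagged-person-1": [0.2, 0.9, 0.4]}
print(match_face([0.21, 0.88, 0.41], watchlist))  # close match -> flagged
print(match_face([0.9, 0.1, 0.1], watchlist))     # no match -> None
```

The threshold choice is the crux of the policy debate below: set it too loose and innocent people are flagged; too strict and real threats slip through.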
“We shake our heads that we’re having to deal with and talk about these kinds of security issues,” said Robert LiPuma, technology director for the Lockport district, “but here we are.”
It’s an effort to give security personnel advance notice of potential threats, in the hope that identifying troubled individuals early will reduce the number and impact of school shootings. Officials pointed to people like Nikolas Cruz, an expelled former student now charged with killing 17 at Marjory Stoneman Douglas High School in Parkland, Florida.
With cameras mounted throughout the building to track movement through the halls, “this would have identified (Cruz) as not being able to be in that building,” said Tony Olivo, a security consultant who recommended the system for Lockport.
But, not everyone is in favor of the plan. The New York Civil Liberties Union has asked the state Education Department to block the technology from all schools in New York state, asserting it would “have a chilling effect on school climate.” NYCLU Executive Director Donna Lieberman says, “Lockport is sending the message that it views students as potential criminals who must have their faces scanned wherever they go.”
Critics have also questioned the technology’s cost and effectiveness. Recent research by MIT and Stanford University found gender and skin-type bias in some AI facial recognition systems, meaning software like SAFR may not work as well on racial minorities and women.
Proponents of the plan, including district officials, insist the Aegis system they are installing will not create or log a database of student and faculty facial images that could be shared with the government or marketers. The program will be funded by a state technology bond and will not pull from the state education budget for staff or supplies.
Officials admit it’s neither a perfect plan nor a one-size-fits-all system. “There’s no system that’s going to solve every problem,” LiPuma said. “It’s another tool that we feel will give us an advantage to help make our buildings and our communities a little safer.”
Tracey Dowdy is a freelance writer based just outside Washington, D.C. After years working for non-profits and charities, she now freelances, edits and researches on subjects ranging from family and education to history and trends in technology. Follow Tracey on Twitter.