When students in suburban Atlanta returned to school for in-person classes amid the pandemic, they were required to mask up, like in many places across the US. Yet in this 95,000-student district, officials took mask compliance a step further than most.
Through a network of security cameras, officials harnessed artificial intelligence to identify students whose masks drooped below their noses.
“If they say a picture is worth a thousand words, if I send you a piece of video – it’s probably worth a million,” said Paul Hildreth, the district’s emergency operations coordinator. “You really can’t deny, ‘Oh yeah, that’s me, I took my mask off.’”
The school district in Fulton county had installed the surveillance network, made by Motorola-owned Avigilon, years before the pandemic shuttered schools nationwide in 2020. Out of fear of mass school shootings, districts in recent years have increasingly deployed controversial surveillance networks like cameras with facial recognition and gun detection.
With the pandemic, security vendors switched directions and began marketing their wares as a solution to stop the latest threat. In Fulton county, the district used Avigilon’s “no face mask detection” technology to identify students with their faces exposed.
Remote learning during the pandemic ushered in a new era of digital student surveillance as schools turned to AI-powered services like remote proctoring and digital tools that sift through billions of students’ emails and classroom assignments in search of threats and mental health warning signs. Back on campus, districts have rolled out tools like badges that track students’ every move.
But one of the most significant developments has been in AI-enabled cameras. Twenty years ago, security cameras were present in 19% of schools, according to the National Center for Education Statistics. Today, that number exceeds 80%. Powering those cameras with artificial intelligence makes automated surveillance possible, enabling things like temperature checks and the collection of other biometric data.
Districts across the country said they bought AI-powered cameras to fight the pandemic. But as pandemic-era protocols like mask mandates end, experts said the technology will remain. Some educators have stated plans to leverage pandemic-era surveillance tech for student discipline, while others hope AI cameras will help them identify youth carrying guns.
The cameras have faced sharp resistance from civil rights advocates who question their effectiveness and argue they trample students’ privacy rights.
Noa Young, a 16-year-old high school junior in Fulton county, said she knew that cameras monitored her school but wasn’t aware of their hi-tech features like mask detection. She agreed with the district’s now-expired mask mandate but felt that educators should have been more transparent about the technology.
“I think it’s helpful for Covid stuff but it seems a little intrusive,” Young said in an interview. “I think it’s strange that we were not aware of that.”
Outside Fulton county, educators have used AI cameras to fight Covid on multiple fronts.
In Rockland regional school unit 13 in Maine, officials used federal pandemic relief money to procure a network of cameras with “face match” technology for contact tracing. The cameras, made by California-based security company Verkada, allow the 1,600-student district to identify students who came in close contact with classmates who tested positive for Covid-19.
At a district in suburban Houston, officials spent nearly $75,000 on AI-enabled cameras from Hikvision, a surveillance company owned in part by the Chinese government, and deployed thermal imaging and facial detection to identify students with elevated temperatures and those without masks.
The cameras can screen as many as 30 people at a time and are therefore “less intrusive” than slower processes, said Ty Morrow, the Brazosport independent school district’s head of security. The checkpoints have helped the district identify students who later tested positive for Covid-19, Morrow said, although a surveillance testing company has argued Hikvision’s claim of accurately scanning 30 people at once is not possible.
“That was just one more tool that we had in the toolbox to show parents that we were doing our due diligence to make sure that we weren’t allowing children or staff with Covid into the facilities,” he said.
Yet it’s this mentality that worries consultant Kenneth Trump, the president of Cleveland-based National School Safety and Security Services. Security hardware for the sake of public perception, the industry expert said, is simply “smoke and mirrors”.
“It’s creating a façade,” he said. “Parents think that all the bells and whistles are going to keep their kids safer and that’s not necessarily the case. With cameras, in the vast majority of schools, nobody is monitoring them.”
When the Fulton county district upgraded its surveillance camera network in 2018, officials were wooed by Avigilon’s AI-powered “appearance search”, which allows security officials to sift through a mountain of video footage and identify students based on characteristics like their hairstyle or the color of their shirts. When the pandemic hit, the company’s mask detection became an attractive add-on, Hildreth said.
He said the district didn’t actively advertise the technology to students, but they probably became aware of it quickly once classmates were called out for breaking the rules. He doesn’t know students’ opinions about the cameras – nor does he seem to care.
“I wasn’t probably as much interested in their reaction as much as their compliance,” Hildreth said. “You don’t have to like something that’s good for you, but you still need to do it.”
A Fulton county district spokesperson said they were unaware of any instances where students were disciplined because the cameras caught them without masks.
Among the school security industry’s staunchest critics is Sneha Revanur, a 17-year-old high school student from San Jose, California, who founded the youth-led group Encode Justice to highlight the dangers artificial intelligence poses to civil liberties.
Revanur said she was concerned by districts’ decisions to implement surveillance cameras as a public health strategy and that the technology in schools could result in harsher discipline for students, particularly youth of color.
Verkada offers a cautionary tale. Last year, the company suffered a massive data breach when a hack exposed the live feeds of 150,000 surveillance cameras, including those inside Tesla factories, jails and Sandy Hook elementary school in Newtown, Connecticut. The Newtown district, which suffered a mass school shooting in 2012, said the breach didn’t expose compromising information about students. But the vulnerability hasn’t deterred some educators from contracting with the California-based company.
After a back-and-forth with a company spokesperson, Verkada would not grant an interview or respond to a list of written questions.
Revanur called the Verkada hack at Sandy Hook elementary a “staggering indictment” of educators’ rush for “dragnet surveillance systems that treat everyone as a constant suspect” at the expense of student privacy. Constant monitoring, she argued, “creates this culture of fear and paranoia that truly isn’t the most proactive response to gun violence and safety concerns”.
In Fayette county, Georgia, the district spent about $500,000 to buy 70 Hikvision cameras with thermal imaging to detect students with fevers. But it ultimately backtracked and disabled them after community uproar over their efficacy and Hikvision’s ties to the Chinese government. In 2019, the US government imposed a trade blacklist on Hikvision, alleging the company was implicated in China’s “campaign of repression, mass arbitrary detention and high-technology surveillance” against Muslim ethnic minorities.
The school district declined to comment. In a statement, a Hikvision spokesperson said the company “takes all reports regarding human rights very seriously” and has engaged governments globally “to clarify misunderstandings about the company”. The company is “committed to upholding the right to privacy”, the spokesperson said.
Meanwhile, regional school unit 13’s decision to use Verkada security cameras as a contact tracing tool could run afoul of a 2021 law that bans the use of facial recognition in Maine schools. The district did not respond to requests for comment.
Michael Kebede, the ACLU of Maine’s policy counsel, cited recent studies on facial recognition’s flaws in identifying children and people of color and called on the district to reconsider its approach.
“We fundamentally disagree that using a tool of mass surveillance is a way to promote the health and safety of students,” Kebede said in a statement. “It is a civil liberties nightmare for everyone, and it perpetuates the surveillance of already marginalized communities.”
‘We’ve got some false positives’
In Fulton county, school officials wound up disabling the face mask detection feature in cafeterias because it was triggered by people eating lunch. Other times, it identified students who pulled their masks down briefly to take a drink of water.
In suburban Houston, Morrow ran into similar hurdles. When white students wore light-colored masks, for example, the face detection sounded alarms. And if students rode bikes to school, the cameras flagged their elevated temperatures.
“We’ve got some false positives but it was not a failure of the technology,” Hildreth said. “We just had to take a look and adapt what we were looking at to match our needs.”
With those lessons learned, Hildreth said he hoped to soon equip Fulton county campuses with AI-enabled cameras that identify students who bring guns to school.
In a post-pandemic world, Albert Fox Cahn, founder of the non-profit Surveillance Technology Oversight Project, worries the entire school security industry will take a similar approach.
“With the pandemic hopefully waning, we’ll see a lot of security vendors pivoting back to school shooting rhetoric as justification for the camera systems,” he said. Due to the potential for errors, Cahn called the embrace of AI surveillance in schools “really alarming”.
This report was published in partnership with The 74, a non-profit, non-partisan news site covering education in America.