In recent years, a growing number of schools have turned to artificial intelligence to monitor students, with the stated goal of preventing violence. However, an investigation by The Seattle Times and The Associated Press has revealed the privacy and security risks that come with these systems.
One of the core issues is how parents are made aware of these technologies. Tim Reiland, of Clinton, Mississippi, was surprised to learn that his children’s previous schools in Oklahoma had used surveillance technology to monitor students. He said he had no idea.
These AI-powered systems are designed to monitor school-issued devices around the clock, searching for potential threats and signs of distress in students. The goal is to address rising concerns about student mental health and the threat of school shootings.
However, the investigation revealed significant privacy and security risks in how such technology is used. Reporters from The Seattle Times and The Associated Press obtained almost 3,500 unredacted student documents from Vancouver Public Schools in Washington through a public records request. The documents included sensitive material such as students’ writings about depression, suicidal thoughts and struggles with bullying.
Experts have warned that the failure to redact student names and the weak firewall protection in these systems create a significant security risk. Even though the technology allows counselors to identify at-risk students and offer help, the Vancouver case highlights some of the unintended negative consequences of AI-driven surveillance in schools.
Among those consequences: “In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe,” the report notes.
One of the companies providing this software is Gaggle Safety Management. Its CEO, Jeff Patterson, argues that not monitoring children is akin to leaving them unsupervised in a “digital playground.” The technology has grown in popularity, especially since the pandemic, when many students received school-issued laptops or tablets. According to a U.S. Senate investigation, more than 7,000 schools or districts were using surveillance products from GoGuardian, another provider, in 2021.
Vancouver Public Schools, which uses Gaggle, apologized for the document release but stressed the importance of the system in protecting students’ welfare. Principal Andy Meyer stated: “I don’t think we could ever put a price on protecting students.”
Gaggle uses machine learning algorithms to scan students’ online activity on school-issued devices. The system identifies phrases associated with bullying, self-harm, violence or suicide. If the system detects potential problems, it sends a screenshot to human reviewers, who alert the school in serious cases. In emergencies, Gaggle contacts school officials directly or, if necessary, law enforcement.
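As an illustration only, the Python sketch below shows the general shape of such a pipeline: automated phrase matching produces candidate alerts, which are then queued for human reviewers to escalate. The phrase lists, the `Alert` and `ReviewQueue` names, and the escalation logic are hypothetical assumptions for clarity; none of it reflects Gaggle’s actual system.

```python
# Hypothetical sketch of a keyword-flagging pipeline with a human-review step.
# Phrases, class names, and escalation rules are illustrative assumptions only,
# not Gaggle's real implementation.

from dataclasses import dataclass, field
from typing import Dict, List

# Illustrative phrase lists; a real system would use far richer models and context.
FLAGGED_PHRASES: Dict[str, List[str]] = {
    "self_harm": ["hurt myself", "end my life"],
    "violence": ["bring a weapon"],
    "bullying": ["everyone hates you"],
}

@dataclass
class Alert:
    device_id: str      # school-issued device that produced the text
    category: str       # which phrase list matched
    excerpt: str        # the matched phrase, for the reviewer's context

@dataclass
class ReviewQueue:
    """Stand-in for the human-review step described in the article."""
    pending: List[Alert] = field(default_factory=list)

    def submit(self, alert: Alert) -> None:
        self.pending.append(alert)

    def review_all(self) -> List[Alert]:
        # A human reviewer decides which alerts reach school officials;
        # here every queued alert is simply handed over for that judgment.
        reviewed, self.pending = self.pending, []
        return reviewed

def scan_text(device_id: str, text: str) -> List[Alert]:
    """Return a candidate alert for each flagged phrase found in the text."""
    lowered = text.lower()
    return [
        Alert(device_id, category, phrase)
        for category, phrases in FLAGGED_PHRASES.items()
        for phrase in phrases
        if phrase in lowered
    ]

if __name__ == "__main__":
    queue = ReviewQueue()
    for alert in scan_text("device-123", "Sometimes I want to end my life."):
        queue.submit(alert)
    for alert in queue.review_all():
        print(f"Escalate to reviewer: {alert.category} ({alert.excerpt!r})")
```

The point of the sketch is the hand-off: the automated match only generates candidates, and it is the human-review step that determines which alerts reach school officials or, in emergencies, law enforcement.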
The technology, however, can also produce false alarms. A student may be flagged, for instance, for writing a story that contains violent imagery. School officials say they must act on every alert, even those that turn out to be mistaken, so that potential issues are addressed promptly.
Surveillance technology’s long-term consequences are unclear. A 2023 RAND study found only “scant evidence” of either benefits or risks. Study co-author Benjamin Boudreaux said, “If you don’t have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention.”
Glenn Thompson, a graduate of Durham School of the Arts, raised concerns about student privacy rights, recalling a classmate who was “blindsided” when Gaggle alerted school officials to an assignment the student had been promised would remain confidential.
The debate over school surveillance technology is ongoing, and parents are often unaware that such monitoring exists. While the technology can help teachers intervene, critics emphasize that teens need private spaces online to explore their thoughts and emotions. According to Boudreaux, “The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in.”