AI Surveillance in Schools: A Balancing Act Between Safety and Privacy
Across the United States, schools are increasingly turning to artificial intelligence (AI) to monitor students’ online activities. The technology, which scans for potential threats and signs of distress, is meant to keep children safe in an era marked by a youth mental health crisis and the threat of school shootings. However, these tools raise significant concerns about student privacy and data security.
In a recent case, Vancouver Public Schools in Washington state inadvertently released almost 3,500 sensitive student documents, exposing the potential risks associated with these surveillance systems. The documents, obtained through a public records request by reporters from The Seattle Times and The Associated Press, revealed personal student communications about depression, heartbreak, suicide, and other sensitive topics. This incident highlights the unintended consequences of these monitoring tools and the potential for sensitive information to be compromised.
Unintended Consequences
While surveillance technology promises to protect students, it can lead to unforeseen negative outcomes. Critics argue that the systems have the potential to out LGBTQ+ students and erode trust between students and school staff. Additionally, there’s growing concern that these tools are not always effective in preventing harm.
Gaggle Safety Management is one of several companies, including GoGuardian and Securly, that offer AI-assisted web surveillance. Gaggle’s CEO and founder, Jeff Patterson, defends the practice, stating that not monitoring children is like leaving them on a “digital playground without fences or recess monitors.” Approximately 1,500 school districts nationwide utilize Gaggle’s software, tracking the online activities of about 6 million students. The demand for this technology surged during the pandemic when schools distributed laptops and tablets to nearly every student.
The released documents from Vancouver Public Schools showed that students use school devices for far more than coursework. The material included poems, college essays, and even excerpts from role-play sessions with AI chatbots. The district’s failure to redact student names, combined with insufficient security measures, raised serious cybersecurity concerns. Despite these risks, district officials maintain the technology is essential for safeguarding student well-being. “I don’t think we could ever put a price on protecting students,” said Andy Meyer, principal of Vancouver’s Skyview High School.
Dacia Foster, a parent in the Vancouver district, supports efforts to keep students safe but worries about the privacy violations. “That’s not good at all,” Foster said. “But what are my options? What do I do? Pull my kid out of school?”
How Surveillance Works
Gaggle’s AI algorithm scans students’ online searches and written content on school-issued devices 24/7, or whenever they log into their school accounts on personal devices. This algorithm identifies potential indicators of problems like bullying, self-harm, suicide, or school violence, and then flags these instances for review. If human reviewers at Gaggle confirm the issue, the company alerts school officials, and in cases of imminent danger, Gaggle may contact law enforcement for a welfare check.
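Gaggle does not publish its detection methods, but the workflow described here (automated scanning, machine flagging, human review, then escalation) follows a tiered triage pattern. A minimal Python sketch of that pattern follows; the category names, indicator phrases, and severity levels are invented for illustration, and a real system would rely on trained classifiers rather than keyword lists:

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    NONE = 0        # dismissed: no action taken
    REVIEW = 1      # queued for follow-up by school staff
    IMMINENT = 2    # possible imminent danger: alert officials, possibly police

# Hypothetical indicator phrases, purely illustrative.
INDICATORS = {
    "self_harm": ["hurt myself", "end my life"],
    "violence": ["bring a gun", "hurt them"],
    "bullying": ["everyone hates you"],
}

@dataclass
class Flag:
    category: str
    excerpt: str
    severity: Severity

def scan_document(text: str) -> list[Flag]:
    """Automated first pass: flag any phrase matching an indicator list."""
    lowered = text.lower()
    return [
        Flag(category, phrase, Severity.REVIEW)
        for category, phrases in INDICATORS.items()
        for phrase in phrases
        if phrase in lowered
    ]

def human_review(flag: Flag, confirmed: bool, imminent: bool) -> Severity:
    """Second pass: a human reviewer confirms or dismisses the machine flag."""
    if not confirmed:
        return Severity.NONE  # false alarm, e.g. a fiction assignment
    return Severity.IMMINENT if imminent else Severity.REVIEW

# Example: a short story with violent imagery trips the scanner,
# then is dismissed on human review as fiction.
for flag in scan_document("In my story, the villain threatens to hurt them all."):
    outcome = human_review(flag, confirmed=False, imminent=False)
    print(flag.category, repr(flag.excerpt), "->", outcome.name)
```

As the sketch suggests, the automated pass is deliberately over-inclusive; the human review stage is what separates a fiction assignment from a genuine crisis, which is why staffing that stage matters.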
A school counselor in Vancouver, who requested anonymity, receives several Gaggle alerts per month, leading to immediate parental contact in approximately half of the cases. “A lot of times, families don’t know. We open that door for that help,” the counselor said. However, some students have reportedly found ways to circumvent the system once they learn their activity is being monitored.
Seattle Times and AP reporters were able to view the type of writing that triggers Gaggle alerts. Following the accidental release of student records, Gaggle updated its security measures: screenshot links now remain open for 72 hours, after which only those logged into a Gaggle account can view them. The company says the links must be accessible without a login during those 72 hours so emergency contacts can respond quickly.
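The 72-hour window amounts to a time-limited, unauthenticated link. A minimal sketch of that policy in Python; the function name, the stored timestamp, and the login check are all assumptions for illustration, not Gaggle’s actual implementation:

```python
from datetime import datetime, timedelta, timezone

LINK_TTL = timedelta(hours=72)  # the window described in Gaggle's updated policy

def can_view_screenshot(created_at: datetime, logged_in: bool) -> bool:
    """Hypothetical access check: anyone with the link may view during the
    first 72 hours (so emergency contacts can respond quickly); after that,
    a Gaggle account login is required."""
    within_window = datetime.now(timezone.utc) - created_at < LINK_TTL
    return within_window or logged_in

# A day-old link is open to emergency contacts without a login;
# a week-old link is not, unless the viewer is logged in.
now = datetime.now(timezone.utc)
print(can_view_screenshot(now - timedelta(days=1), logged_in=False))  # True
print(can_view_screenshot(now - timedelta(days=7), logged_in=False))  # False
print(can_view_screenshot(now - timedelta(days=7), logged_in=True))   # True
```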
In Vancouver, the technology flagged over 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, the system also triggered false alarms. Bryn, a sophomore at Vancouver School of Arts and Academics, was called into the principal’s office after the AI flagged a short story she had written that contained mildly violent imagery. School officials maintain that even less severe alerts and false alarms are worthwhile because they ensure potential issues are addressed promptly.
School counselors use the alerts as an opportunity to engage with students. Between October 2023 and October 2024, nearly 2,200 students, or about 10% of the district’s enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, about 1 in 4 students triggered a Gaggle alert.
Uncertain Long-Term Effects
While schools are increasingly using surveillance technology, its long-term impact on student safety and well-being remains unclear. There’s no definitive research demonstrating that it lowers student suicide rates or reduces violence. A 2023 RAND study found only “scant evidence” of either benefits or risks from AI surveillance. “If you don’t have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,” said report co-author Benjamin Boudreaux, an AI ethics researcher.
LGBTQ+ students are especially vulnerable to the potential negative consequences of surveillance.
Risks to LGBTQ+ Students
Screenshots released by the Vancouver schools potentially outed at least six students. LGBTQ+ students already experience significantly higher rates of depression and suicidal thoughts, and may turn to the internet for support. “We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,” said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.
In North Carolina’s Durham Public Schools, a pilot program of Gaggle led to concerns within the community. An LGBTQ+ advocate reported that a Gaggle alert had resulted in a student being outed to their unsupportive family. Glenn Thompson, a graduate of Durham School of the Arts, expressed concerns about the lack of transparency at a school board meeting. The Durham Board of Education voted to stop using Gaggle in 2023, fearing the risk of outing students and eroding trust.
Parental Awareness and Concerns
The debate over student privacy and safety is often opaque to parents, who may not be aware of ongoing surveillance practices. Even when they are informed, parents may not be able to opt their children out of these programs. Tim Reiland, a parent of two teenagers in Owasso Public Schools in Oklahoma, didn’t realize his children were being monitored by Gaggle until he asked if his daughter could use her personal laptop. After his daughter, Zoe, discovered the surveillance, she felt uncomfortable using her Chromebook. “I was too scared to be curious,” she said.
School officials believe the technology has saved lives, but they admit it is not sufficient on its own. Nex Benedict, a nonbinary teenager at Owasso High School, died by suicide after being bullied. A U.S. Department of Education Office for Civil Rights investigation found the district responded with “deliberate indifference” to reports of sexual harassment, mainly in the form of homophobic bullying.
The Need for Balance
Technological surveillance provides a tool, but it is not a complete solution, and its long-term effects are unknown. Middle school students in the Seattle-area Highline School District used Gaggle to communicate their needs to staff. Yet research indicates that teens need private online spaces to explore their thoughts. “The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,” Boudreaux said.
Gaggle’s Patterson believes school-issued devices are not the space for self-exploration if that exploration leads in a dangerous direction. He said, “If you’re looking for that open free expression, it really can’t happen on the school system’s computers.”