AI Surveillance in Schools: A Double-Edged Sword
Across the United States, school districts are increasingly turning to artificial intelligence (AI) to monitor students’ online activities. The intention is to safeguard children from potential threats and address a growing mental health crisis. This technology, however, raises serious concerns about privacy and can have unintended consequences, as recently demonstrated in Vancouver, Washington.
Vancouver Public Schools, like many others nationwide, employs surveillance software to monitor school-issued devices 24/7. These systems, powered by AI, scan students’ online communications for any signs of danger, such as suicidal ideation or threats of violence.
The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools.
The Vancouver Case: A Privacy Breach
In a troubling incident, reporters from The Seattle Times and the Associated Press inadvertently gained access to almost 3,500 sensitive, unredacted student documents after filing a public records request about the district’s surveillance technology. The documents revealed intimate details of students’ lives, including discussions of depression, heartbreak, suicide, and questions about their identities. Poems, college essays, and even excerpts from role-play sessions with AI chatbots were accessible to anyone with the link, a serious breach of student privacy and data security.
The monitoring tools used by Vancouver schools are designed to alert staff to potential threats and offer support to at-risk students. Gaggle Safety Management, the company that developed the software, believes not monitoring children is like letting them loose on “a digital playground without fences or recess monitors,” according to CEO and founder Jeff Patterson. Gaggle’s software is used in approximately 1,500 school districts across the country, tracking the online activity of roughly 6 million students.
Vancouver school officials apologized for the release of the documents but maintain that the AI tools are necessary to protect student well-being, including by monitoring devices outside of class hours. “I don’t think we could ever put a price on protecting students,” said Andy Meyer, principal of Vancouver’s Skyview High School. “Anytime we learn of something like that and we can intervene, we feel that is very positive.”
How Surveillance Works
Gaggle uses a machine-learning algorithm to scan students’ searches and written content on school-issued devices. When the algorithm detects potential indicators of problems such as bullying, self-harm, or suicidal thoughts, it captures screenshots for human review. If a reviewer confirms a serious concern, Gaggle alerts the school; in urgent situations it contacts school officials directly and, when deemed necessary, law enforcement.
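For readers curious about the mechanics, the sketch below illustrates the general scan, human review, and escalation pattern such systems follow. It is a hypothetical illustration in Python, not Gaggle’s code: the function names, threshold, and keyword scorer are all assumptions, with the keyword scorer standing in for the proprietary machine-learning model.

```python
# Hypothetical sketch of a scan -> human review -> escalation pipeline.
# None of these names come from Gaggle; its classifier is proprietary.
from dataclasses import dataclass
from typing import Optional

ALERT_THRESHOLD = 0.8  # assumed cutoff for sending a flag to human review

@dataclass
class Flag:
    student_id: str
    excerpt: str   # snippet captured for the reviewer, akin to a screenshot
    score: float

def risk_score(text: str) -> float:
    """Stand-in for the ML model that scores signs of self-harm or violence."""
    keywords = {"suicide": 0.9, "hurt myself": 0.85, "bullying": 0.6}
    lowered = text.lower()
    return max((w for k, w in keywords.items() if k in lowered), default=0.0)

def scan_document(student_id: str, text: str) -> Optional[Flag]:
    """Scan one document; flag it for human review if it crosses the threshold."""
    score = risk_score(text)
    if score >= ALERT_THRESHOLD:
        return Flag(student_id, text[:200], score)
    return None

def escalate(flag: Flag, reviewer_confirms: bool) -> str:
    """After human review, notify the school or dismiss the false alarm."""
    if reviewer_confirms:
        return f"Alert school: student {flag.student_id} (score {flag.score:.2f})"
    return "Dismissed as a false positive"

if __name__ == "__main__":
    flag = scan_document("student-42", "I want to hurt myself")
    if flag:
        print(escalate(flag, reviewer_confirms=True))
```

Even in this simplified form, the design makes clear why false positives reach humans at all: the automated scorer is deliberately tuned toward over-flagging, leaving reviewers to sort real danger from fiction or casual talk.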
One Vancouver school counselor, who requested anonymity, reported receiving three or four alerts each month. In roughly half of those cases, the district contacts parents immediately, a crucial first step toward getting students help. The counselor said the software often helps prevent tragedies, especially suicides.
False Alarms and Unintended Consequences
Despite its intended purpose, the monitoring technology frequently generates false alarms, triggered by anything from a student’s fictional story containing mild violence to casual online conversations. Parents and students have complained that frequent alerts and the interventions that follow are sometimes unwarranted or excessive, and can erode students’ trust.
Bryn, a sophomore at the Vancouver School of Arts and Academics, was called into the principal’s office after writing a short story that was flagged by the system. Regarding the constant monitoring, she commented, “I’m glad they’re being safe about it, but I also think it can be a bit much.”
In Vancouver, nearly 2,200 students were the subject of a Gaggle alert between October 2023 and October 2024. At the Vancouver School of Arts and Academics alone, one in four students had communications that triggered an alert, a rate that has kept the technology’s use and effectiveness a subject of debate.
Impact on LGBTQ+ Students
A significant concern is the potential for surveillance to expose LGBTQ+ students. Screenshots in the released Vancouver documents showed instances where students could have been “outed” to school officials, causing significant distress. LGBTQ+ students are already at higher risk of depression and suicidal thoughts than their peers.
Experts like Katy Pearce, a University of Washington professor, note that LGBTQ+ youth often rely on the internet as a critical source of support. Cases like one in Durham Public Schools, where a self-harm alert led to a student being outed, show how surveillance can inadvertently harm the very students it is meant to protect.
The Parent Perspective
Parents often remain unaware of these surveillance systems. Tim Reiland, a parent in Oklahoma, discovered his children were being tracked only because his daughter asked to use her personal laptop and the school refused, citing the need for digital monitoring. His daughter, Zoe, said the constant oversight left her feeling inhibited and curtailed her curiosity: “I was too scared to be curious.”
Long-Term Effects and the Need for Balance
Even proponents acknowledge that the long-term effects of surveillance technology on student safety and well-being are not well understood. Studies so far offer little evidence of either benefit or harm from constant AI-driven supervision.
There is no conclusive evidence that the technology reduces suicide or violence. Researchers also caution that digital tools should not replace adequate mental health counseling or safe spaces where students can explore their thoughts and feelings.
As the debate over AI surveillance continues, it is increasingly clear that schools must strike a balance between student safety and the protection of student privacy and autonomy. Finding that balance requires careful consideration of the benefits, the potential risks, and the needs of all students, especially the most vulnerable.