MONTPELIER – With educators on heightened alert to prevent the next school shooting, and under increasing pressure to address cyber-bullying, self-harm, and teen suicide, schools are turning to a new tool for help: artificial intelligence.

Students spend much of their lives online. Now, carefully calibrated algorithms can patrol the hallways of the internet to alert school officials when they might need to intervene. At least that’s the pitch from a burgeoning industry.

“It takes a village (and their bots),” reads one company’s tagline.

But privacy advocates say these technologies risk getting students in trouble for benign activity. And some experts wonder whether AI will help or hamper efforts to intervene when necessary.

“Surveillance almost always begets more surveillance. It’s never enough, right?” said Amy Collier, associate provost for digital learning at Middlebury College. “And if we haven’t asked hard questions about data and privacy at the outset, we’re going to just keep doing more and more to detract from people’s freedom and privacy rights.”

VTDigger sent a public records request to all 52 superintendents in Vermont to ask for any contracts signed by their districts for social media monitoring services. The majority said they didn’t have agreements.

Five said they had current or prior contracts with Social Sentinel, a Burlington-based firm that scans public social media posts within a certain geographic area and sends alerts to school officials when keyword-based algorithms detect signs of trouble.

Another eight said that while they didn’t scan social media specifically, they did contract with vendors – including Securly, Bark, or Lightspeed Systems – to monitor activity on district devices and school-sponsored email. (For an added fee, Social Sentinel will also scan student emails.)

These technologies send alerts to school officials when algorithms flag browsing habits, chats, or emails that indicate a student is in distress or could hurt others.

One popular company, GoGuardian, whose “Admin” product is used by the middle schools in the Burlington school district, allows school officials to keep granular tabs on what students search, watch and read when they’re on district devices.

In an online demo for the product, a GoGuardian representative explains how school officials can query the program for detailed profiles on the habits of every student. One tab is labeled “Most flagged students.”

“Experience shows that expanded police presence in schools and online surveillance of students does real harm, undermining student privacy and resulting in rights violations that disproportionately impact students of color,” James Duff Lyall, the executive director of the American Civil Liberties Union of Vermont, said in a statement.

Lyall isn’t the only one to raise concerns about who is targeted by surveillance technology. Research out this summer found Google’s hate-speech detecting technology was more likely to be triggered by posts from African Americans.

“The algorithms of these technologies are hard-coded with biases,” Collier said.

Carolyn Stone, a professor at the University of North Florida and chair of the American School Counselor Association’s ethics committee, has written that counselors should advocate against monitoring software, unless alerts are routed directly to parents or guardians.

Putting educators in the middle will lead to an “unneeded and unwarranted liability” for schools, Stone argues. False positives from the alerts could overwhelm counselors and distract from more pressing work. And assigning mental health staff to follow up when a student’s searches or emails turn up red flags could unwittingly put them in a disciplinarian’s role and impair their ability to get students to open up.

“When the school counselor is the first line of communication with them about online activity that might have stemmed from an innocuous research paper, students will look at the trusting relationship with a jaundiced eye,” she wrote in 2018.

In an interview, Social Sentinel founder Gary Margolis bristled at questions about privacy.

“We built a technology that actually helps prevent bad things from happening. By giving information that can give context to what’s going on, in a way that respects privacy, and all I do is get questioned by you and folks in the media about privacy issues,” Margolis said. “It’s mind-bogglingly frustrating.”

“You either want to save a kid’s life or you don’t want to save a kid’s life,” he said. (Margolis later called a reporter back to apologize.)

Social Sentinel officials insist that their product stands apart — while, like its competitors, it sends alerts to school officials based on triggers detected by its algorithms, it doesn’t profile students.

“We’re not surveilling, we’re not monitoring. We’re not following,” Margolis said. “Monitoring is when I’m paying attention to a specific individual. Your communications. You.”

Amelia Vance, the director of education privacy at the Future of Privacy Forum, said the most frequent complaint she hears from school districts about social media monitoring is that it floods officials with useless information to wade through.

“You can’t be privacy protective and good at social media monitoring and identifying, you know, potential threats,” she said.

In Hyde Park, Lamoille Union High administrators contracted with Social Sentinel for a year in 2015. But Brian Schaffer, the school’s principal, said the daily alerts he received consisted mostly of irrelevant posts – including from Quebec tourists bragging about the packs of Heady Topper they’d bought on trips to Vermont.

“It wasn’t as functional as I had hoped it would be,” he said.

He remembered one instance in which the software’s net had caught evidence of a student from a neighboring district in apparent distress and at risk of self-harm. Schaffer reached out to that student’s school, but they were already aware of the situation.

Margolis said Social Sentinel’s algorithm is continually improving, and has already dramatically reduced the number of false positives sent to school officials.

“Early fire detection systems used to go off all the time. Technology gets better,” he said.

In the Slate Valley Unified School District, school officials signed a three-year contract with Social Sentinel in January. Superintendent Brooke Olsen-Farrell said the district took on the service as part of a larger package of security reforms, totaling nearly $1 million, put in place in the two years since a former student’s shooting plot was foiled at Fair Haven Union High.

“It’s one more tool in our toolbox,” she said.

The service sends her an alert about once a month. None have included any “actionable” information so far, Olsen-Farrell said. The district has also long used Securly to monitor student activity on district devices and accounts. The service not only blocks certain content but also scans student emails and Google Docs for language that suggests self-harm or bullying.

That service has been useful, Olsen-Farrell said, at finding students in distress, although in several instances school officials already knew the child needed help.

A task force created by Gov. Phil Scott after the Fair Haven incident to help prevent school shootings recommended this spring that the state invest in monitoring software to scan social media posts statewide. Margolis testified before the committee.

Among the task force’s members was Rob Evans, a school security consultant who works for Margolis Healy, a firm co-founded by Gary Margolis. Margolis has stepped away from the firm, but Steven Healy, its current CEO, also sits on the board of directors at Social Sentinel.

Asked about the apparent conflict, Evans referred comment to the task force’s chair, Deputy Mental Health Commissioner Mourning Fox, and co-chair, Daniel Barkhuff, an emergency physician at the University of Vermont Medical Center. Barkhuff said Evans played no role in bringing Social Sentinel to the panel’s attention.

“In hindsight, I totally understand how the optics look weird,” he said. “But that honestly is not what happened. It was me who brought it up to the task force.”

And while Evans didn’t recuse himself from conversations about social media monitoring, Barkhuff said it wouldn’t have swung the panel’s recommendations one way or another. The recommendation, he said, “was pretty unanimous.”

For his part, Barkhuff thinks the software would be a good idea. Many mass shooters, including the teen behind the Parkland, Florida, massacre, take to the internet before they act to post disturbing clues about their intentions. But while Barkhuff said he thinks things written on public forums should be considered “fair game,” he also understands many disagree.

“There’s a legitimate conversation about civil liberties which we think policy makers should have,” he said.

It’s unlikely social media monitoring on a statewide scale will be picked up as a strategy by the Scott administration. A spokesperson for his office said the governor was still deciding what policies to bring forward to the Legislature next session, but added that Scott felt some discomfort with the idea.

“The Governor shares some of the concerns that have been raised about social media monitoring software and privacy considerations,” Rebecca Kelley, Scott’s communications director, wrote in an email.

And the Democrat-controlled Legislature is even less likely to push the concept.

Sen. Phil Baruth, D/P-Chittenden, chair of the Senate Education Committee, said online monitoring technology raises a slew of ethical questions for school districts. Do parents and students know that kids are being so closely watched? And are there equity concerns to consider when poor students must rely on district devices for all of their computing needs?

“If you follow that logic out, it seems more likely to turn up problems with low-income students. And that’s only one of a number of complications that I see with this kind of technology,” Baruth said.

The lawmaker said he believed these debates should mostly be left to local communities, not legislated from Montpelier. But he added that he worried these services were being sold to schools and politicians as “duty-free work-arounds” to sidestep questions of gun control.

“Every time a gun safety bill surfaces, we can say ‘we took care of that.’ We’ve got a cheaper, easier, more universal solution. And I just don’t think that’s it at all. I think what it will produce are false positives and an over-abundance of surveillance of young people to no good end,” he said.

Advocates say lawmakers ought to protect student data privacy.

“I definitely think that would make sense for Vermont to join many other states that have sort of baseline privacy protections,” Vance said.