Lightspeed’s response to AP’s coverage of school safety tech
We appreciate the Associated Press for shining a light on the rising use of safety monitoring technology in schools. These are complex, emotional topics and they deserve open, honest discussion.
At Lightspeed, we believe student safety and student privacy are not at odds. In fact, both must be protected with equal vigilance.
This isn’t just a product line for us; it’s a mission. And it’s one we take seriously.
Our Lightspeed Alert™ solution is used by more than 1,700 schools and districts, protecting over 6 million students across 21 countries. Every day, this technology helps school communities identify signs of crisis in students (such as self-harm, violence, bullying, or abuse) and intervene before tragedy strikes.
We’re proud of the role our technology plays in saving lives. But we’re just as proud of how it does so with transparency, humanity, and respect for the students and families we serve.
How Lightspeed Alert Works
Lightspeed Alert is not a mass surveillance tool. It’s a student safety solution built only for school-managed devices and accounts. We do not monitor personal devices, nor do we monitor everything a student does. Our AI looks at context, not just keywords, which helps minimize false alerts and unnecessary interactions. It flags activity tied to specific safety signals: searches, documents, messages, or posts related to harm, weapons, abuse, or suicide risk. That’s it.
Districts choose how they want to handle alerts. Some have their own staff (counselors, safety officers, mental health leads) review alerts directly. Others rely on Lightspeed’s Human Review team, which includes professionals with backgrounds in school safety, mental health, and law enforcement.
From January through July 2025, our Human Review team evaluated over 582,000 incidents, flagging 51,000 as high-risk and 2,555 as imminent. Only 0.09%, fewer than 1 in 1,000, resulted in Lightspeed contacting law enforcement directly, and only after multiple failed attempts to reach the district.

In other words: we don’t jump to law enforcement. We support school teams in preventing escalation and intervening with care, not punishment. But when a threat is imminent and students or staff are at risk, law enforcement is an important part of preventing tragedy.
Why This Work Matters
Today’s students are growing up in a pressure cooker. According to the CDC, 42% of high school students feel persistently sad or hopeless, and 22% have seriously considered suicide. Yet many of those students never speak up, especially those at highest risk.
In this environment, schools are being asked to do more with less. Teachers aren’t mental health professionals. Counselors are stretched thin. And safety teams can’t be everywhere at once.
Technology, used responsibly, can help.
With Lightspeed Alert, schools can spot the signs that too often go unseen: a journal entry that hints at suicide, a draft message lashing out in pain, or a search that indicates fear or abuse. This isn’t about catching kids doing something wrong. It’s about reaching them when they’re most in need.
As Michele Gay, founder of Safe & Sound Schools and mother of Josephine Gay, who was killed at Sandy Hook, puts it:
“While the digital world can pose risks, it can also offer opportunities for early identification. We need to meet students where they are and leverage technology in ways that proactively support their safety and wellbeing.”
Clearing Up the Confusion: What Counts as a “False Positive”?
There’s an important distinction that often gets lost in conversations about student safety monitoring: a false positive isn’t the same as a joke, a passing comment, or a situation that doesn’t lead to discipline.
When people talk about “false positives,” they often mean alerts that didn’t turn out to be a real threat. But that doesn’t mean they weren’t worth flagging and reviewing.
Sometimes a student is joking about violence or self-harm. Sometimes they’re referencing a song lyric, venting frustration, or using language that raises concern but isn’t necessarily dangerous. That’s not a system failure. That’s a system doing its job.
Lightspeed Alert’s AI is trained to be highly precise, surfacing only content that matches known safety signals. Our AI’s false-positive rate is low and continuously falling, thanks to ongoing improvements and real-world feedback from thousands of districts.
Then comes the human layer.
When Human Review is enabled, trained Lightspeed Safety Specialists evaluate each alert in real time. They’re not just checking for keyword matches—they’re looking at full context: the surrounding conversation, the student’s previous activity, even tone and intent where possible. This layer ensures that students aren’t penalized for venting, joking, or just being teenagers, while still making sure potential red flags get the attention they deserve.
Educators tell us that even “non-critical” alerts often lead to meaningful conversations: about mental health, peer pressure, stress, or bullying. And that’s the goal. Support before punishment, context before escalation.
The system isn’t perfect—no system is. But Lightspeed Alert is built to be as accurate, responsible, and human-centered as possible.
Not Surveillance, but Support
We understand the concerns raised about over-monitoring. They’re valid, and they’re why we built our system with district control, clear policy customization, and rigorous privacy compliance (COPPA, FERPA, state laws, and more).
In fact, monitoring student online activity on school-issued devices is not optional. It’s required by CIPA, the federal Children’s Internet Protection Act. But how that monitoring is handled makes all the difference. And that’s where Lightspeed stands apart.
We don’t mine data for profit. We don’t sell student information. We don’t overreach. And we don’t treat safety like a checkbox. We treat it like the sacred responsibility it is.
The Human Impact
Behind every alert is a child. And behind every saved life is a team of educators, counselors, and parents who were given the insight they needed just in time.
Don’t just take our word for it:
“I can tell you beyond a shadow of a doubt that we have saved students.” — Christi Frias, San Marcos USD
“We’ve been able to identify students that really needed help but weren’t asking for it.” — John Gonzalez, Hays CISD
“Our students are well aware of [Lightspeed Alert], and they know we’re here to support their needs.” — Judy Burgess, Visalia USD
“Lightspeed Alert stopped several suicide incidents. One student had 3 previous attempts. Great support, always works, and little effort to keep kids safe.” — Dave Jenkins, Director of Operations, EHOVE Career Center
“I have used Lightspeed Alert to successfully identify and get help for three students, in two years’ time, that were considering suicide.” — John Cannon, School Resource Officer, Morgan County School District R1
“We have been able to identify searches for self-harm that allowed us to get the information to the building administration and contact parents. In at least one case it led to parents being able to get some help for a student that perhaps prevented a tragic situation.” — John Sedwick, Technology Integration Specialist, Anderson Community School District
Looking Ahead
This work isn’t easy. But it is essential. And we’re committed to doing it better every day—because the stakes are too high not to.
We’ll continue working closely with our district partners, mental health experts, and privacy advocates to ensure that safety technology empowers support, not fear.
Because every student deserves to be seen. Every warning sign deserves attention. And every school deserves tools they can trust.
If you’d like to learn more about how Lightspeed Alert supports student wellbeing while respecting privacy, reach out to our team.