Protecting students online means blocking inappropriate content as well as deterring threats like cyberbullying, self-harm, radicalization, and violence.
The Children’s Internet Protection Act (CIPA) is a federal law passed by Congress in 2000 that requires schools and libraries receiving certain federal funding to use an internet filter to protect students from obscene content.
You need to know how devices and resources are being used, when students are at risk, and how healthy your 1:1 program is — and you need details to drill into when issues arise.
Sometimes students need help staying on task and productive, especially with the allure of the distracting internet right in front of them. Blocking non-educational content can help.
In addition to CIPA, acceptable use policies, privacy requirements, state and local laws, and other regulations can require filtering to be in place to protect students.
A filter can block access to sites that host and distribute malware, spyware, and viruses — otherwise your students are bound to stumble upon them.
What are the most important things when it comes to filtering?
Over the years, filtering has become more challenging for schools as they strike the balance between safety and appropriate access on devices that go everywhere with students. In addition, no two schools (or students!) are the same, so filtering policies vary across the board.
With an increasingly encrypted web, getting the information schools need to protect students is trickier. Search term reports, selective access to Google services, YouTube controls — schools need SSL decryption for all of them. The traditional solution of a trusted man-in-the-middle proxy is hard and expensive: more hardware, PAC files, trust certificate management. Sadly, over 60% of K-12 IT personnel say they aren’t decrypting SSL at all. (Fortunately, Lightspeed Systems® offers a better way with patented smart agents!)
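For readers unfamiliar with the PAC files mentioned above: a PAC (proxy auto-configuration) file is a small JavaScript function that tells each browser which traffic to route through the decrypting proxy. A minimal sketch follows; the proxy hostname, port, and bypass rules are hypothetical placeholders, and real PAC files often use helper functions like `dnsDomainIs` instead of plain string checks.

```javascript
// Minimal PAC (proxy auto-config) sketch. The proxy host/port and
// bypass list below are invented placeholders, not a real deployment.
function FindProxyForURL(url, host) {
  // Bypass the decrypting proxy for internal resources.
  if (host === "intranet.district.example" || host.endsWith(".local")) {
    return "DIRECT";
  }
  // Send everything else through the filtering proxy.
  return "PROXY filter-proxy.district.example:8080";
}
```

Every one of these files has to be distributed and kept in sync across devices — part of why the proxy approach is costly to operate at scale.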
Blocking access to pornographic and obscene content is the most fundamental thing a filter needs to do — and for good reason. Children who access pornography have an increased risk of engaging in high-risk sex; sex addiction; perpetrating or suffering sexual violence; anxiety and depression; and many more harmful consequences. 74% of K-12 IT personnel have identified students attempting to access pornographic content. Yet some filters block less than half of the pornography on the web! (Don’t worry, Lightspeed Filter effectively blocks pornography even when it’s in other languages or multimedia.)
The web is not static, and a lot of sites are neither good nor bad — their content determines that from moment to moment. Google Docs, email, social media: none of these can be categorized by domain alone. Instead, they need to be analyzed in real time as students work, type, and communicate. Reports and real-time alerts are essential to protect against cyberbullying, suicide, and inappropriate content, but solutions need to provide safety without the false alerts that bog down administrators. (Student safety monitoring uses advanced real-time AI and alerts to keep appropriate school personnel informed of concerning activity and proactively protect students.)
YouTube & Social Media
For most schools, simply blocking this content today impedes learning opportunities. Yet 40% of schools allow YouTube access for everyone, which is problematic due to inappropriate, disturbing, and distracting content that can cause students to lose focus. Schools need ways to provide differentiated access and read- or view-only modes — with automated access controls able to block individual videos or restrict specific pages. (Lightspeed Filter offers advanced YouTube and social media controls to provide just the access you need.)
Technology isn’t confined to the server room or the classroom; student devices go everywhere. Schools need filtering that works seamlessly on campus or off, anywhere, at any time of day. Separate filtering solutions for when devices are on campus versus off add to the complexities of student safety and effective filtering. (Lightspeed Filter Smart Agents sit on devices and work wherever students take their devices.)