Scotland Has a New AI Strategy and Safeguarding Is at the Heart of It  



Scotland’s AI Strategy 2026–2031 landed quietly in early 2026, but its implications for schools, local authorities, and technology providers working in education are anything but quiet. This is a landmark document, one that signals a fundamental shift in how the Scottish Government expects AI to be adopted, governed, and held to account across public services. 

For those of us working at the intersection of safeguarding and digital transformation, it reads like a call to action. 

Scotland has published AI strategies before, but this one is different. Where previous frameworks focused primarily on economic opportunity and innovation, the 2026–2031 strategy centres something far more important: trust. And specifically, trust in relation to the most vulnerable members of our society: children. 

The strategy is anchored to Scotland’s incorporation of the UN Convention on the Rights of the Child (UNCRC) into domestic law, making Scotland the first devolved nation in the world to do so. This isn’t symbolic. It has direct policy implications for every AI system deployed in a school, a local authority, or any public service that touches children’s lives. 

This isn’t just guidance. It’s the policy backdrop against which procurement decisions, governance frameworks, and institutional strategies will increasingly be judged. 

Across five strategic priorities (public services, business and the economy, research and innovation, public engagement, and workforce skills), the strategy weaves a consistent thread: responsible AI is non-negotiable. 

A few commitments stand out as particularly significant for education and safeguarding: 

A public Scottish AI Register. The government will publish a register of all AI systems used in the Scottish public sector, disclosing what data is used, how it is used, and who is accountable. This level of transparency sets a new benchmark. Schools and local authorities will be expected to know what AI systems they are running and to be able to account for them. 

A ‘chain of trust’ model for AI governance. The strategy explicitly requires AI to be explainable, subject to human scrutiny, and governed through visible accountability structures. Automated decisions must be open to challenge. This is not just good practice; it will become expected. 

Ten early actions to be delivered before March 2027. The clock is already ticking. Among these is the implementation of a rigorous, trusted framework for safe and ethical AI use in health and social care, with education firmly in scope. Public sector bodies have less than a year to begin demonstrating alignment. 

A nationwide public engagement programme. Recognising that public trust in AI is fragile, the government is launching a listening exercise to understand concerns and build confidence. Schools and local authorities will be part of that conversation, and they will need to show they are deploying AI that communities can trust. 

For schools and local authorities in Scotland, this strategy is not a distant policy aspiration. It is an imminent practical reality. 

Governance and accountability are coming into sharp focus. The expectation that public bodies can identify, explain, and justify every AI system they use is a significant ask. Many schools currently have limited visibility into the AI embedded in their platforms, from content filters and monitoring tools to management information systems. That will need to change. Leaders will need to conduct AI audits, review data governance frameworks, and ensure they can answer basic questions: What AI are we using? What data does it touch? Who is accountable when something goes wrong? 

Safeguarding is a compliance issue, not just a values issue. With the UNCRC embedded in Scots law and UNICEF’s AI guidance adopted at the national level, failing to protect children from online harm, or failing to have adequate AI-powered safeguarding in place, carries legal and reputational weight. Managers, headteachers, and directors of education need to be thinking now about whether their current tools and processes meet this emerging standard. 

Procurement decisions will carry greater scrutiny. When a school or local authority buys an AI-powered tool, they are now implicitly making a safeguarding decision. Is the system explainable? Is it auditable? Does it protect children’s data? Does it flag risk to a human or make autonomous decisions without oversight? These questions are no longer optional extras in a procurement checklist. They are central. 

Digital transformation must be led by values, not just efficiency. The strategy is clear that AI adoption should be driven by outcomes for people, not by cost savings alone. For education leaders, this means that digital transformation strategies need a safeguarding lens built in from the start, not bolted on afterwards.  

At Lightspeed, we have been building safeguarding and digital wellbeing tools for schools for many years. Scotland’s AI Strategy 2026–2031 doesn’t change what we do, but it does provide the clearest possible articulation of why it matters, and the most compelling mandate we’ve seen for schools and local authorities to act. 

Across our product suite, every tool we offer is built on the same foundational principle: responsible, transparent, child-centred technology that puts people, not algorithms, in control. 

Lightspeed Alert is our AI-powered safeguarding tool, and it is the most direct response to what Scotland’s strategy demands. Alert monitors student activity across school devices and instantly flags risk (self-harm, abuse, exploitation, radicalisation) to the Designated Safeguarding Lead. Critically, AI identifies the concern and a human decides what to do next. No automated decisions about children. No black boxes. Every alert, action, and outcome is logged and reviewable, giving schools and local authorities the auditable safeguarding record that the Scottish AI Register will require. When Scotland’s strategy talks about explainability, human oversight, and accountable AI, Alert is what that looks like in practice. 

Scotland’s strategy demands that AI in public services is transparent, evidence-based, and built to support informed decision-making. Lightspeed Signal delivers exactly that for school IT teams and local authority leaders. Signal is a real-time monitoring platform that tracks device health, internet connectivity, and application performance across every school device, on campus and at home. It detects security risks such as unauthorised users, VPN and proxy usage, and flags students with poor home connectivity so that digital equity gaps can be identified and addressed with precision. When a critical application goes down, Signal alerts the right people immediately, before teachers and students are left waiting. For Scottish local authorities working to demonstrate responsible technology governance and ensure that every young person can participate fully in an AI-enabled world, Signal provides the operational visibility to make that a reality rather than an aspiration. 

The strategy calls for AI literacy and leadership capacity across the public sector, and for public bodies to be able to account for how data is used and what decisions are made. Lightspeed Insight gives school and local authority leaders a clear, evidence-based view of how students are engaging with devices and platforms: what they’re accessing, when, and how. This isn’t surveillance; it’s the kind of transparent, purposeful data use that supports informed decision-making, demonstrates responsible governance, and helps leaders identify where digital risks are concentrated. In the context of the Scottish AI Register and growing accountability expectations, Insight gives schools the evidence base they need. 

Underpinning everything is Lightspeed Filter, our AI-powered web filtering solution that prevents students from accessing harmful, inappropriate, or dangerous content in the first place. Filter isn’t just a blocklist; it uses intelligent, real-time categorisation to keep pace with the rapidly evolving online environment. For Scottish schools operating under the UNCRC’s legal framework, having a robust, auditable, and responsive filtering solution is a baseline safeguarding requirement, not an optional add-on. Filter ensures that schools can demonstrate active, proactive protection of children’s digital environment, with the governance transparency the strategy demands. 

What makes Lightspeed’s offering particularly powerful in the context of Scotland’s AI Strategy is that these tools work together. Filter creates a safer digital environment. Insight gives leaders visibility. Signal identifies vulnerability early. Alert triggers human action when a child is at risk. Together, they form a coherent, auditable, child-centred safeguarding ecosystem, exactly the kind of responsible AI infrastructure Scotland is asking public sector bodies to build. 

Scotland’s AI Strategy 2026–2031 is part of a broader shift across the UK and beyond. Governments are moving from aspiration to accountability when it comes to AI. The days of deploying AI tools without scrutiny, particularly in settings involving children, are coming to an end. 

For education leaders in Scotland, that shift creates both challenge and opportunity. The challenge is real: governance frameworks need to be built, AI audits need to be conducted, and staff need to understand what responsible AI use looks like in practice. But the opportunity is equally real: schools and local authorities that get ahead of this now will be better placed to navigate the regulatory landscape, retain community trust, and, most importantly, keep children safer. 

At Lightspeed, we want to be part of that journey. If you are a school or local authority in Scotland thinking about what AI governance and safeguarding look like in the context of this new national strategy, we would love to have that conversation. 

Because when it comes to protecting children in the digital world, Scotland has just made its expectations very clear. And we think that’s exactly right. 

If you have any questions about how Lightspeed can help prepare your school or local authority for the AI classroom, please get in touch with a member of our team here.