3 Key Takeaways
- AI adoption is not linear—meet people where they are. Teachers, students, and parents will all move at different speeds. That’s normal—and it needs to be supported, not forced.
- Clarity beats complexity early on. A simple stance and basic guidance can prevent far more issues than waiting for a perfect policy.
- Students are already using AI—education must lead the how. Blocking doesn’t stop usage. Teaching responsible, ethical use is what actually prepares students for the future.
If there’s one thing I’ve learned over the past couple of years, it’s this: AI didn’t arrive in education gradually—it landed all at once.
One day districts were evaluating edtech tools on a normal cadence. The next, they were fielding questions from teachers, parents, students, and boards—often without the luxury of having all the answers.
That’s exactly why we partnered with EdWeek to host this conversation. Huge thanks to their team for bringing this together, and to the district leaders—Ashley Jones (Temple ISD), Katie Berry, and Amber Robinson (Northwest ISD)—for sharing what’s really happening on the ground. This wasn’t theory. This was real work, happening in real districts, right now.
And if you joined us, you probably saw yourself somewhere in the journey.
It Started Fast—and a Little Chaotic
One of the most important themes that came up—again and again—is that there is no single “stage” of AI adoption inside a district.
You don’t have one journey. You have many.
- Teachers experimenting… and others opting out
- Students using AI daily… even when it’s blocked
- Parents curious, concerned, or both
- Leaders trying to balance innovation with safety
Amber said something that stuck with me: you can’t change people’s minds until you change their experiences.
That’s why forced adoption doesn’t work here.
This has to be an invitation, not a mandate.
Katie put it well: when tools like ChatGPT reached massive adoption in months—not years—it flipped the traditional model on its head. Districts were expected to lead before they’d had time to fully learn themselves.
So what did many districts do first? They paused. Some blocked access. Not out of fear—but to buy time.
Time to think.
Time to communicate.
Time to define what “good” looks like.
And that pause turned out to be critical.
Blocking AI Isn’t the Strategy—Buying Time Is
Let’s talk about the elephant in the room: blocking.
Several districts shared that they initially blocked AI tools. And yes, students still found ways to use them. That’s reality.
But the goal wasn’t control—it was clarity.
Ashley described it as creating space to:
- Build a shared understanding
- Align to district goals
- Develop training before opening access
Temple ISD, for example, took a phased, multi-year approach—starting with staff readiness before rolling out to students.
And that’s the key distinction: blocking isn’t a solution. It’s a temporary strategy to prepare for a better one.
Guidance Matters More Than Tools
If there’s one mistake districts are trying to avoid now, it’s this: letting AI use happen before guidance exists.
Because what happens then?
- Students experiment without guardrails.
- Teachers react after misuse.
- Policies follow problems instead of preventing them.
Katie shared a simple but powerful early move: a short statement clarifying that AI is a learning tool—not a replacement for thinking.
That’s not complicated. But it removes ambiguity.
And ambiguity is where most problems start.
Training Isn’t One-and-Done—It Evolves
Professional development came up as one of the biggest ongoing efforts—and challenges.
What we’re seeing:
- Early adopters want more, faster
- Others want low-pressure entry points
- Everyone wants clarity on “appropriate use”
Temple ISD evolved from optional training → required PD → more structured, use-case-driven learning.
Northwest ISD built living resources—like their AI website—that started small and grew over time.
And that’s the lesson:
- Don’t wait until training is perfect.
- Publish. Then improve.
AI Literacy Is Now Core to Student Readiness
One of the biggest mindset shifts is this: AI is no longer optional for student readiness.
Katie tied this directly to their district mission—preparing students to confidently navigate their future.
And that future includes AI.
So the question isn’t: “Should students use AI?”
It’s: “How do we teach them to use it well?”
That includes:
- Ethical use
- Knowing when not to use it
- Understanding its limitations
- Building real thinking alongside it
Because without guidance, students equate usage with mastery—and that’s a dangerous assumption.
Measurement Changes the Conversation
One of the more practical (and often overlooked) pieces: visibility.
You can’t manage what you can’t see.
Districts are using tools like Lightspeed Classroom and Lightspeed Insight to:
- Identify when AI is being used
- Understand patterns across schools
- Support conversations—not just enforcement
And that last part matters.
This isn’t about “catching” students.
It’s about context.
Because the same behavior might be:
- Productive collaboration in one classroom
- Misuse in another
And only the teacher can determine the difference.
Start Small—but Start
We closed with advice for districts just getting started.
And honestly, it was refreshingly consistent:
- Don’t wait for perfect
- Don’t try to solve everything
- Don’t stay silent
Start with:
- A clear stance
- A simple resource
- One use case
Then build from there.
Because if you don’t provide guidance, your community will create its own.
If there’s one thread through all of this, it’s that no one has it fully figured out—and that’s okay.
What matters is that we’re learning together.
And again, thank you to EdWeek for hosting, and to the district leaders who are doing the hard work every day to get this right.
Q&A
Should districts block AI tools for students?
Blocking can be useful temporarily to buy time for planning, communication, and training. However, it is not a long-term strategy. Students will still access AI outside school, so the focus should shift to guided, responsible use.
How should districts introduce AI to teachers?
Start with optional, low-pressure professional development, then evolve into structured training focused on real classroom use cases. Tie AI usage back to district goals and instructional frameworks.
What’s the biggest risk of not guiding AI usage early?
Students and teachers develop poor habits—such as over-reliance or misuse—before understanding appropriate use. This leads to inconsistent expectations and reactive discipline instead of proactive learning.
How can districts ensure consistent AI use across classrooms?
Develop a clear framework that defines what AI use looks like across grade levels, while still allowing teacher autonomy. Provide shared language and expectations to reduce confusion for students.
How do you monitor AI usage without policing it?
Use visibility tools to provide context (not punishment). Monitoring should support conversations between teachers and students about how AI is being used—not just flag misuse.
What’s the best way to get started with AI in a district?
Start small:
- Publish a basic stance on AI
- Provide one or two example use cases
- Build resources over time
The key is to begin—even without having all the answers.
How should districts involve parents in AI adoption?
Provide accessible resources that explain AI in simple terms and offer practical examples (even non-educational ones). Parents need entry points just like teachers and students.