*Image: AI education in a Denver classroom*

Denver Public Schools Opens the Door to AI in Classrooms — But Teachers Aren't Sure They're Ready

Denver Public Schools has approved student access to Google Gemini, NotebookLM, and MagicSchool on school devices. Teachers say the rollout caught them off guard.

Tags: ai-education, denver-public-schools, google-gemini, notebooklm, magicschool

On April 8, Denver Public Schools sent an email to staff announcing that students could now access a suite of AI tools — including Google Gemini, NotebookLM, and MagicSchool — on school-issued devices, effective immediately.

For a district serving over 90,000 students, the policy shift is significant. It’s one of the largest U.S. school districts to move from restricting AI to actively enabling it, and it reflects a growing recognition among educators that AI is already part of students’ daily lives, whether schools sanction it or not.

But the rollout has revealed a tension that districts across the country will face: the technology arrived faster than the training to support it.

📋 What Changed

Under the new policy, students can access three AI platforms on school devices:

  • Google Gemini — a general-purpose AI assistant with broad capabilities, including writing, research, and problem-solving
  • NotebookLM — Google’s AI research tool that can summarize, analyze, and synthesize documents and sources
  • MagicSchool — an education-focused AI platform designed for classroom use, with teacher controls over what students can access

The district framed the decision as pragmatic rather than ideological. Billy Sayers, DPS’s director of STEM, said the goal was to get students off unapproved AI platforms and onto ones with contractual data protections.

“We wanted to get as many students off unapproved platforms and tools to approved tools that are contractually bound to protect our students’ data. We want to get ahead and pretty much set some pretty firm guardrails of what students are allowed to use and not allowed to use.” — Billy Sayers, DPS Director of STEM

The district is using what Sayers described as a “walled garden” approach: student activity can be monitored through school accounts, with flags for self-harm and visibility into AI inputs and outputs.

🎓 The Teacher Response

Not everyone was ready for the change. Amber Wilson, an English teacher at Thomas Jefferson High School with over two decades of experience, learned about the policy through the same April 8 email — with no prior warning.

“I feel like releasing Gemini — just like, poof. Now the kids have Gemini. That’s a big thing. I wasn’t ready for it, and now it’s there.” — Amber Wilson, English Teacher, Thomas Jefferson High School

Wilson isn’t opposed to AI in the classroom. She’s been using MagicSchool for two years, specifically because it allows teachers to control and limit how students interact with AI. She uses text levelers and summarizers to help students break down complex writing, while controlling how much information the system provides.

“I can program the tools not to give them too much information,” Wilson said. “They have to eventually go out and read the article themselves so they can develop that critical thinking.”

Her concern with Gemini is that it lacks those guardrails. Unlike MagicSchool, which gives teachers fine-grained control over student interactions, Gemini is a general-purpose tool.

“I do worry about, you know, what guardrails have they put on Gemini. If the version that they’ve given the kids is the same one I have on my district computer, there were no guardrails there to keep it from writing a whole essay in two seconds.” — Amber Wilson

🪤 The Cheating Question

Wilson raised a concern that teachers across the country are grappling with: determining what work is genuinely a student’s.

“Teachers are spending hours going through papers trying to decide, is this real? Is this not?” she said.

The district’s response is to encourage a shift in how teachers think about assignments. Sayers described a move away from binary cheating frameworks toward conversations about how students used AI tools.

“Cheating is not as binary as it used to be,” he said. Under the new policy, teachers can designate assignments as no-AI, limited-AI, or full-AI — and assess student work accordingly.

It’s a framework that makes sense in theory. In practice, it requires teachers to fundamentally redesign how they assess learning, and many haven’t been given the time or training to do that.

🏫 Training Gaps

The district has made training available — AI 101 sessions for beginners, AI 201 for more advanced integration, and tool-specific workshops for NotebookLM. A Gemini-specific training is planned for this summer.

But Wilson noted that the training requires teachers to seek it out on their own time. There’s no structured, mandatory program. For a district rolling out a change this significant, that gap matters.

“That worries me, that it’s just going to appear on kids’ computers and people won’t know what to do with it.” — Amber Wilson

🌍 Why This Matters Beyond Denver

Denver is not the first district to embrace AI in classrooms — New York City recently released its own AI “traffic light” framework for classroom use — but it’s among the largest to do so, and the dynamic it reveals is universal.

The core tension is this: AI tools are already in students’ hands. Schools can either pretend they’re not, or they can try to shape how they’re used. Denver chose the second path. But choosing it and being prepared for it are different things.

The district’s “walled garden” approach — monitored accounts, approved platforms, data protections — is a reasonable starting point. The question is whether the garden walls are high enough, and whether the people tending the garden have the tools they need.

Sayers acknowledged the uncertainty: “If it doesn’t lead to improved student outcomes, we’ll be the first ones to say that we’ve got to find a better way.”

That kind of institutional honesty is rare, and it’s worth noting. But so is Wilson’s point: the tools arrived before the training. For teachers across the country watching Denver’s experiment, that’s the detail that matters most.

🔄 The Broader Pattern

What Denver is doing reflects a shift happening across education globally. After two years of bans, restrictions, and panicked op-eds about ChatGPT in classrooms, school systems are moving toward the pragmatic middle: accept that AI exists, try to guide its use, and hope that guided integration produces better outcomes than prohibition.

The evidence so far is thin. There’s very little longitudinal data on whether students who use AI tools in guided settings learn more, learn differently, or simply learn to depend on AI. Denver’s willingness to evaluate outcomes honestly — and to reverse course if the data doesn’t support continued use — is a model that other districts would do well to follow.

But the training gap is real, and it’s not going to close itself. Districts that approve AI tools without investing in teacher preparation are rolling out a curriculum change without a curriculum. The result won’t be failure — it’ll be inconsistency. Some teachers will thrive. Others will struggle. And students will get wildly different experiences depending on which classroom they walk into.

🔍 THE BOTTOM LINE: Denver Public Schools just gave 90,000+ students access to AI tools on school devices — but teachers found out the same day students did. The “walled garden” approach (monitored accounts, approved platforms) is reasonable in theory, but the training to support it is voluntary and self-directed. The result? Some classrooms will integrate AI thoughtfully. Others will wing it. If your kid goes to DPS, ask their teacher which scenario they’re in. And if you work in education elsewhere, Denver just became your cautionary tale and your playbook — mostly the former.

Sources: 9News, Google for Education