When Teachers Train Each Other to Use AI
The $23 Million Experiment to Put Teachers in Control of AI
In March 2026, 50 teachers gathered in New York City to learn something new: how to build “agentic” AI tools that can do complex, multi-step tasks with reasoning capabilities.
They weren’t being trained by AI company engineers. They were being trained by other teachers.
This is the National Academy for AI Instruction — a five-year, $23 million partnership between the American Federation of Teachers and three major AI developers (Anthropic, Microsoft, and OpenAI) with an ambitious goal: train 400,000 teachers to use AI effectively in their classrooms.
The key difference: Teachers train teachers, with limited support from AI companies. The goal is to keep educators “in the driver’s seat” for AI, rather than having technology companies or districts control how AI enters schools.
The Problem: Teachers Are Using AI, But Not Well
The share of teachers using AI tools nearly doubled from 2024 to 2025. Six in 10 teachers now say they use AI in their practice.
But here’s the gap: most teachers use AI for basic tasks — lesson plans, administrative work, calendar organizing. Surface-level applications.
What’s missing: Teachers using AI for instructional improvement, personalized learning, and sophisticated pedagogical tasks.
What “Agentic AI” Means for Teachers
“Agentic AI” refers to AI systems that can autonomously perform multi-step tasks with reasoning — not just respond to single prompts.
| Basic AI Use | Agentic AI Use |
|---|---|
| “Create a lesson plan” | “Analyze my lesson for gaps, suggest improvements, and track what works over time” |
| “Write a parent email” | “Write personalized parent communications that follow district policy and reference each student’s specific needs” |
| “Generate quiz questions” | “Create differentiated assessments for students with varying reading levels and learning needs” |
The teacher’s role changes from AI user to AI director. Teachers use their professional judgment to narrow scope, direct AI to trusted sources, define rules and tone, and evaluate suggestions against their expertise.
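That shift from single prompts to multi-step workflows can be sketched as a simple loop: the AI drafts a plan, then works through each step, with the teacher directing scope and reviewing the output. A minimal illustration in Python, with a stubbed model standing in for a real AI service (all function names here are hypothetical, not any vendor’s actual interface):

```python
# Minimal sketch of an "agentic" loop: plan, then act on each step.
# call_model is a stub; in practice it would call a real AI service.

def call_model(prompt: str) -> str:
    """Stub standing in for a real AI model call (hypothetical)."""
    if prompt.startswith("PLAN"):
        # A real model would generate these steps itself.
        return "1. Find content gaps\n2. Flag confusing wording\n3. Suggest fixes"
    return f"Done: {prompt}"

def run_agent(task: str) -> list[str]:
    # Step 1: ask the model to break the task into a numbered plan.
    plan = call_model(f"PLAN: {task}")
    steps = [line.split(". ", 1)[1] for line in plan.splitlines()]

    # Step 2: execute each step, keeping the original task as context.
    results = []
    for step in steps:
        results.append(call_model(f"{step} (context: {task})"))
    return results

results = run_agent("Review my photosynthesis lesson")
print(len(results))  # one result per planned step
```

The point of the sketch is the structure, not the stub: a basic prompt gets one answer, while an agentic workflow decomposes the task and carries context across steps — which is why the teacher’s judgment about scope, sources, and rules matters more, not less.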
Real Teachers, Real Applications
Jing Liang Guan, science teacher, Brooklyn Science and Engineering Academy: Building an AI agent to stress-test his lessons for content gaps and confusing wording.
Yasheema Cook, special education teacher, NYC public schools: Creating AI agents to help develop and monitor individualized education programs (IEPs) and differentiate lessons for 12th graders with autism and other needs.
Jennifer Watters, 3rd grade teacher, PS 229 Queens: Using AI for higher-order activities since 2019 — but she switched from ChatGPT to Claude over privacy concerns after Anthropic refused to let the Pentagon use its model for domestic surveillance.
The Privacy Challenge
Teachers noted a careful balancing act: providing enough context about student needs to get meaningful AI help, while protecting student privacy. NYC public schools hadn’t released formal AI use guidance yet, and teachers said it wasn’t always clear what data might identify students across different tools.
What Effective Teacher AI Training Looks Like
- Bite-sized modules, not marathon sessions — Google’s parallel initiative uses short, flexible modules for busy teachers’ schedules
- Immediate classroom application — Real-world examples teachers can use the next day
- Teachers as trainers — Teachers trust other teachers to understand classroom realities
- Professional judgment stays central — AI augments teacher expertise, doesn’t replace it
- Privacy and ethics embedded — Not an afterthought but a core component
The Bigger Picture: 74 Million Students
Google’s initiative with ISTE+ASCD aims to reach all 6 million U.S. K-12 teachers and higher education faculty, ultimately affecting more than 74 million students.
The Honest Take
This is what effective AI education integration looks like: teachers training teachers, practical applications, privacy concerns addressed, and professional judgment remaining central.
What’s working: Training is teacher-led, not company-driven. Focus is on sophisticated use, not just basic AI literacy. Real classroom problems drive the AI tool development.
What’s still emerging: Formal guidance on student data privacy. Clear rules about what data can be used across different AI tools. Long-term effectiveness data — does this actually improve student outcomes?
The risk: Offloading common teacher tasks to AI can have unintended consequences. Research suggests that teachers who rely on AI to generate letters are less likely to remember important details about their students — the act of writing itself aids retention.
The opportunity: Teachers who learn to use AI effectively now will be positioned to shape how AI enters education. Those who wait will have tools and policies imposed on them.
Sources
- EdWeek: “Teachers Move Beyond AI Basics to More Sophisticated Instructional Uses”
- Google for Education: “Our commitment to make AI training available to all 6 million U.S. educators”
- American Federation of Teachers: National Academy for AI Instruction
- EdWeek Research Center: Teacher AI usage survey 2024-2025