California State University, the largest public university system in the US with 460,000 students across 23 campuses, paid OpenAI $17 million for unlimited access to ChatGPT Edu. One year later, only 0.7% of students and 16% of faculty have completed the optional training. Professors have started a petition to cancel the contract outright. Students say they weren't consulted. And everyone is confused about when AI use is cheating and when it's the assignment.
If a system serving nearly half a million students can’t figure out how to integrate AI ethically, what chance do smaller institutions have?
The Numbers That Should Embarrass Everyone
Cal State commissioned a survey of 94,000 students, faculty, and staff. The results are damning:
- 52% of faculty reported AI having a negative effect on their teaching
- 67% of students said their professors don’t teach them how to use AI effectively
- 0.7% of students completed the voluntary AI training
- 16% of faculty completed the training
- 64% said AI has affected their learning experience — and that's the stat the Chancellor's office leads with
Let’s sit with the 0.7% number for a moment. Cal State spent $17 million on AI access — roughly $37 per student — and not even 1% of students bothered to learn how to use it. That’s not a student engagement problem. That’s a deployment problem.
The Chancellor's office points to the 64% figure as evidence the program is working. But dig into the survey and that 64% includes students who said only that AI has "affected" their learning experience, not necessarily that it made it better. And when your training completion rate is below 1%, "affected" could just as easily mean "made it more confusing."
The Revolt
Faculty at San Francisco State University started a petition calling on Chancellor Mildred Garcia to end the partnership. The petition frames the issue in terms that resonate far beyond California: why are we spending millions on AI tools while faculty are still figuring out the basics of academic integrity with AI?
Ryan Jenkins, chair of the AI Task Force for Cal Poly San Luis Obispo’s faculty union, summed it up: “I’m not sure [Cal State] realized how much new work it would require, how much revision to the old way of doing things it would require.”
The contract was signed in January 2025 and runs 18 months through July 2026. The university has not announced whether it will renew.
Students have their own complaints. Katie Karroum, a Cal State Northridge communications major and student association VP, said: “We were not consulted when the contract was signed, and we weren’t even given a heads up.” The student association published an open letter calling for a consistent AI policy — right now, each professor sets their own rules, creating chaos for students navigating five different AI policies across five classes.
Assemblymember Mike Fong has introduced AB 2392, which would require Cal State and California community colleges to provide training on any AI product deployed on campuses. The bill is a direct response to the revelation that campuses adopted AI tools “without consistent guidance or training.”
The Deeper Problem: Access Without Literacy
The core failure here isn’t the tool — it’s the assumption that access equals literacy. Cal State negotiated a bulk deal for ChatGPT Edu, made it available to everyone, and assumed the rest would sort itself out. It didn’t.
Compare this to how universities teach research skills: you don’t just give every student a library card and hope for the best. You teach them how to evaluate sources, how to cite properly, how to construct arguments. AI literacy requires the same scaffolding — and it’s almost entirely absent.
OpenAI's ChatGPT Edu version includes safeguards (by default, student data is not used to train OpenAI's models), but that's a privacy feature, not a pedagogical one. It doesn't help a student decide when using AI is appropriate versus when it's cheating, and it doesn't help a professor redesign assignments that AI can't do for them.
What This Means for NZ
New Zealand universities are watching this closely. If a system with 460,000 students, 23 campuses, and a $17 million budget can't get AI integration right, what hope does the University of Waikato have?
NZ universities are smaller, have fewer resources for training and policy development, and are facing the same fundamental questions: Is AI use in assignments cheating or a workplace skill? Who decides? How do we teach critical AI literacy when faculty themselves don’t agree?
The NZ Tertiary Education Commission has been slow to provide national guidance. Individual universities have taken different approaches — some banning AI, some embracing it, most muddling through in the middle. Cal State’s experience suggests that the ad hoc approach doesn’t work.
The lesson for NZ: don’t just buy the tool. Build the literacy first. Train faculty. Develop consistent policies. Involve students in the decisions. Otherwise you’re spending $17 million for confusion.
🔍 THE BOTTOM LINE
0.7% student training completion is damning. Institutions are spending millions on AI access and $0 on AI literacy — and the chaos at Cal State is what happens when you confuse buying a tool with implementing a strategy.
❓ Frequently Asked Questions
Q: Is Cal State renewing the OpenAI contract? Not yet announced. The 18-month contract ends July 2026. A faculty petition to cancel has gained traction, and a state bill (AB 2392) would require training standards for any AI tool deployed on campus.
Q: Why is student training completion so low? The training was optional, poorly communicated, and not integrated into courses. Students weren't told why it mattered, and faculty weren't equipped to build it into their teaching.
Q: Should NZ universities avoid similar deals? Not necessarily — but they should invest in AI literacy and policy development first, before signing large contracts. The tool is useless without the infrastructure to use it well.
Q: What’s the alternative to campus-wide AI deals? Some universities are taking a department-by-department approach, developing AI use policies tailored to specific disciplines. Others are creating AI literacy requirements as part of core curriculum. The key is intentionality, not just access.
📰 SOURCES
- CalMatters — “Cal State struck a deal with OpenAI. Some students and faculty refuse to use it” (May 2026)
- Inside Higher Ed — “Despite Skepticism, Survey Shows Widespread AI Use at Cal State” (April 2026)
- Inside Higher Ed — “Faculty Push Back Against OpenAI Deals” (March 2026)
- KPBS — coverage of Cal State OpenAI contract
- Institute for Policy Studies — “How Cal State Became Ground Zero for the Fight Over AI in Higher Education”