The people most alarmed by AI’s takeover of high school classrooms aren’t administrators, policymakers, or tech executives. They’re the students sitting next to classmates who no longer think for themselves.
A Concord Monitor investigation published April 18 features interviews with eight students from four New Hampshire high schools — all of them unsettled by what they’re witnessing. Their accounts paint a picture of an education system under siege from within: students who once loved learning now questioning why they bother, teachers grading for completion rather than quality, and a generation’s cognitive capabilities visibly eroding in real time.
84% and Rising
The numbers are stark. A College Board study found that 84% of high schoolers reported using AI for schoolwork last year. Students in New Hampshire say the real rate of unauthorized use is even higher than official figures suggest.
“Among students, it’s kind of open how much kids use AI for assignments,” said Concord High School senior Martin Pennington, who estimated that 80% of his classmates have used AI on schoolwork in ways that are not allowed.
The problem isn’t limited to English essays. Students described using Snapchat’s AI and Google Lens to photograph math and science questions for instant answers. In AP art, classmates are asking ChatGPT to generate reference images and brainstorm assignments that will score highest — a shortcut that Concord High senior Andrew Pfitzenmayer finds particularly galling: “I feel like if you’re in an advanced art program, you should be creative enough to be able to come up with your own ideas.”
The Motivation Collapse
Perhaps the most troubling finding isn’t the cheating itself — it’s what the cheating is doing to students who still do honest work.
“I used to be motivated to learn, and now there’s just really not much to be motivated for when you feel that school is more about just receiving a grade than learning, and there’s such an easy way for people to cheat through that,” Pennington said.
Concord Christian Academy junior Faith Dudley used to love writing. Now, she says, the joy is gone. Her teachers used to give substantive feedback on her work. Now they just say “good work” — because they can’t distinguish her effort from AI-generated text.
“Even if I get 100s, it’s like: What was the point of writing it if not for an audience to enjoy it?” Dudley said. “There’s other things that teachers — which is our audience — are enjoying, even though it’s not my fellow classmates writing it.”
Concord High senior Caledonia Mahon experienced the demoralization firsthand. She poured herself into a personal reflection about growing as a leader during an outdoor education course. A classmate then delivered a speech that he openly admitted ChatGPT had written. “He didn’t even know what he was saying,” Mahon said. “You could tell it was stuff that did not come from him and his experiences, and it felt a little disheartening.”
Visible Cognitive Decline
The students aren’t just frustrated. They’re frightened by what they’re observing in their peers.
“I worry about almost a mind-deadening future,” said Londonderry High School junior Lauren Damota, who recently placed third in a statewide writing competition on AI regulation.
Vaibhav Rastogi, a Bishop Brady senior who tutors younger students in math, reported noticing cognitive decline among the children he works with: “The general populace of students — their critical thinking, their ability to really stick with something for an extended period of time — that is something I’ve found lacking.”
The decline isn’t subtle. Pfitzenmayer recounted a classmate in his advanced history class who “mentioned that he didn’t know the Declaration of Independence was even attached to American history, because he had used AI during that whole class.”
Detection Is Broken
Students described an enforcement landscape that has essentially collapsed. AI detection tools are unreliable. Students who should be caught aren’t, and students who do honest work sometimes face wrongful accusations.
“It makes teachers not trust students as much because they automatically assume the worst,” Mahon said. “It can be really damaging to the students who do care about what they’re writing about and do want to give their best work.”
Students evade detection by prompting AI to “make this sound like a human wrote it” or “dumb it down a bit.” At Concord High, ChatGPT remains accessible on school-issued Chromebooks while sites like Spotify are blocked. Concord’s director of communications, Terry Wolf, said ChatGPT should be blocked too — but clearly, it isn’t.
The district has formed an AI implementation team and is developing a new policy, but students report that the current approach amounts to doing nothing. Concord High’s official policy bars all AI use — a rule that students say is universally ignored.
Teachers Are Giving Up
In the absence of reliable detection, some teachers have effectively surrendered. Students reported that grading has shifted from assessing quality to marking completion — a change that further incentivizes AI use. Why struggle with an assignment when a completion grade rewards the same outcome regardless of effort?
Some teachers are fighting back. Dudley’s history teacher now requires students to turn their chairs so he can see their screens during in-class writing. Others have increased in-class assignments. But these are isolated measures against a systemic problem.
What Students Actually Want
The students interviewed were clear-eyed about what needs to change — and it isn’t more detection software or blocked websites.
They called for more in-class writing. More creative assignments that can’t be completed by a language model. “Even having students do what would seem like a younger child thing, like do a comic strip and write a paragraph on each box,” Dudley suggested.
Most fundamentally, they want a rethinking of what school is for.
“Embed it in their minds that the point of school is not just to pass and get the highest grade,” said Concord High junior Zadie Taylor. “It’s really to understand the core concepts of what they’re learning — how to retain that information and use it later on in life.”
This is the student perspective that’s been missing from the AI-in-education conversation. While organizations and experts debate policy at the systemic level, the kids in the classroom are watching their peers’ cognitive capabilities erode in real time — and they’re the ones sounding the alarm.
Whether anyone is listening is another matter entirely.