Educators have been warning about it. Parents have been worrying about it. Now there’s peer-reviewed evidence: students who become dependent on AI for learning show measurable declines in interpersonal competence.
On April 9, 2026, Scientific Reports (Nature) published a study titled “AI learning dependence and interpersonal incompetence in EFL contexts: The moderating role of ethical awareness” — and its findings should be required reading for every school administrator, education minister, and parent currently rushing AI into classrooms.
The Study
Researchers Qinqing Zhang and Hung-Wei Feng surveyed 608 non-English-major students at Chinese universities, examining the relationship between AI learning dependence and interpersonal incompetence in English as a Foreign Language (EFL) education.
Using the Unified Theory of Acceptance and Use of Technology (UTAUT) framework and structural equation modelling (SEM), the study identified four factors driving AI dependence:
- Effort expectancy — AI makes learning feel easier
- Performance expectancy — AI seems to improve outcomes
- Facilitating conditions — AI tools are readily available
- Social influence — peers and institutions encourage AI use
All four factors were positively associated with increased AI dependence. And that dependence, in turn, was positively associated with greater interpersonal incompetence.
This isn’t speculation. It’s a statistical association, established with rigorous methodology and published in one of the world’s most-cited scientific journals.
What “Interpersonal Incompetence” Actually Means
The study measured interpersonal competence across five domains adapted from established psychology frameworks:
- Initiating relationships — difficulty starting conversations or making connections
- Self-disclosure — reduced ability to share appropriately with others
- Emotional support — diminished capacity to provide or receive emotional help
- Conflict management — struggling to navigate disagreements
- Negative assertion — inability to set boundaries or express disagreement
These aren’t soft skills in the pejorative sense. They’re the foundational capabilities that determine whether someone can function in a workplace, sustain relationships, or participate meaningfully in a democracy. And the study links AI dependence to their erosion.
The Mitigating Factor: Ethical Awareness
There is a silver lining — and it’s significant. The study found that AI ethical awareness moderates the relationship between AI dependence and interpersonal incompetence.
Students with higher ethical awareness — those who understood the limitations, risks, and appropriate use boundaries of AI — showed a weaker positive association between AI dependence and social skill decline. Ethical awareness doesn’t eliminate the correlation, but it dampens it.
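In regression terms, this kind of moderation shows up as an interaction term. Here is a minimal simple-slopes sketch with hypothetical coefficients (illustrative only, not the study's estimates) showing how a negative interaction dampens, without reversing, the dependence-to-incompetence slope:

```python
# Illustrative simple-slopes sketch. In a moderation model
#   incompetence = b0 + b1*dependence + b2*ethics + b3*(dependence*ethics)
# the slope of incompetence with respect to dependence, at a given
# ethical-awareness level, is b1 + b3*ethics. A positive b1 with a
# negative b3 means ethics weakens the positive association.

B1 = 0.6   # hypothetical main effect: dependence -> incompetence
B3 = -0.3  # hypothetical interaction: ethics dampens that slope

def dependence_slope(ethics_z: float) -> float:
    """Effect of AI dependence on incompetence at a given ethics level (z-score)."""
    return B1 + B3 * ethics_z

for z in (-1.0, 0.0, 1.0):
    # slopes: +0.90 at low ethics, +0.60 at average, +0.30 at high ethics
    print(f"ethics z={z:+.1f}: slope = {dependence_slope(z):+.2f}")
```

Note that the slope at high ethical awareness is smaller but still positive, mirroring the study's finding that awareness dampens the association rather than eliminating it.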
This finding has immediate practical implications. It suggests that simply adding AI to classrooms without concurrent ethics education isn’t just incomplete — it’s actively harmful. The students most vulnerable to social skill erosion are those who use AI uncritically, without understanding its boundaries.
Why This Matters Now
The study arrives at a critical moment. Schools worldwide are racing to integrate AI tools, driven by FOMO, government mandates, and vendor pressure. China’s own five-ministry “AI + Education” action plan, released just days before this study, mandates AI literacy for every student by 2030.
But this study asks the question that most policy documents skip: what happens to students’ social development when they outsource learning to AI?
The answer isn’t theoretical anymore. It’s empirical. And it’s concerning.
Media Dependency Theory — the framework underpinning the study — has long documented how over-reliance on any medium reshapes human behaviour. We saw it with television, social media, and smartphones. AI is the latest medium, but its effects may be more insidious: AI doesn’t just consume attention, it substitutes for human interaction in learning contexts where social development traditionally occurs.
When a student asks ChatGPT for help instead of a classmate, they gain an answer but lose a conversation. When an AI tutor replaces peer feedback, efficiency improves but collaborative skills atrophy. The study quantifies what educators intuitively feared: the more students depend on AI, the less capable they become at relating to other humans.
Practical Suggestions for Sustainable AI Integration
The study’s authors don’t argue for banning AI from education. Instead, they propose a framework for sustainable integration:
- Teach AI ethics alongside AI tools. Ethical awareness isn’t a luxury — it’s a protective factor. Students need structured education about AI limitations, appropriate use, and the social risks of over-reliance.
- Design AI-augmented learning that preserves human interaction. Use AI for individual practice and feedback, but protect collaborative learning, peer review, and group discussion as non-negotiable elements of every course.
- Monitor dependence, not just adoption. Schools tracking AI tool usage should also track social development indicators — and intervene when the former starts displacing the latter.
- Train teachers to recognise AI dependence signals. Reduced classroom participation, avoidance of group work, and over-reliance on AI-generated responses are early warning signs.
The Bigger Picture
This study is among the first of its kind, but it won’t be the last. As AI becomes ubiquitous in education, researchers are only beginning to grapple with its developmental consequences. The finding that ethical awareness mitigates social-skill decline is particularly important for policymakers: it means the solution isn’t to slow AI adoption, but to ensure ethics education keeps pace.
The alternative — rushing AI into every classroom without teaching students how to think critically about it — now has peer-reviewed evidence showing measurable harm. Ignorance of AI’s social costs is no longer an academic position. It’s a choice.
SOURCES
- Zhang, Q. & Feng, H.-W. “AI learning dependence and interpersonal incompetence in EFL contexts: The moderating role of ethical awareness.” Scientific Reports (Nature), published April 9, 2026. DOI: 10.1038/s41598-026-47158-6