AI in Schools Would ‘Dehumanise’ Classroom Interactions, Education Specialist Warns

Another expert said that with the possibility for pupils to use generative AI, coursework is no longer a reliable form of assessment.
Grok, DeepSeek, and ChatGPT apps displayed on a phone screen in London on Feb. 20, 2025. Justin Tallis/AFP via Getty Images
Victoria Friedman

An education specialist has warned that the use of artificial intelligence in schools would dehumanise classroom interactions and add to children’s digital overload.

Christopher McGovern, chairman of the Campaign for Real Education (CRE), told The Epoch Times that educators tend to embrace technology because they see it as an improvement, but have not fully considered the implications of AI-enhanced education.

Some of these concerns involve how it would reduce the elements of human interaction that are integral to the learning experience.

“AI dehumanises the traditional classroom interaction between a teacher and the children, but also between the children themselves. That’s all taken away,” McGovern said.

McGovern, a retired head teacher and former adviser to the policy unit at 10 Downing Street, made the comments in the context of the education sector exploring the ways in which AI can aid pupils in the classroom and teachers with administration.

The Ada Lovelace Institute (ALI), a research centre which aims to ensure that technology works for the benefit of society, says that education-specific AI tools for schools are “barely emergent.” But there are some that are being used.
David Game College, a private school in London, launched a pilot scheme in September in which core GCSE subjects are taught by AI, with pupils’ progress monitored by “learning coaches” rather than traditional teachers; the school claims the model is the first of its kind in the UK.

Children Could Reject AI

Younger generations, who have grown up in a world of technology, might reasonably be expected to be the most open to AI taking over the classroom.
But according to the ALI, that is not necessarily going to be the case. In its January review of the AI in education landscape, the institute noted that pupils who participated in the Department for Education’s (DfE’s) generative AI hackathon in 2023–24 “were not keen on AI tutors, deeming the idea to be impersonal, error prone and not as helpful as a real teacher.”

“The importance of the pupil-teacher relationship matters as much to the pupil as it does to the teacher,” the think tank observed.

Similarly, teachers who were invited by the DfE to test a proof-of-concept AI marking tool expressed concerns over the impact that automated feedback could have on the pupil-teacher relationship. While automated marking could remove bias, educators said that its use might also demotivate pupils.
“[Pupils] want you to read their work. They want you to know and understand who they are as an individual. They want to impress you often. They want to interest you in who they are,” one secondary school teacher said in feedback to the department.

Tech Overload

McGovern said he does recognise that AI can be used constructively in certain situations and has the capacity to match learning tasks to the individual needs of pupils.

However, he said that if schools are going to introduce AI into a classroom, the use of technology needs to be reduced elsewhere.

The educator warned that AI would contribute to the “massive overload” of technology already impacting children, not least since smartphones and social media have become such a prominent part of young people’s lives.

“It’s an overdose of AI which is going to be the problem. As we are going further along the path overdosing our children, they become increasingly addicted to their screens,” he said, adding it could be a further detriment to children’s mental health.

Teachers Already Using AI

Despite there being few education-specific AI tools available, teachers are using generic AI products like ChatGPT for administrative tasks.
In 2023, 42 percent of teachers in England reported to the DfE that they had used generative AI in their role, including for creating learning resources and planning lessons.
File photo of a maths exam in progress at Pittville High School, Cheltenham, England, on March 2, 2012. (David Davies/PA Wire)

ALI has pointed out that using generic products comes with its own problems, including generating content that is not age-appropriate or relevant to the curriculum. AI can also “hallucinate,” producing inaccurate outputs that it presents as facts.

The DfE has said that teachers are allowed to use AI to help with tasks like planning lessons, marking work, providing feedback, and creating resources. But they still need to ensure that anything AI generates is accurate and appropriate.

Schools can also set their own rules on AI use—including whether and how pupils can use it—as long as they follow legal requirements around data protection, child safety, and intellectual property.

The DfE is already supporting Oak National Academy, an online hub for digital teaching resources which includes an AI-powered assistant to help teachers plan personalised lessons.

Concerns Over Cheating

Last month, a survey of school support staff who belong to the GMB union found that AI is being used in almost one in five schools, with respondents citing concerns about increased cheating and plagiarism.

Cheating is not a new phenomenon, but educators have said that generative AI has made it much easier for children to do so, particularly in non-supervised assignments like coursework.

Education specialist Tom Richmond told The Epoch Times, “Coursework was already recognised as an unreliable form of assessment well before ChatGPT came along, but it is now abundantly clear that unsupervised assignments cannot be treated as a fair and trustworthy form of assessment.”

Richmond, the former director of the EDSK think tank, said that it is not possible to say with certainty how many children are using AI to cheat, as there are no reliable detection tools available to schools and colleges.

He added, “No form of assessment is immune to cheating, but some assessments are much harder to manipulate than others.”

“The most obvious way to reduce cheating is for schools to change the types of tasks and assessments that they set for pupils. Any task and assessment completed at home without supervision is now wide open to cheating, so schools can switch to more in-class assessments to prevent cheating,” he added.

File photo of Education Secretary Bridget Phillipson, dated Feb. 03, 2025. (Lucy North/PA Wire)
An EDSK report from 2023 recommended that written, supervised examinations remain the main method of assessing learners’ knowledge and understanding. But to allow children to develop and demonstrate a wider range of skills, it said pupils should also undertake courses with different assessment methods, such as oral exams and extended projects.

£1 Million for EdTech

The government has a wider strategy to advance the usage of AI, including in education.
On Monday, Education Secretary Bridget Phillipson announced the government would be investing more than £1 million to test educational technology (EdTech) in schools, in an effort to improve outcomes for pupils and lighten the administrative workload for teachers.

In her speech at the Education World Forum, she confirmed that the department’s new Content Store Project will see curriculum guidance, teaching resources, lesson plans, and anonymised pupils’ work made available to AI companies to train their tools “so that they can generate top quality content for use in our classrooms.”

However, she emphasised that EdTech “can’t replace great teachers” and that “AI isn’t a magic wand.”

She also said the DfE will be working closely with international partners in the development of global AI guidelines for generative AI in education, in order to shape “the global consensus on how generative AI can be deployed safely and effectively to boost education around the world.”

The UK will host an international summit on generative AI in education in 2026.