The Guardian - UK
Technology

Preparing students for a world shaped by artificial intelligence

‘The task is not to ignore AI, but to teach students how to use it critically.’ Photograph: Dmitriy Shironosov/Alamy

Prof Leo McCann and Prof Simon Sweeney are right to warn that uncritical reliance on artificial intelligence risks bypassing deep learning (Letters, 16 September). But that does not mean large language models have no place in higher education. Used thoughtfully, they can enhance teaching and learning.

Graduates will enter a workforce where AI is ubiquitous. To exclude it from education is to send students out unprepared. The task is not to ignore AI, but to teach students how to use it critically.

AI can also reinforce learning. Take the example cited by McCann and Sweeney of students mischaracterising Henry Ford as a “transformational leader”. Instead of banning AI, lecturers could ask students to generate an AI response and then critique it against the 1922 text. This highlights the technology’s limitations – anachronistic terms, lack of historical context – while underlining the value of close reading and primary sources.

The real problem lies not with AI, but with outdated assessment models. If ChatGPT can easily answer a coursework question, that says as much about the weakness of the assessment as the strength of the tool. Redesigning tasks to test process as well as product can help ensure these tools develop rather than diminish critical skills.

Misuse is a genuine concern. But rejecting AI outright risks leaving students ill-equipped. Universities should lead in shaping its ethical, critical and creative use, ensuring it strengthens rather than undermines learning.
Dr Lorna Waddington
Dr Richard de Blacquière-Clarkson
University of Leeds

• The claim from Prof Leo McCann and Prof Simon Sweeney that generative AI “sabotages and degrades students’ learning” risks repeating a familiar pattern in higher education: treating new technologies as threats rather than catalysts for change. When calculators arrived, many feared they would destroy numeracy; instead, curricula shifted to mathematical reasoning. Word processors raised worries about spellcheck eroding writing ability, yet pedagogy evolved to emphasise structure and clarity. Even the internet, once derided as a source of plagiarism and misinformation, ultimately pushed universities to stress information literacy and source evaluation.

AI presents real challenges, but history shows the problem is not the tool itself, but assessment practices that fail to adapt. If universities continue to reward only polished products, AI will inevitably be seen as a shortcut. What we need is a shift toward process-based evaluation. This means valuing learning journals that capture students’ decision-making, reflective essays that unpack research strategies, or oral defences where they explain how they reached a conclusion. These approaches do not bypass reflection and criticality – they make them unavoidable.

To dismiss AI as “generic” or “factually incorrect” is to overlook its pedagogical potential. Its flaws can themselves be teaching tools: comparing AI drafts with original sources sharpens critical reading, while asking students to critique or refine outputs fosters precisely the analytical and creative skills universities claim to champion.

Education has always evolved alongside technology. This moment is no different. The challenge is not to wall off AI, but to rethink our goals so that students graduate both able to use these tools and to think critically about them.
Prof Robert Stroud
Hosei University, Japan

• The situation regarding university students’ use of AI in the arts and humanities may be even worse than Profs McCann and Sweeney report. In a nutshell: many students don’t attend class (attendance is typically around 30%), don’t sit any in-person exams (which vanished with Covid and haven’t returned), and have their coursework (that is, 100% of their grade) written wholly or largely by AI. Perhaps over half of all students on arts and humanities courses fit this bill. They won’t get the top grades, but a good degree (a mid-2:1) is very gettable in this way. All that their degree certificate will guarantee is that they handed over nearly £30,000.

That being so, you might wonder why we don’t cut out the middleman and just hand out degree certificates immediately after payment. And that might well be an option favoured by some university administrators, who then wouldn’t need to worry about such pesky things as having to pay their teaching staff. But for those of us who care about both the quality of higher education and the cultural role of universities, the current situation regarding AI is a disaster.
Prof Mark Jago
University of Nottingham
