
Revealed: Thousands of UK university students caught cheating using AI

Thousands of university students in the UK have been caught misusing ChatGPT and other artificial intelligence tools in recent years, while traditional forms of plagiarism show a marked decline, a Guardian investigation can reveal.

A survey of academic integrity violations found almost 7,000 proven cases of cheating using AI tools in 2023-24, equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022-23.

Figures up to May suggest that number will increase again this year to about 7.5 proven cases per 1,000 students – but recorded cases represent only the tip of the iceberg, according to experts.
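The per-1,000 rates above can be sanity-checked with a quick back-of-envelope calculation. A minimal sketch, assuming the student population is simply inferred from the article's own figures (almost 7,000 cases at 5.1 per 1,000) rather than independently sourced:

```python
# Sketch: checking the "cases per 1,000 students" arithmetic reported above.
# The implied student population is derived from the article's own numbers
# (~7,000 cases at 5.1 per 1,000); it is not an official enrolment figure.

def rate_per_thousand(cases: int, students: int) -> float:
    """Proven misconduct cases per 1,000 enrolled students."""
    return cases / students * 1000

# Population consistent with "almost 7,000 cases = 5.1 per 1,000"
implied_students = round(7000 / 5.1 * 1000)  # roughly 1.37 million

print(f"Implied students: {implied_students:,}")
print(f"Rate check: {rate_per_thousand(7000, implied_students):.1f} per 1,000")
```

Running the same function against the projected 7.5 per 1,000 for this year gives a feel for how many proven cases that rate would represent at a similar population size.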

The data highlights a rapidly evolving challenge for universities: trying to adapt assessment methods to the advent of technologies such as ChatGPT and other AI-powered writing tools.

In 2019-20, before the widespread availability of generative AI, plagiarism accounted for nearly two-thirds of all academic misconduct. During the pandemic, plagiarism intensified as many assessments moved online. But as AI tools have become more sophisticated and accessible, the nature of cheating has changed.

The survey found that confirmed cases of traditional plagiarism fell from 19 per 1,000 students to 15.2 in 2023-24, and early figures from this academic year suggest they will fall again to about 8.5 per 1,000.

[Chart: proven misconduct cases per 1,000 students. Plagiarism rises from 2019-20 to 2022-23 then drops back, while AI-related misconduct rises from 2022-23 to almost the same level as plagiarism. "Other misconduct" remains fairly stable.]

The Guardian contacted 155 universities under the Freedom of Information Act requesting figures for proven cases of academic misconduct, plagiarism and AI misconduct in the last five years. Of these, 131 provided some data – though not every university had records for each year or category of misconduct.

More than 27% of responding universities did not yet record AI misuse as a separate category of misconduct in 2023-24, suggesting the sector is still getting to grips with the issue.

Many more cases of AI cheating may be going undetected. A survey by the Higher Education Policy Institute in February found 88% of students used AI for assessments. Last year, researchers at the University of Reading tested their own assessment systems and were able to submit AI-generated work without being detected 94% of the time.

Dr Peter Scarfe, an associate professor of psychology at the University of Reading and co-author of that study, said there had always been ways to cheat but that the education sector would have to adapt to AI, which posed a fundamentally different problem.

He said: “I would imagine those caught represent the tip of the iceberg. AI detection is very unlike plagiarism, where you can confirm the copied text. As a result, in a situation where you suspect the use of AI, it is near impossible to prove, regardless of the percentage AI that your AI detector says (if you use one). This is coupled with not wanting to falsely accuse students.

“It is unfeasible to simply move every single assessment a student takes to in-person. Yet at the same time the sector has to acknowledge that students will be using AI even if asked not to and go undetected.”

Students who wish to cheat undetected using generative AI have plenty of online material to draw from: the Guardian found dozens of videos on TikTok advertising AI paraphrasing and essay writing tools to students. These tools help students bypass common university AI detectors by “humanising” text generated by ChatGPT.

Dr Thomas Lancaster, an academic integrity researcher at Imperial College London, said: “When used well and by a student who knows how to edit the output, AI misuse is very hard to prove. My hope is that students are still learning through this process.”

Harvey* has just finished his final year of a business management degree at a northern English university. He told the Guardian he had used AI to generate ideas and structure for assignments and to suggest references, and that most people he knows used the tool to some extent.

“ChatGPT kind of came along when I first joined uni, and so it’s always been present for me,” he said. “I don’t think many people use AI and would then copy it word for word, I think it’s more just generally to help brainstorm and create ideas. Anything that I would take from it, I would then rework completely in my own ways.

“I do know one person that has used it and then used other methods of AI where you can change it and humanise it so that it writes AI content in a way that sounds like it’s come from a human.”

Amelia* has just finished her first year of a music business degree at a university in the south-west. She said she had also used AI for summarising and brainstorming, but that the tools had been most useful for people with learning difficulties. “One of my friends uses it, not to write any of her essays for her or research anything, but to put in her own points and structure them. She has dyslexia – she said she really benefits from it.”

The science and technology secretary, Peter Kyle, told the Guardian recently that AI should be deployed to “level up” opportunities for dyslexic children.

Technology companies appear to be targeting students as a key demographic for AI tools. Google offers university students a free upgrade of its Gemini tool for 15 months, and OpenAI offers discounts to college students in the US and Canada.

Lancaster said: “University-level assessment can sometimes seem pointless to students, even if we as educators have good reason for setting this. This all comes down to helping students to understand why they are required to complete certain tasks and engaging them more actively in the assessment design process.

“There’s often a suggestion that we should use more exams in place of written assessments, but the value of rote learning and retained knowledge continues to decrease every year. I think it’s important that we focus on skills that can’t easily be replaced by AI, such as communication skills, people skills, and giving students the confidence to engage with emerging technology and to succeed in the workplace.”

A government spokesperson said it was investing more than £187m in national skills programmes and had published guidance on the use of AI in schools.

They said: “Generative AI has great potential to transform education and provides exciting opportunities for growth through our plan for change. However, integrating AI into teaching, learning and assessment will require careful consideration and universities must determine how to harness the benefits and mitigate the risks to prepare students for the jobs of the future.”

*Names have been changed.
