Sam (not his real name) was in his final year at Durham when a dreaded email landed in his inbox: “Plagiarism has been detected in your essay.” He is one of at least 400 students facing a similar fate. The cause? Generative AI.
The advent of large language models like ChatGPT has ushered in waves of excitement and hysteria. The education sector is no exception: across the board, educators are grappling with the infiltration of technology.
Education Secretary Gillian Keegan announced plans to introduce a ban on mobile phones in schools this week. Meanwhile, OpenAI has just released ChatGPT's most powerful update yet, capable of generating up-to-date information in real time.
The rapid development of generative AI tools is giving rise to situations where even their creators struggle to explain how certain outputs are produced. Professor of Higher Education at Bristol, Richard Watermeyer, implores universities to look “beyond the buzz” and approach the AI boom with caution.
As is the case with many disruptive technologies, Gen Z are among the earliest adopters – and academic institutions are scrabbling to keep pace with AI’s increasing sophistication. However, universities’ dual roles as technological innovators and upholders of academic rigour are at odds with one another.
Nearly half (48) of UK universities surveyed have investigated students for using AI bots to cheat in assignments. The University of Kent topped the list, investigating 47 students.
Anti-plagiarism software giant Turnitin released its AI detector in April, although a majority of universities soon opted out because of inaccuracies and false positives. A handful of other AI detectors scan writing for ‘burstiness’ (variation in sentence length and structure) and ‘perplexity’ (how unpredictable the word choices are) to distinguish human from computer.
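To make those two signals concrete, here is a minimal, illustrative sketch of how they can be approximated. This is not Turnitin's actual method: real detectors score each word against a large language model, whereas this toy version measures burstiness as the variance in sentence length and uses a crude unigram-frequency stand-in for perplexity.

```python
import re
from collections import Counter
from math import log2

def burstiness(text: str) -> float:
    """Variance in sentence length: human writing tends to mix short and
    long sentences, while model output is often more uniform."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    mean = sum(lengths) / len(lengths)
    return sum((n - mean) ** 2 for n in lengths) / len(lengths)

def perplexity_proxy(text: str) -> float:
    """Crude stand-in for perplexity: average surprisal of each word under
    a unigram model built from the text itself. A real detector would use
    a large language model's probabilities instead."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    avg_surprisal = -sum(log2(counts[w] / total) for w in words) / total
    return 2 ** avg_surprisal

sample = "The cat sat. It watched the rain fall for hours, silent and still."
print(burstiness(sample))        # higher = more varied sentence lengths
print(perplexity_proxy(sample))  # higher = less repetitive word choice
```

On this reading, text with low burstiness and low perplexity looks machine-like; the false positives the universities complained about arise because plenty of careful human prose also scores low on both.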
Russell Group universities have since released a statement supporting the ethical use of AI tools while maintaining academic integrity, although this lacked specifics about what it means for students.
In June, 61% of universities had no clear guidance or policy on AI tools, 12% allowed their use with a citation, and 8% banned them outright.
Widespread student confusion comes as no surprise.
UCL applicant Emma (not her real name) ran her personal statement through a number of AI checkers. “One came back 100% AI generated, others came back 0% and 49% AI generated,” she says. “How can websites which claim they are very accurate come back with such different numbers? Now I’m concerned that my personal statement will be flagged for using AI when I didn’t use it to write.”
Online forums are littered with similar threads of student panic about AI detector use and accuracy. LSE professor and director of the JournalismAI initiative, Charlie Beckett, says educators must work with students rather than leaving them to make mistakes: “Most sensible newsrooms have drawn up guidelines and should now be looking to put in place resources such as training to help people to use the tools safely. Universities should do the same.”
Prioritising academic integrity while embracing novel technologies is difficult when the technology in question threatens that very integrity.
And there’s a certain irony in campuses competing in the AI ‘arms race’ while penalising student adopters, given the proliferation of AI and computing courses they advertise. This year saw record numbers applying for computing degrees. For the first time, artificial intelligence ranked as the degree subject most sought after by employers, overtaking economics, law and medicine – while AI expertise was the most in-demand skill in the workplace.
Universities are also grappling with larger, existential questions about the value of a degree. Job opportunities not requiring a degree surged by 90% in a year (2021-22), and dropout rates hit record highs last year, with one in 37 students leaving their courses. In light of the Government crackdown on ‘rip-off’ and ‘low-value’ degrees – capping subjects such as the creative arts, where earning potential is lower – “the old adage of learning equals earning”, Prof Watermeyer says, “is broken”.
Higher education institutions are no stranger to technology-driven plagiarism.
Generative AI marks a third wave in a 23-year plagiarism war, which commenced with the advent of the internet around the turn of the millennium. A 2003 study found that 38% of students had used the cut-and-paste technique to complete academic work, and 44% regarded the practice as “trivial or not cheating at all”.
There was a second spike in 2020-21, when the pandemic forced all learning and assessment online virtually overnight – a shift that has outlasted COVID-19. Essay writing services exploded in popularity and universities called for legislation to ban them.
Higher education institutions, then, are contending with disruption from all directions.
Universities are playing catch-up with the combined effects of the pandemic, ceaseless rounds of teaching strikes and an ongoing marking crisis, while wrestling with questions about educational value in light of the shift towards more remote learning.
Today, it would be perfectly possible to complete a degree without setting foot in a library – gone are the days of students burying themselves in books, leafing through obscure texts until the small hours. They can now recruit AI to generate succinct summaries of dense and lengthy journal articles within seconds.
As technology continues to reshape the landscape of higher education – from Ctrl+V to ChatGPT – adapting to its transformative power is crucial to ensure the value of university degrees remains intact.
My final year at university was interrupted by COVID, and virtual exams meant that original thought and argumentation briefly became things of the past. We sat our finals from childhood bedrooms behind screens, paraphrasing academics’ words into semi-coherent essays. Laborious and time-intensive though it was, the exercise removed much of the need to think for ourselves. At least we had to do it manually: today, chatbots have automated the process entirely.
“We’ve lost that cognitive space for critical reflection, the clunky work” where real learning and engagement take place, Prof Watermeyer argues. I let out an emphatic sigh of relief that ChatGPT wasn’t around during my degree.