Australian academics say they are being pressured into passing hundreds of students suspected of plagiarism and other forms of cheating in order to maintain their universities’ revenue streams, threatening the integrity and international reputation of the entire sector.
The combination of commercialised cheating and the rise of AI now threatened to devalue degrees until they were “handed out like expensive lollies”, one academic said.
Guardian Australia spoke to multiple academics and students, who described wholesale use of genAI going largely unchecked at many institutions.
A humanities tutor at a leading sandstone university said she was “distressed” to find more than half of her students had been flagged for using AI for all or part of their first assignment this year – a “huge increase” on 2023.
She believed the real number was much higher. But any repercussions were minimal.
“We’re not holding students to a standard,” she said. “It’s not fair on anyone who thinks a degree is worthwhile – a lot are not at the moment. It’s just proof they’ve been paid for.”
She has worked at a number of universities over three decades and said she had seen a “huge dependence” on the international market in recent years, so much so that tutors felt under pressure to pass students in order to keep the revenue coming.
“Nobody is blind to it,” she said. “It’s not a social or educational environment; it’s a box-checking exercise. A master’s degree is not worth what a bachelor’s used to be.”
Up to 80% of her courses were composed of full fee-paying overseas students, she said. Many struggled with English language skills in classes and meetings yet produced perfectly written essays.
But she said she was unable to fail them when they did not meet the requisite level in assessments or were flagged for probable AI use, because the detection technology could not provide a definitive answer.
Since the late 1990s, student submissions at most Australian universities have been automatically run through the plagiarism detection tool Turnitin, which searches for signs of text copied from other sources, the use of ghostwriters or the recycling of previous essays.
Now it is also scanning for the use of emergent genAI, including ChatGPT, but cannot provide conclusive evidence in most cases.
At most institutions, when the technology alerts staff to material that may not be original, the matter is raised with the student and then referred to an academic advisory panel for adjudication if the student denies wrongdoing.
“All you can do is ask questions, and if you get a blanket denial they didn’t do AI, there’s nothing you can do,” the tutor said.
It was not just a matter of whether faculties could adequately detect cheating but whether they really wanted to. She said a member of her department explicitly told her “we need to pass students”.
“As a tutor, you don’t have the authority to fail students … departments don’t want fails because they want money, and if you fail someone, you’re cutting off their money stream, and there’s so many lines of appeal.
“If students persist, they will pass no matter how bad it is. At some point, we have to say this is not sustainable. The university as we used to know it is dead. Covid nearly killed it … and AI has dealt it its death blow.”
The academic said she had begun telling her children that getting a degree was a waste.
‘I almost lost my job raising concerns’
Academics told Guardian Australia they often felt unsupported or discouraged when they spoke up about alleged cheating.
A science tutor at a sandstone university alleged they had faced repercussions last year after raising concerns about apparently AI-written papers during the first wave of the technology.
“There was a near mutiny among the teaching staff when we were told that we had to mark [apparently] bot-written papers as if they had been written by students,” they said.
“I almost lost my job raising our common concerns about this to the subject coordinators. About one-fifth of the papers were plagiarised that year. I don’t think many people, if any, got seriously disciplined in the end.
“Far from discouraging AI use, they’re doubling down.”
“Our current directives are not to report them without a smoking gun,” they said.
“With the rise of genAI, we have half the cohort treating it as a low or no-cost contract cheating service … this is in addition to the other contract cheating services that are advertised online.”
The academic said that, had they had time, they would have submitted reports of suspected plagiarism for about 40% of their cohort this year. But they said every report had to be approved by an integrity officer – and there were just two per department, serving thousands of students.
“There are not enough staff in the world to process that with our current procedures,” they said.
When asked how they were mitigating the risks of AI, the universities gave largely uniform responses, emphasising the need to balance embracing the technology with implementing new methods for detecting inappropriate use, including risk modules and updated academic integrity policies.
Luke Sheehy, the chief executive of Universities Australia, the peak body representing the tertiary sector, said artificial intelligence created both “opportunities and challenges”.
Asked whether the integrity and reputation of universities were threatened by the pressure on academics to pass students suspected of plagiarism, he said institutions were “continuing to navigate the use of this evolving technology”.
“Universities work closely with the relevant government agencies to ensure they are meeting the standards required of higher education providers in Australia and are committed to delivering a high-quality teaching and learning experience for students,” he said.
Level of cheating ‘beyond belief’
Current students who spoke to Guardian Australia echoed many academics’ views, saying the level of cheating they had witnessed among both domestic and international students was extreme, that assessment methods had failed to keep up and that the risk of adverse consequences was minimal.
“Some of the cheating and plagiarism I have seen over the past 18 months is actually beyond belief,” one second-year undergraduate student said.
“Pretty much everyone uses it,” another undergraduate student, Zac (not his real name), said of ChatGPT. “[It] doesn’t matter which degree they’re taking. As for whether they’re using it as a learning tool or blatantly copying and rewording its results, it’s hard to know.”
Zac said he had friends who copied and pasted questions into ChatGPT during unsupervised, closed-book tests and wrote their answers based on what the program spat out. Others used it as an “easy way out” when an assignment was due in a few hours and they didn’t know how to start it, he said.
One student who admitted using AI to write assessments said: “Nobody can ever prove it unequivocally. The [AI] tools are incredibly useful, and any evidence of use is easily hidden.
“The ways course coordinators try to mitigate AI use are incredibly naive – to actually eliminate AI use, they’d have to fundamentally change how assessments run.”
An undergraduate student studying commerce at a university in Sydney said AI was “like a plague”. He said while assessment outlines now had sections explaining whether generative AI was allowed and on what grounds (such as for brainstorming), it was just a “surface level change”.
“The integral design of assessments, in my experience, has not changed,” he said.
“We’re still assigned typical essays, or videos, or presentations, or have exams. I’ve had a tutor say to me: ‘In the video format, we don’t have tools to detect AI, so it’s OK to get it to do your assessment.’
“Why should they invest in properly testing us if they can just get us to pass with AI, and make money while doing so?”
The student said it was even easier to cheat in online exams, which gave students full access to the internet.
“I know so many people who opted to use AI during their exams and managed to receive a distinction result purely from the use of ChatGPT,” he said.
“At least 50% of students I’ve talked to use generative AI – and are not learning.”
Another student, completing a master’s at a Melbourne institution, said he had encountered just one teacher during his degree who had been concerned about academic integrity.
“I was in a class essay writing task worth 30% where you could clearly see students using ChatGPT or other AI programs – the teacher could see it [too] but didn’t do anything,” he said.
But Dr Rebecca Awdry, an expert in academic integrity and honorary fellow at Deakin University, said the emergence of genAI had forced the sector to have “frank conversations” it had avoided for too long about other forms of cheating.
“I felt like for years I was speaking into a void and people weren’t paying attention to the problem shown by the research,” she said. “But it took genAI to terrify everyone for there to finally be a focus on integrity.”
“Now, Australia is more open to frank conversations [about the fact] that institutions are not and have not been doing enough.
“Two essays and a final exam is antiquated, we need to be more innovative,” Awdry said.
“Assessment should include learning as well as testing knowledge. Repetitive, rote learning isn’t what students have in the world of work. We need work-integrated learning, placements, real world problems.
“Make it real and engaging, not a tick box.”