AI is here. New models like ChatGPT can take a simple prompt and turn it into in-depth essays, articles – or even songs.
So what does this mean for schools? Will the new tech make it easier than ever before for students to cheat on homework and exams? How are teachers and parents supposed to stop them? We put the question to a jury of experts in education – and ChatGPT itself – to get the answers.
Amber Lloyd, Head of English Faculty, St Cuthbert’s College
Lloyd, who has been in education for 30 years, described the new software as a “game changer”, and a tool she was keen to explore.
“Especially with my seniors, we can get them to co-write an essay using ChatGPT and look at it together as a class – ask them: what are its flaws? How can you identify its structure and engage with it?”
Teachers concerned about cheating could encourage handwritten exercises, she said, not just to avoid unoriginal work, but because it’s better for learning overall.
“We’re getting them to do handwritten tasks, not just because of authenticity but because the research suggests handwriting is better in terms of being able to synthesise and learn.”
Lloyd uses software to detect plagiarism, but ChatGPT can evade such tools because it generates new text rather than copying from existing sources.
Ultimately, Lloyd said, it became a matter of how well teachers knew their students.
“It’s not completely worrying, because it comes down to how well you know the students and what they’re capable of.”
She asks students to submit a handwritten sample task at the beginning of each year as a benchmark of ability.
Lloyd suggested encouraging oral assessment and comparative text assignments to get around the use of ChatGPT.
Dr Andrew Chen, Research Fellow at the University of Auckland
Dr Chen, who has worked in AI for nine years, said the concern that students can cheat using AI tools is a valid one, but that teachers can also use AI to reach more students, more effectively.
He suggested the tools could be used to benefit students with learning disabilities.
“They can use these tools to augment their abilities to meet the standards of society.
“When we think about resources for those students, it is very resource intensive to support those children – tools like this may be able to help with that.”
Dr Chen also advised teachers to break up their assessment in new ways, such as providing more oral or video assignments.
“The technology is enabling,” Dr Chen argued. “It asks questions like: is writing an essay the only way to learn skills? Is it necessary to write an essay to think logically through or analyse a text? Is that the only way to communicate that?
“I would say over the last 20 years there’s been a shift into the idea that writing may not be the best way to communicate – videos and podcasts might be.”
Dr Chen suggested teachers use more local texts, which ChatGPT is unlikely to have enough material to write about.

“Teachers can choose more localised content and pick subjects that the bots are less likely to have seen before.”
We put the question to ChatGPT itself – here’s what it had to say:
“There are several ways to prevent students from cheating with AI technology. One approach is to use proctoring software that uses artificial intelligence to monitor students during online exams. This software can detect suspicious behavior, such as the use of prohibited resources or the presence of another person in the room.
“Another approach is to use AI-generated questions or tests that are unique to each student, making it difficult for students to share answers.
“Additionally, institutions can also create policies and procedures to address academic dishonesty, including the use of AI technology, and provide regular training and education to students and faculty on academic integrity.”