This guest blog post has been written by Professor Adam Gibson from the Institute for Sustainable Heritage in the Bartlett School of Environment, Energy & Resources. It complements a video case study from Dr Josep Grau-Bove, who teaches on the same programme, on how he uses AI in assessments.
What we did and why
This year, we set the coursework early in Term 2 just as ChatGPT was gaining notoriety in academia. We addressed this by encouraging the students to use ChatGPT or any equivalent AI-based support, but by requiring them to explain fully how they used it. Subsequently, UCL published its “AI, education and assessment staff briefing” which supported this approach by saying “our current advice to teaching staff is to be clear with students what you regard as a permissible use of AI in your particular assignment, and how they should acknowledge that use.”
In the end, seven students out of 45 included a statement explaining how they used ChatGPT (and all specifically mentioned ChatGPT rather than any other AI assistance). Marking was anonymous, so I can’t associate the comments with specific students.
Overall, students who used ChatGPT were critical of it, partly for well-publicised reasons (it can’t be relied upon when searching for references or specific facts, and it was only trained on material from before 2021). One student used it to write a 300-word abstract for the main article but found it needed correcting.
Some students used ChatGPT to write or correct computer code. One, interestingly, fed their own computer code into ChatGPT and asked it to explain it step by step, which it did successfully. The student then used this commentary to improve the original code.
Others used ChatGPT to generate specific, step-by-step instructions for using commercial software. In one case, this appeared to be successful, but in another the commercial software had been updated too recently for ChatGPT to be aware of the changes, so the suggestions it made were not helpful.
They found ChatGPT useful as a tutor, explaining complex concepts in basic terms, perhaps before moving to a more advanced and reliable source for further explanation.
Most students in this cohort have English as a second language, and they found ChatGPT useful for grammatical editing.
Students frequently commented on the challenges of writing good prompt questions for ChatGPT.
All in all, given the hype around ChatGPT, I was surprised how few students said they had engaged with it, given that I encouraged them to do so. This might be partly because, during this period, ChatGPT was frequently unavailable. I was also surprised how little advantage students felt they had gained. Students seemed to see ChatGPT as yet another tool with strengths and weaknesses, which I feel is a pretty realistic attitude. However, I was impressed by the broad range of applications that students found for ChatGPT. Commenting on their own code, and using it for personalised instructions, were both interesting ideas which could be developed further.
In employment, students are likely to be asked to write reports, so this is an authentic assessment with an authentic application of ChatGPT. Staff need to be able to advise students on the appropriate use of these tools, both from the point of view of academic integrity and so that students can write good prompts and use the tools effectively.
Of course, this was all done using the ChatGPT of early 2023. GPT-4 is already available, and generative AI models specifically aimed at academia are emerging. The goalposts will continue to move.
This post is republished from Adam’s blog, mainlymedicalphysics.