Whether AI will have a positive or negative impact on learning in higher education remains to be seen in the years to come. AI will certainly affect how we assess student learning, both in coursework and in examinations. A need for more in-person assessment in various forms may arise, bringing us full circle towards more skills tests and applied assessments.
Oral exams are a format more resistant to the use of AI tools, but they can be unpopular for various reasons: students find them stressful and consider them a high-stakes format, there is evidence that key underrepresented student groups underperform in them, and staff find the format unsuitable for delivery to large cohorts. Here, we present a formative oral assessment format developed to provide feedback to student groups within large cohorts. We reflect on the process of adapting the format to be more inclusive, and on how oral examinations could be innovatively adapted to other teaching topics in the future.
Rationale for the development of the new format
AI is currently acting as a catalyst, heating up and re-invigorating discussions on how, what and why we assess students, although these questions were already debated during the COVID-19 pandemic. Discussions have included not only new formats but have often centred on scalability, particularly where large cohorts are involved. This applies not only to lecture-based teaching but is particularly relevant where practical and professional (sometimes referred to as soft) skills are taught in a specific teaching/learning environment such as a laboratory or outdoor space.
In this respect, we seem to be coming full circle towards more skills tests, applied knowledge tests and ways of giving feedback on professional development, as these require process-orientated feedback and the charting of student development rather than product-focused knowledge tests. This type of assessment and feedback also covers an area of knowledge and development where AI solutions are less successful than they are with factual knowledge recall formats.
Process-orientated feedback formats are often logistically demanding and staff-intensive to run, and it is often difficult to scale them up to large cohorts. We developed such a formative assessment format for students in years 1 and 2 of the MBBS programme, to give feedback on the practical anatomy knowledge gained in their dissecting classes; it also tested how well the students had integrated as a team, along with other professional skills (see below).
For the MBBS anatomy curriculum, delivered by staff of the Division of Biosciences, students learn topographical anatomy and professional skills in lab classes where the large cohort (340-400 students) is split into smaller teams working together. Formative feedback was delivered in the form of a viva voce, in which members of the small work teams were each singled out and quizzed by tutors about their knowledge and their learning approaches.
Viva voce format
Viva voce (viva) or oral examinations are regularly used in medical and clinical professional training. Done well (i.e. as a proper dialogue between learner and assessor), they are an excellent tool for assessing student communication skills, and they can flexibly explore an individual's knowledge base, probe their ability to integrate knowledge, and reveal their thought processes. They remain popular in medical education because they can measure students' progress in communicating clearly in high-pressure situations.
However, viva voce examinations are unpopular with students as they can be very stressful (Scott & Unsworth 2018). Students consider the viva voce a high-stakes format, as everything rides on their performance. It is also high-risk with regard to a multitude of biases, including examiner opinion on what constitutes a correct answer and bias against specific student groups, including anybody with protected characteristics at UCL (e.g. Norris et al. 2018).
Finally, the viva voce can be a costly process if students have to be assessed sequentially, and the logistics can be very difficult to manage, particularly for large cohorts.
The specific scenario
In 2013, the formative viva voce format for MBBS year 1 and 2 students was updated from individual to group assessment in order to simplify logistics and ease student anxiety.
However, the format remained unpopular: students felt put on the spot, found it very unfair when they knew the answer to a question posed to a teammate but not to their own, and perceived variations in question difficulty very sharply. Tutor performance also varied widely and was another source of resentment.
The team therefore redeveloped the format into a student-led, inclusive one: students are given a task to solve themselves, which they present to an examiner immediately after completion, receiving feedback and clarification wherever the team has gaps in its knowledge.
The assessment format
The work flow of the formative assessment is introduced in the narrated PowerPoint presentation here.
Outcome and reflection
We delivered the format for the first time in the academic year 2019/20, prior to the general lockdown due to CoVid19 in March 2020.
- Student feedback was overwhelmingly positive. Students liked the teamwork aspect, although some of the feedback hinted at issues here as well: on closer inspection, teams that had already developed a good teamwork culture liked the format, while those that had not yet done so found it more challenging
- Students reported that the assessment format made them aware of the importance of procedures and professional skills
- Students also appreciated the format's potential as an approximation of their CPSA (clinical and practical skills assessment) experience, based on the curriculum material, including both the donor cadavers and the 3D-printed models
- Examiners reported being much more confident with the format as well, as they could concentrate on providing good feedback to the whole group without having to single out individual students
- Examiners also expressed satisfaction at being able to discuss students' professional skills with them and how to improve on them
Overall, the experience was one of calm and concentrated work during the assessment, with students demonstrating their skills well, and greater satisfaction for both students and staff with the new format.
At the time of the lockdown, the team considered developing the format further, but these plans were derailed by the pandemic as all anatomy teaching moved online. Once anatomy teaching returned to face-to-face delivery post-pandemic, the focus was on rewriting and updating the practical class curriculum.
However, with the rise of AI tools and their potential role in assessment, the format is now being considered for revival and further development. This is timely, as the new anatomy curriculum is structured in a way that allows the exercise to be delivered with less logistical input.
Areas of development for the existing format are:
- Updating existing questions and creating new questions and structures to pin, as well as adding imaging tasks, now that an imaging curriculum has been added to the lab classes
- Potential development of a reflective practice task addressing group-work practice, to highlight its importance
- Reviewing existing professional skills tests and adding more
This format can be adapted to other practical skills teaching as well. For example, lab practical skills taught in groups can be assessed in this manner, which is particularly attractive where large cohorts are involved.