Maxexam - The future for questions

Background

The School of Medicine at the University of Nottingham is a large medical school with around 1200 students.

They had been running OSCEs using paper examination sheets processed by an optical scanner, but were aware that digital systems were on the market and had received positive feedback from other medical schools about how these could improve the administration of OSCEs.

Students had also made it clear that they would like more feedback following their exams, but providing this was difficult and time consuming, as it was hard to interpret marks from a checklist in a meaningful way that would help students improve their skills performance.

The decision to move systems was accelerated during last year's final exams, when the mark sheets had to be sent to the company providing the system for scanning. The data took a long time to come back, and when it did arrive it was not even correct. The team at Nottingham then had to spend considerable time establishing what the correct data was before they could begin analysis, and at that point decided they would not run any more exams on their existing system.

Choosing a digital OSCE system

The School of Medicine reviewed a number of systems on paper, then looked more seriously at Maxexam alongside one other external supplier and an internal system.

They were looking for a supplier who would work with them to make the system as effective as possible for their needs, including providing the feedback their students were keen to have. It was also very important to them that the system was configured so that if the Wi-Fi went down or a tablet broke mid-exam, the data would still be kept safe.

On both of the above criteria, Maxexam came out ahead of the other systems under consideration. In particular, the team at Nottingham were impressed by the customer focus of the team at Maxinity, and were confident the two teams would work together to ensure the best possible set-up of the system for the School of Medicine.

Result

The first OSCEs were run using Maxexam in February 2017. 330 final-year students took two 8-station OSCEs over 6 consecutive days. The administration of the exams went very smoothly, and the following benefits were realised:

  • Positive feedback from examiners. The examiners had training before the exam, but the team at Nottingham had still wondered whether they would find it hard to adjust. In fact the examiners were very enthusiastic about the system, and none wanted to return to the previous one.
  • More complete and accurate examination papers. The exam was set up so that examiners could not accidentally omit any marks, something that had happened occasionally with the paper-based system. Checking by an administrator would have picked this up when the sheets were handed in, but by that stage the examiner often could not remember what the mark should have been, in which case full marks had to be awarded. Maxexam removed this source of human error: if an answer was missed, the examiner was prompted by the system at the time.
  • Quicker results compilation, analysis and feedback. Historically, results had been released to students around 4 weeks after the final examination; on this occasion the process was completed within 2 weeks. In fact, the results themselves were available to the team within minutes of the final exam finishing, whereas in the past scanning the papers alone would have taken around 4 to 5 days.
  • Better, more specific feedback for students. Not only did students receive their exam marks more quickly, but the feedback they received was more detailed. Maxexam was configured so that examiners could record feedback beyond whether a student had passed or failed a question, using pre-set categories, for example around whether the student needed to give more explanation when history taking. Examiners could also add free-text feedback beyond these pre-set categories where necessary. This feedback was passed on to students with their marks, so it was much clearer than in the past why they had received the marks they did.


'Better, more specific feedback for students. Not only did the students receive their marks from the exam quicker, but the feedback they received was more detailed. Maxexam was configured so that examiners were able to mark down feedback beyond whether a student had passed or failed a question – with pre-set categories for example around whether the student needed to give more explanation when they were history taking etc. There was also an option for examiners to include free text feedback beyond these pre-set categories if necessary. This feedback was then passed on to students with their marks so it was much clearer than it would have been in the past why they had got the marks they did.'

Gill Pinner, Director of Examinations & Assessments, School of Medicine, University of Nottingham