A collaboration of universities has recently launched exam invigilation software to support formative and summative student assessments.
This provides a robust means of assessing students. Student set-up is automated, minimising the administration of video calls and allowing whole classes of students to be set up seamlessly.
For more information, email Kieran: firstname.lastname@example.org
PHYSIO: Anatomy and Physiology Knowledge base exam
Anna Ziemer, Programme Leader MSc (Pre-Reg) Physiotherapy
Q. Why did you use the invigilation tool?
We could not bring students onto campus for assessment due to the national lockdown, so we needed an alternative form of assessment.
At short notice we needed to change the assessment date; however, Sn@p were very accommodating and were able to slot us in within one week for a mock and two weeks for a summative assessment.
Q. What were your initial thoughts around doing online assessments?
At first I was worried that there would be disruption for the students, causing them additional anxiety. I was also concerned that it would be a difficult process, more hassle, and that things could easily go wrong.
However, the support we got from the Sn@p team was amazing. They helped us through every step and were able to resolve any technical issues that arose. It was so much easier than I thought it would be.
Q. What was the experience like?
I thought it was going to be very stressful, but in fact everything just flowed on the day of the assessment. As we'd done the mock, all the students and invigilators knew what to do. The system is very intuitive. We had an invigilator join us on the day who hadn’t done the mocks, and they managed fine.
I was concerned that students would report additional anxiety, however all feedback was positive. Students liked the experience.
Getting the examinations set up was easy, as I just sent a list of the questions to the Sn@p team to upload.
I was impressed with the way students with special needs could be accommodated, and how the system allowed extra time for these students.
Q. Did anything go wrong?
There were a couple of students on Apple devices which weren't straightforward; however, the Sn@p team dealt with these and resolved the issues within minutes.
One of the students didn't receive the login link; however, they were given access to the test on the day. Afterwards, we discovered that their email address had been misspelled on the student list sent to Sn@p. It was an error which could easily happen, but provision was in place for this eventuality.
One of the invigilators had an issue which meant they needed to leave the session temporarily. Thanks to the invigilator chat facility, we were able to organise cover to watch the students, sharing them out between us, and the students were unaware of the temporary change. The students (as well as their screens) were being recorded the whole time, which gave us additional confidence in the integrity of the exam.
When we looked through the marks, the system showed that a high proportion of students had got one question wrong. On checking, the question turned out to be ambiguous, so after correspondence with the awarding body we agreed to award a mark for an answer that had previously been categorised as incorrect.
The Sn@p team made an amendment on the platform, and students' marks were all altered immediately and automatically, which saved so much time. The results were then released from the platform, which was a very efficient way of communicating them to the students.
Q. What improvements would you suggest?
We saved each student's results individually as PDFs. It would be more efficient if there were a 'save as PDF' button.
Q. Will you be using it going forward?
Yes. For this assessment we used only multiple-choice questions; in future assessments we will use the open-ended text fields.
Interview date: 21st March 2021