Before looking at the results of the pilot, it might be helpful to understand why Leeds Law School decided to invest in eAssessment. A little while ago we inserted summative assessment points early in the academic year to check on student engagement, and these fell outside of 'normal' University assessment periods.
As a result, finding a room (or rooms) to host exams for around 350 students during teaching time was difficult. We faced stark choices: pay commercial rates for rooms outside the University, cancel the exams, or assess our students in a different way; we chose the last. As a large University, and one with excellent teaching accommodation, we are fortunate to have a considerable number of computer labs. It was, therefore, a short journey from identifying free and available labs to running large-scale eAssessments within them.
There were some other subsidiary reasons that shored up our decision to embark on our eAssessment journey:
Disappointment with our existing eAssessment solution, where previous experiences had completely removed staff appetite for eAssessment, meant we needed a new technological approach. Through conversations, testing and, perhaps most importantly, a personal recommendation, we settled on a pilot with Inspera.
Moving from hard copy to eAssessment is no mean feat; it takes time, effort and energy. Following our decision to move to Inspera, the Law School created a small project team, which met and planned for all eventualities, created topic question/item banks, and checked, proofed, moderated, scrutinised and re-scrutinised each question.
At Leeds, we are fortunate to have a keen, innovative and conscientious group of both academic and support staff. As such, calling in the extra effort at short notice was possible. However, if we learned anything from the swift roll-out of eAssessment, it is that we would have preferred a little more time to plan and, most importantly, to create the various MCT question banks. You know it will take time, but it will probably take longer than you think.
Significant worries revolved around assessment integrity and the protection of the question bank. When you consider how long it takes to create a single multiple-choice question, multiplied across the entire bank, losing questions through either student impropriety or carelessness can be expensive. Using the Safe Exam Browser, which locks down the student computer, helped allay these fears.
Informal feedback from staff was overwhelmingly positive. Our conclusion was that the system was excellent and that, where mistakes happened, they were the result of human error rather than systemic failure.
Our ‘lessons learned’ included:
We decided to take the eAssessment temperature and surveyed our students. We carefully considered who to survey and concluded that the best cohort was our level 5 students studying property law. This cohort had prior experience of paper-based MCQ assessment and had also been exposed to testing with our previous eAssessment provider. We ran the survey immediately following their Inspera summative MCQ assessment and received feedback from 191 students.
The survey revealed that students found the Inspera eAssessment platform user-friendly, that they were confident using eAssessment and that they believed the system recorded their answers accurately. Students were satisfied that eAssessment was as rigorous as paper-based assessment, and over 60% of students stated they would prefer more of their written exams to be available in an eAssessment format.
In fairness, some free text answers revealed a small minority of students who preferred 'traditional' hard copy assessments, with one writing 'nothing beats paper and pen'! However, the overall feedback received was undeniably positive and the majority of students were delighted by their eAssessment experience.
With the benefit of hindsight, there are often things in life we might wish we had done differently; however, our decision to move swiftly to eAssessment was absolutely the right thing to do. For those yet to embark on their eAssessment journey, we would highly recommend integrating single sign-on, and we would have benefited from a little longer to create our MCQ question banks.
Our success with Inspera would not have been possible without an excellent team and so thanks must be duly paid to Adam Gulley, Malcolm Hirst, Dr Victoria Hamlyn and Dr Louisa Ashley.
Leeds Law School looks forward to more eAssessment developments in 2020 and beyond.
Deveral is the Dean of Leeds Law School. He joined Leeds Beckett in 2015 and has been involved in Higher Education for over 20 years, 18 of which were spent at Northumbria. As an expert in Quality Assurance, Deveral has a broad understanding of academic quality and assurance processes. He is an experienced external examiner, including lead external examiner, for all types of academic award and also solicitors' Higher Rights.
You are welcome to visit Deveral Capps' profile on LinkedIn or view his research profile.