Work smarter, not harder: Digital exams and learning analytics

Anja Sisarica
September 5, 2019

As the new school year gains momentum, educators are asking themselves: can we do it better this time around? What can we improve in the way we work this year to give our students that extra boost, and to make our efforts more efficient? When it comes to assessment, our answer is: you can work smarter, not harder, with the help of digital exams and learning analytics.

Learning analytics is all about collecting, analysing and leveraging insights from learner data to make better-informed decisions about learning and teaching. And contrary to popular belief, with tools like our own Inspera Assessment you don’t have to love maths or be a computer whizz to get started. Read on for some inspiration and practical tips.


Learning analytics offers an opportunity to reflect on and optimise learning and assessment through the collection, analysis, and reporting of data, all of which can be presented to the end user as actionable insights in Inspera Assessment.

At Inspera, we distinguish between three kinds of learning analytics:

  • Question analytics: Using data to determine the quality of questions for our learners. For example, is an item too difficult or too easy, and does it provide enough information at the right ability level?
  • Question Set analytics: Using data to determine whether a test was suitable for our target group, and to evaluate its quality and reliability (see the reliability sketch after this list).
  • Learner analytics: Using data to evaluate learner performance, and check whether we have met specific learner goals.
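
One widely used reliability estimate for Question Set analytics is Cronbach’s alpha. The sketch below illustrates that general technique on made-up 0/1 response data; it is our own illustration in Python, not Inspera’s built-in implementation:

    import numpy as np

    def cronbach_alpha(responses):
        # Reliability estimate: (k / (k - 1)) * (1 - sum of item variances / variance of totals)
        responses = np.asarray(responses, dtype=float)
        k = responses.shape[1]                         # number of items
        item_vars = responses.var(axis=0, ddof=1)      # per-item variance
        total_var = responses.sum(axis=1).var(ddof=1)  # variance of learners' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical scored responses: rows = learners, columns = items (1 = correct)
    responses = [
        [1, 0, 1, 1],
        [1, 1, 1, 1],
        [0, 0, 1, 0],
        [1, 0, 0, 1],
        [0, 0, 0, 0],
    ]
    print(cronbach_alpha(responses))  # roughly 0.79; values closer to 1 suggest higher reliability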

Combined, these three branches of learning analytics can be used to:

  • Improve exam quality 
  • Improve teaching
  • Enhance the learner’s experience

The more data that are collected on exam questions and stored in the item bank (a repository of questions used in an assessment program, together with all information pertaining to those items), the richer the insights that can be generated. It is therefore worth investing in an organisational approach to item bank management. But even if you don’t have a robust item bank or a learning technologist on your team, as an individual exam author using Inspera Assessment you can still benefit from one-off, basic learning analytics on a per-test basis.

Learning analytics is part of Inspera’s core offering, and we have a dedicated R&D team working on innovations in this field. We asked one of our data scientists, Dr Niels Goet, to share a few hands-on strategies for applying learning analytics to some of teachers’ most frequently asked questions.

Is my exam too difficult?

First, it is good practice to look at the distribution of p-values (i.e. the proportion of correct answers for an item) across the entire test. Your choice of items at the aggregate level should generally be informed by your knowledge of the group of learners you want to test and by the purpose of the test.
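
As a minimal sketch, assuming a simple 0/1 response matrix with one row per learner and one column per item (made-up data, not Inspera’s built-in report), p-values can be computed like this:

    import numpy as np

    # Hypothetical scored responses: 1 = correct, 0 = incorrect
    responses = np.array([
        [1, 0, 1, 1],
        [1, 1, 0, 1],
        [0, 0, 1, 1],
        [1, 0, 0, 1],
    ])

    # p-value per item = proportion of learners who answered it correctly
    p_values = responses.mean(axis=0)
    print(p_values)  # [0.75 0.25 0.5  1.  ]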

For example, including only very difficult items (low p-value) means that you are setting a very high bar for your students. This may or may not be your intended goal. If the purpose of the test is formative, that is, intended to help students learn in the process of being tested, you may want to consider the build-up of items from easy to difficult across the test, starting with some easier ones.
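
Continuing the hypothetical sketch above, sorting items from highest to lowest p-value gives a simple easy-to-difficult ordering to start from:

    # Item indices from easiest (highest p-value) to hardest
    order = np.argsort(-p_values)
    print(order)  # [3 0 2 1]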

What distractors should I use for my Multiple-Choice Question?

When using multiple-choice items, test writers need to choose effective “distractors”, that is, the other (incorrect) options that make up the item in addition to the correct response. Again, we can use data to inform our choices. 

First, we want to make sure that we choose answer options that allow us to discriminate well between low- and high-ability learners. When we want to improve and reuse items in a test, we can use historical response data to calculate the answer discrimination index, which can then guide our selection of distractors. It is defined as the proportion of examinees in the top 27 percent of the total score distribution who select a given answer option, minus the proportion of examinees in the bottom 27 percent who select it. A negative answer discrimination index therefore tells us that lower-ability students are more likely to select that option, and vice versa.
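
To make that definition concrete, here is a minimal sketch in Python with hypothetical data; the 27 percent cut-off follows the definition above, but the helper function and numbers are our own, not Inspera’s API:

    import numpy as np

    def answer_discrimination(chosen, totals, option):
        # Proportion of top-27% scorers choosing `option`
        # minus the proportion of bottom-27% scorers choosing it.
        chosen = np.asarray(chosen)
        totals = np.asarray(totals)
        n = max(1, int(round(0.27 * len(totals))))
        order = np.argsort(totals)            # learners sorted by total score, ascending
        bottom, top = order[:n], order[-n:]
        return np.mean(chosen[top] == option) - np.mean(chosen[bottom] == option)

    # Hypothetical data: each learner's chosen option for one item, and their total score
    chosen = ["A", "B", "A", "C", "A", "B", "A", "A", "C", "B"]
    totals = [55, 40, 72, 38, 80, 45, 68, 75, 30, 50]
    print(answer_discrimination(chosen, totals, "A"))  # 1.0: only high scorers pick "A" here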

Second, we can evaluate a number of more straightforward statistics to inform our choice of distractors. The percentage of learners that selected each option, for example, gives us some idea of how well a distractor performs. An answer option that is never chosen may be too implausible, while a distractor that attracts as many responses as the correct option may reveal that the question is unclear, or that two answer options might be correct.
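
A minimal sketch of these tallies, reusing the hypothetical responses from the previous example:

    from collections import Counter

    chosen = ["A", "B", "A", "C", "A", "B", "A", "A", "C", "B"]
    counts = Counter(chosen)
    for option in sorted(counts):
        print(f"{option}: {counts[option] / len(chosen):.0%}")
    # Prints A: 50%, B: 30%, C: 20%. An option near 0% may be too implausible;
    # one rivalling the correct answer may signal an unclear question.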

Learn more

Do you want to learn more about how to conduct online exams and assessments, and how Inspera Assessment can make your assessments accessible, secure, valid and reliable? We’ve got you covered: we wrote a Guide to Online Exams and Assessments.

Did you like this post? Sign up and we’ll send you more awesome posts like this once a month.