
Q&A: Improving Learning Environments

Suzanna Doran
April 13, 2021

How has online assessment disrupted the way we teach and assess? What are the opportunities and risks associated with digital assessment? If you missed our Q&A Coffee Break about how digital assessment presents an opportunity to improve learning environments, you can access the full recording here. In this article, we address some of the questions from our audience.

The panellists:


Simon McCallum

Victoria University of Wellington

Sam McKenzie

University of Queensland

Hylton Abrahams

32 Stylus

During the coffee chat, we discussed the impact digital assessment is having, and will continue to have, on learning environments around the world. One core message from the panel was that digital assessment is disrupting evaluation methods, and that in the future, exams will no longer be the dominant way for students to demonstrate their knowledge and learning.

In addition, the flexibility of online learning environments allows universities to design assessments that reflect real-life situations, with real choices. Students can learn and reflect while doing their assessments, rather than simply reproducing what they have already learned. This led to the overall conclusion that while we're redesigning assessment, we should "remember the student voice".

Now let's look at some of the questions we didn't have time to answer during the webinar:

Q1: In your opinion, what is the new role of assessment in the online learning environment? Will it perhaps pull us more towards formative assessment?

Sam: Previously, assessment was one of the only means of measuring student engagement with a course. We use Inspera to create engagement by adding videos and then asking students to apply what they've seen in their assessment, creating a dialogue even though no one is talking.


Q2: Based on your years of experience in e-assessment projects, which opportunities presented by digital exams are, in your opinion, most relevant for higher education institutions in the Australia/New Zealand region in particular?

Hylton: On a macro level, it's important that we move towards digital in higher education. A lot of schools are embracing digital tools, which means that when students reach higher education, they should be able to keep using those tools to learn and grow. The opportunity is there, so let's take it!


Simon: There's a lot of pressure from society for students to have good qualifications. In this sense, it's important that students get value in return for the fees they pay and that they're well-prepared for the job market.


Q3: The Covid-19 pandemic has triggered a lot of conversation about academic honesty, especially in the context of certification and assurance. Is this another opportunity for e-assessment to improve learning environments?

Simon: In my experience with continuous assessment, students submit work regularly throughout the exam. The development of code is assessed by how you solve the problem along the way, not by the final delivery at the end. If this were an essay, for example, you would see it being built, rather than the finished essay arriving as one big chunk. With digital assessment and continuous deliveries, you get to see the problem solving itself, which is more realistic and therefore makes it easier to see whether something was developed honestly.


Q4: What are your thoughts on individualising assessment - what do we win and what do we potentially lose with that approach?

Sam: Within boundaries, it is great - you have to have some boundaries. It gives students a sense of ownership and control: if students have a number of options, they get to pick the type of assessment that suits them best. That said, making the different choices equitable can be time-consuming for the academic.


Simon: In games, we give people choices, and the choices aren't free - they all mean something, so the choices are structured. When the choices are structured and you have very large classes, it can be successful because of comparability (e.g. with 100 students and 5 choices, some students will pick each option). In smaller classes, you will often personalise anyway, but in different ways.


Q5: Speaking of this pandemic period in more detail - how did your university cope when it comes to assessment, and what was the role of Inspera Assessment in that adaptation? How did you decide whether to proctor or not to proctor?

Sam: We had a strategy and goals - we wanted to make assessment more agile, authentic, and so on. For the courses using Inspera Assessment, my team helped them to transform. We have 5,000 courses at our university, so we needed every academic hitting the ground, looking at their content, and transforming everything to online, much of it in the LMS. Our centralised unit, ITaLI, helped them out and provided support.

Early in 2020, we needed to get everything online really fast, so that was a challenge. The first round was 'digitising' exams due to time constraints; later we moved towards transitioning to digital assessment, which is totally different. Even though we had a strategy, it was still difficult with that huge acceleration caused by the COVID crisis.

Do you want to learn more?


Follow this link to download our new whitepaper, "How to Create Better Exams". Enjoy!
