
Putting 110,000 examinations online – how are we doing?

15 Dec 2020 | Dr Linda Amrane-Cooper

The Covid-19 pandemic made it impossible for University of London students to take their summer 2020 examinations in local examination centres around the world. In response, we moved 110,000 exams online for 35,000 students. Evaluating the experience for students, staff and our systems is revealing some insights into the future of online examinations.

98% take up

Exams finished in the first week of August: 91% of students surveyed sat all of their exams, and a further 7% sat some of them. This is a higher rate of engagement than in our traditional model of unseen exams taken at our 600 examination centres across the 190 countries in which we have students.

40% of the students who did not sit all their exams explained that Covid-19 had affected their study time, while others did not feel academically ready to sit their assessments. Mental health also posed a barrier for some students. Very few identified issues of equipment, internet access or a suitable place to take the exam as barriers to engagement. As our students are taking online degrees with us, it is a requirement that they have access to broadband and a computer.

Measuring up

In our evaluation of the move to online exams we are looking at four key areas: student behaviours, student sentiment, student outcomes and operational issues.

We used three types of online examination: proctored (invigilated) exams, fixed-time unseen closed-book exams, and unseen open-book exams with a longer response time (24 hours or several days). This contrasts with the unseen, handwritten examinations that would have taken place in exam centres under exam conditions, with no access to additional resources.


Our student survey has provided over 8,600 responses, and the anonymised data allows us to look at student sentiment by examination format, location, gender, age, programme of study and whether students required special examination arrangements. Student behaviour data is easily accessible via our virtual learning environments, where students accessed their exam papers.

As the examiners conduct their marking online, rather than on paper exam scripts couriered to them, we have the opportunity to explore the experience of academic staff in this new normal. Exam boards are just finishing, after which we can start our detailed analysis of student outcomes – comparing average marks and pass rates with previous years' performances.

What we are finding

Old habits

Initial analysis indicates that, even with the less familiar format of open-book exams, many students still accessed their paper as soon as their exam became live and then submitted their answers within a few hours of receiving it – they did not take advantage of the extended time available to reflect on their work. We suspect that working under (in this case self-imposed) time pressure is a habit learned over many years of conventional examinations. This has implications for how we help students to develop strategies for open-book assessment and, more broadly, for examination under these new conditions.

Positive experience

Feedback from the student survey is helping us to understand the ways in which communication with students worked best; how students undertook their exams – by writing or by typing; and if they encountered issues uploading their answers to the VLE.

The majority of students reported a positive online assessment experience, with some variation by country and by examination type. They generally felt prepared for the online assessment platform, and were successful in uploading their exam answers. Most typed, but a significant minority hand wrote their responses, particularly in exams with mathematical symbols.

With students in over 20 time zones, understanding how local Wi-Fi availability and bandwidth affected access and outcomes is important. Proctored exams demanded a stable broadband connection, and for some students in Pakistan and Bangladesh this did not work well. These students were offered an alternative exam.

Perceptions of success?

Overall, around 40% of those who responded felt they had done better in these exams than they would have in assessment at exam centres. 10% felt they had done less well in the online exams; the remainder felt there was no difference or stated that they did not know whether they had done better or worse. Marks analysis will help us to understand whether perceptions match reality. Moreover, it will be important to analyse this across location, exam type and programme.

Future gazing

In the final part of the student experience survey, we found that 66% of survey respondents would like online exams at home to continue, but there was also support for our usual examination approach of handwritten exams in a local exam centre. Exams taken on computers in exam centres are a less popular choice.

We expect to have completed this large study by the middle of November 2020. Lessons learned from swiftly moving 110,000 examinations online will inform our understanding and practice of assessment, well beyond the current pandemic.

Assessment plays a vital role in HE. It is essential for measuring the extent of student learning (assessment of learning). Assessment should also be designed in ways that promote student learning, whether of the subject itself or of broader skills. Find out more about how Advance HE can work with your institution to develop your assessment practices here.
