
Ask, answer, assess: peer learning from student-generated content

It is well established that reviewing the work of others can both enhance learning and help develop metacognitive skills, such as the ability to critically evaluate one’s own work and an appreciation of what constitutes high quality.

The aim of this project is to involve students in assessing and providing feedback on answers to assessment questions. We will use a form of adaptive comparative judgement (ACJ), in which students are presented with pairs of submissions authored by their peers and asked to judge which of the two is better against a given quality criterion, e.g. ‘which submission explains the answer more clearly?’. Iterating this process across all students produces a ranked list of submissions. The process has been shown to achieve a very high degree of reliability, equalling or even exceeding that obtained from standard marking arrangements.
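To illustrate how pairwise judgements can be turned into a ranked list, here is a minimal sketch (not the project's actual implementation) that fits a Bradley-Terry model, one common statistical basis for comparative judgement, to a set of (winner, loser) pairs. The function name, data format, and example judgements are all illustrative assumptions.

```python
from collections import defaultdict

def rank_submissions(judgements, n_iters=100):
    """Rank submissions from pairwise judgements using a simple
    Bradley-Terry model, fitted with the standard MM update.
    `judgements` is a list of (winner, loser) pairs (hypothetical format)."""
    items = {x for pair in judgements for x in pair}
    wins = defaultdict(int)    # total wins per submission
    n_pairs = defaultdict(int) # number of comparisons per unordered pair
    for winner, loser in judgements:
        wins[winner] += 1
        n_pairs[frozenset((winner, loser))] += 1

    strength = {x: 1.0 for x in items}  # initial latent 'quality' scores
    for _ in range(n_iters):
        new = {}
        for i in items:
            # MM update: strength_i = wins_i / sum_j n_ij / (s_i + s_j)
            denom = sum(
                n_pairs[frozenset((i, j))] / (strength[i] + strength[j])
                for j in items
                if j != i and frozenset((i, j)) in n_pairs
            )
            new[i] = wins[i] / denom if denom else strength[i]
        # normalise so strengths stay on a fixed scale
        total = sum(new.values())
        strength = {i: s * len(items) / total for i, s in new.items()}

    return sorted(items, key=lambda i: strength[i], reverse=True)

# Toy data: A usually beats B, B usually beats C, etc.
judgements = (
    [("A", "B")] * 3 + [("B", "A")]
    + [("B", "C")] * 3 + [("C", "B")]
    + [("A", "C")] * 3 + [("C", "A")]
)
print(rank_submissions(judgements))  # → ['A', 'B', 'C']
```

In a real ACJ system the *adaptive* part also matters: rather than comparing every pair, the next pair shown to a judge is chosen to be maximally informative given the current ranking, which is what keeps the number of judgements per student manageable.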

Students will also be offered the opportunity to further reflect and add to their own explanations after having compared peers’ answers.

In this project we will address a number of questions:

  • To what extent do peer-generated rankings of contributions based on ACJ match those of experts?
  • Which activities (e.g. answering, assessing, commenting) do students consider to be most beneficial to their learning?
  • To what extent does this activity promote deep learning and the development of high-quality problem-solving skills?
  • To what extent do students feel that engaging in the ACJ process helps them prepare better for future assessments and develop an understanding of the examiner’s perspective?
Institution:
University of Edinburgh

The materials published on this page were originally created by the Higher Education Academy.