
Triangulating data to better understand the student experience: The use of UKES at LJMU

Motivations for participating in HEA Surveys

At LJMU we are striving to maximise the use of student feedback and, where possible, to triangulate various student data sets to gain a more comprehensive picture of the student experience. That is why, when the opportunity to take part in the national UKES pilot arose in 2014, we seized it. Outcomes of the survey provided interesting and valuable insights into patterns of student engagement at our institution (something that the usual satisfaction surveys were not able to pick up), so since 2015 we have been surveying all non-final-year students and working closely with programme teams to raise the profile of the survey.

What have the findings of the survey highlighted?

The institutional UKES data are not static: survey findings reveal something new every year, reflecting the dynamics of the student cohort and the impact of interventions. National benchmarking data provided by the HEA are a valuable asset that allows us to identify persistently strong features of the student experience at LJMU, as well as areas where student engagement could be improved. For example, we were pleased to find that LJMU students have more frequent interactions with staff to discuss feedback or academic performance than students at other institutions, and that we consistently outperform the sector in student representation and careers advice. However, in-class activity and contribution have been lower than the sector average, especially in some subjects with large-cohort teaching. Lower levels of engagement with extracurricular activities were another area requiring institutional attention.

Using national benchmarking data at subject level, Faculties and individual programmes were able to explore student engagement in their specific disciplinary areas and identify their own strengths and weaknesses. The survey data provided a stimulus and a renewed focus for dialogue with staff and students about students' engagement with their learning. The key findings from the survey are discussed at the Education Committees (at institutional and Faculty level) and appear as action points in Faculties' Teaching and Learning plans. For example, a number of technology initiatives aimed at enhancing classroom interactivity for large student cohorts have been developed in response to UKES outcomes. Sharing UKES data with Liverpool Students' Union and discussing the findings related to extracurricular activities help to join up efforts to drive change in this area.

The engagement data we receive from students also allow us to carry out additional data mining and to gain institutional intelligence on how different aspects of student engagement and satisfaction are associated, and how they relate to other indicators of student success. This research was initially done as part of the Strategic Excellence Initiative project funded by the HEA (1), and we continue to explore and monitor the data.

For example, the 2016 findings revealed that perception of skills development, development of critical thinking, opportunity to interact with staff, and perception of course challenge had the strongest associations with overall course satisfaction. Exploration of relationships between individual questions demonstrated that students who report that their course challenges them to do their best work are the most likely to be satisfied with their course.

Research we have done at programme level, looking at how engagement indicators correlate with student retention and success, was also very insightful. Overall programme engagement scores demonstrated a statistically significant positive correlation with programme retention and with the number of good honours degrees. The outcomes of this research help to reinforce the message about the importance of understanding student engagement at programme level.

What has the impact of participating in the survey been?

Having collated and analysed three years' worth of UKES data, we as an institution are much better informed about the dynamics of student engagement, about whether interventions or enhancement initiatives are making a difference, and about whether there are persistent issues that need further institutional effort.

The ability to explore links between engagement data and other indicators of student experience and success at all levels (institutional, programme and individual student) allows us to capitalise on these findings by gaining new insights and making informed decisions that were historically made on "gut feelings" or incomplete information (JISC, p.15). We are currently in the process of moving UKES data to the institutional business intelligence system Web-hub, thus giving the survey more visibility and emphasising its strategic importance.

Impact at the level of individual programmes does vary, with some programme teams embracing the survey and actively using its outputs to facilitate dialogue with students, and others cautiously browsing through the statistics and questioning the reliability of national benchmarks and the relevance of some questions to their specific subject areas. By encouraging staff to include UKES data in programme annual monitoring reports (where response rates are representative of the student population), we are taking another step in promoting the survey and its quality assurance and enhancement potential.


UKES data are a valuable asset with the potential to empower students and staff with a better understanding of the student experience, leading to informed interventions and enhancement initiatives. As an institution we understand that patterns of engagement differ across subject areas, and that buy-in from programme teams is the main contributing factor in the success of the survey. UKES outcomes are more subtle and nuanced than those of more common satisfaction surveys, so programme teams are offered help and guidance with data interpretation.

Increasing UKES response rates at both the institutional and national level is crucial to effectively understanding and embedding its outcomes. A more representative sample will be viewed by academic staff as a more authentic student voice. This will ensure the sustainability of the initiative and its wider institutional endorsement.

The inclusion of engagement questions in the NSS 2017 is important, as it gives programme teams a reference point and an opportunity to see how students report these aspects of engagement at different levels of study.

Our overall aim remains to capitalise on the student engagement data provided by the UKES to enable the institution to take a strategic engagement-based approach to curriculum enhancement. 

If you would like to find out more about the surveys, you can visit our webpages, email us at or call us on 01904 717500.
