Last academic year, the University of Chester School of Law bucked the sector trend with an increase of over 15% in student agreement with NSS Question 25: 'it is clear how students' feedback on the course has been acted upon'.
We put this down to our new approach to gathering, and closing the loop on, student feedback.
Why make a change?
It goes without saying that we should all be closing the loop on student feedback; the drivers for doing so are well documented. However, awareness of its importance has increased since the NSS Student Voice heading, introduced in 2017, was incorporated into the TEF metrics.
The first set of NSS responses under the Student Voice heading (NSS Questions 23 to 25) revealed an anomaly in our students' perceptions of our efforts to engage with them via student voice mechanisms.
Whilst the proportion of our students agreeing with the statements 'I have had the right opportunities to provide feedback on my course' and 'staff value students' views and opinions about the course' exceeded the sector scores for these areas, our NSS respondents scored us 15% below the sector average for 'it is clear how students' feedback has been acted upon'.
What did our students want?
We surveyed our Level 5 and 6 students to understand what had gone wrong and how we could put it right. The results were revealing:
- Students were more concerned to be kept informed about what had not been possible than what had: whilst 93% felt it was ‘important’ or ‘very important’ to close the loop when student feedback had been actioned, this rose to 98% where action had not been possible.
- The warnings in the literature on disengagement and expectancy theory played out on the ground: of those surveyed, 77% reported being unaware of action taken following student feedback the previous academic year and, unsurprisingly, 75% had chosen not to participate in the staff student liaison process that year. Of these, 40% ‘felt nothing would be done’.
- Students value the personal touch (but not from their student reps): when asked to rank methods of closing the loop, students overwhelmingly ranked ‘verbally in a lecture’ as the most effective method of keeping them informed of the results of student voice mechanisms. When asked to suggest alternative methods of closing the loop to those presented in the survey, not one student suggested feeding back via student reps themselves.
Shifting the focus – and focussing on action
We made changes to both our Staff Student Liaison (SSL) system and module evaluation process.
The aim of the SSL changes was to shift the focus of the meetings with reps away from being a forum for gathering concerns and towards being an opportunity for reps to work with us during the meetings to co-create solutions.
All feedback for the SSL meetings was gathered via an individual survey on the VLE completed by students using their phones at the start of a lecture.
This removed the ‘bearer of bad news’ role that can often befall a student rep during SSL meetings and enabled a more collaborative, solution-focussed atmosphere around the table.
The outcomes of the SSL meetings were recorded in a focussed rolling action plan. A lecturer gave over the first five minutes of a full-cohort lecture to summarise the actions arising and point students to the full action plan on the VLE.
For module evaluations, we asked students to respond to a VLE survey via their mobile devices mid-term, during a November lecture. We asked for 'one thing to keep' and 'one thing to change', allowing students to give short, sharp feedback early in their module experience. Module leaders responded to suggestions orally in a lecture within two teaching weeks and saved their action plan on the VLE.
What we learnt: the (consistent) whole is greater than the sum of its parts
Our revised methods are not, in themselves, ground-breaking. However, our systematic, consistent approach to both module evaluation and SSL processes sent our students a strong message: we will gather, listen to, and respond to, your feedback. Students clearly heard us: agreement with NSS Question 25 increased by 15% in one year. Our end-of-year module evaluation participation rates also increased, reflecting our students' faith in the process and suggesting that closing the loop motivates students to participate in further surveys.
There is, of course, always more we can do. This year we have produced a Student Guide to Feeding Back to Us, as well as a Student Voice timeline showing what students can expect of the Student Voice process each week of the academic year. Both documents (included in our Programme Handbook and advertised widely during induction) ensure that students are absolutely clear on what they can expect from us, and when.
In NSS 2018, Question 25 remained the second-lowest scoring NSS question across the sector. We need to continue to share our experiences at events such as the Surveys and Insights conference to ensure that we do not fall into the trap of telling our students that we listen to them, yet failing to hear (and act on) what they are telling us.
I am in the process of setting up a Student Voice network on Advance HE Connect. Watch this space for more information.