The Making Evidence Work Effectively Symposium has been designed to bring together a variety of staff that are looking to learn more about theorising, designing and utilising evidence for decision-making in higher education.
"broad set of activities that collect, transform, analyse, and use data to generate evidence to support institutional planning, policy formation, quality enhancement, and decision making" (Woodfield 2015, p. 89). Institutional research (accompanied by evaluation) in the UK higher education sector is supported by a network of institutional practitioners, but is much less well defined within institutions. Institutional researchers in this context are often self-defined, are unlikely to have this phrase in their job title, and have a variety of professional and academic backgrounds.
This area of institutional activity is the focus of my role at Sheffield Hallam, within a central Directorate titled ‘Student Engagement, Evaluation and Research’ (@SHU_StEER). Our function is to support the needs of students, as identified through student-focused research, evaluation and the analysis of student data. One of our core areas of work is to develop an evaluative mindset across the institution and build capacity for effective, evidence-informed decision making.
Structures and resources vary across the sector, but there will be someone in your institution doing something similar - maybe it's you.
There are benefits to developing an evaluative mindset: evidence is sought for what works and what doesn’t with the potential to enhance student experiences and outcomes. Conversely, as expectations for data collection, evaluative thinking and impact increase, staff may have concerns about capacity, confidence and workload.
The impact on students, as partners and participants, also needs to be carefully managed so that they are consulted, valued and empowered and not over-researched and exploited.
Co-creating a set of values-based principles for institutional research and evaluation is one way to ensure transparency. Here are some of mine:
- institutional research and evaluation should align with social justice approaches with a focus on equality of opportunity (structural rather than individual change)
- institutional research and evaluation design is appreciative where possible, not deficit focused
- institutional researchers and evaluators position themselves as brokers of dialogue
- institutions seek to build criticality through data confidence
- institutions support staff wellbeing during capacity building
- institutional research and evaluation situates experiential expertise as key
- data collection activities develop ownership through leadership
- institutional research and evaluation should be ethically sound with acknowledgements of potential bias
Developing an evaluation strategy
The development of a robust and credible evaluation strategy is another way to ensure that data and evidence are the foundation of transformational change. Drawing on the OfS evaluation framework (based on the work of the Centre for Social Mobility, 2019), I suggest that this strategy should:
- provide an evidence informed rationale for why the institution is undertaking all new and continuing activities and interventions
- build evaluation of clearly defined objectives into all activities/interventions at the point of design
- adequately resource and plan all evaluative activity, including obtaining ethical approval
- ensure that pathways for the dissemination of findings are clear so that contextual learning of 'what works' is an outcome.
This approach is not just applicable to the Access and Participation context in which it was designed. In fact, our APP states:
"We will embed an evaluation mindset in our approach; it will be a core competency of staff and will drive objective setting, activity/intervention design and our teaching and learning. We will be joined-up, aligning evaluation with our work on TEF, equality objectives and privacy impact assessments, and avoiding duplication and repetition."
Values-based principles and strategic guidelines for an evaluative mindset have resonance across all areas of an institution (which is why capacity building is essential). This approach aims to build evidence ownership, via the analysis and synthesis of existing data and the collection of new data for clearly rationalised and well-designed activities.
What can we learn from others?
My colleagues and I lead the 'Making a Difference with Data' group on Advance HE Connect, which is a community of practice dedicated to discussions about data and evidence use in HE. Please do join our conversations.
The Making Evidence Work Effectively Symposium will also provide a space to discuss different approaches and expectations, challenges and reflections within a teaching and learning context. Please do consider submitting a proposal for a presentation or a workshop, or come along to join the discussions.
Our symposia series focuses on thematic areas of interest within teaching and learning and is for anyone wishing to improve their knowledge of the latest pedagogies and upskill their practice in key areas.
Colleagues are invited to submit an abstract either for a 20-minute presentation or for a 45-minute workshop. Find out more and book your place.
NB: Research and Evaluation videos are taken from the digital glossary created to support the Guide to Using Evidence in Higher Education (Student Focus), Austen & Jones-Devitt 2019, QAA Scotland.