Academic governance in England: a new era
The introduction of the Office for Students’ (OfS) Regulatory Framework (2018) heralded the dawn of a new era of regulation, monitoring and intervention for English higher education, bringing into effect key aspects of the Higher Education and Research Act (2017). In practice, the Regulatory Framework requires the lay governing bodies (or boards of directors) of registered higher education providers to be formally responsible to the OfS. As part of this responsibility, the OfS requires proof of the adequacy and effectiveness of each provider’s academic governance arrangements, in order to maintain public confidence in the quality of its educational provision and the standards of its academic awards. These requirements are outlined in a series of OfS regulatory conditions (the E conditions) underpinned by wider public interest governance principles. It is in this latter set of principles that the importance of academic governance (principle IV) is most clearly expressed:
IV. Academic governance: The governing body receives and tests assurance that academic governance is adequate and effective through explicit protocols with the senate/academic board (or equivalent) [emphasis added].
In practice, each provider’s academic governance arrangements comprise a series of established protocols, manifest in decision-making committee structures and quality assurance policies, processes and procedures. These enable a provider’s academic members collectively to endorse the propriety of academic judgements – from fitness to enter higher education, to qualification to graduate. Such endorsements, which cascade up (and down) decision-making boards and committees, rest on common and shared understandings of qualification and performance standards, a recognition of how best to measure students’ work against these, and a robust multi-level process of peer review and scrutiny which records how these standards have been applied and academic judgements have been made.
Degree apprenticeships: what’s different?
Degree apprenticeships (DAs) are a relatively new form of employer-led training programme, first introduced in 2015, that incorporate higher-level qualifications at Levels 6 and 7. In practice, DAs are distinct from other mainstream English higher education provision, for which the academic governance protocols required by the Regulatory Framework are designed, in a number of ways including:
- design leadership through employer-led trailblazer groups;
- funding and compliance monitoring processes through the Apprenticeships Levy and the Education and Skills Funding Agency (ESFA);
- employers (not students) as lead partners and clients; and
- dual-agency external quality assurance (QA) monitoring and inspection.
It is the final of these four differences – QA – that forms our focus here, although separating this out from the other three can prove difficult due to the complex and evolving operating environment in which DAs sit and the multiple external bodies to which DA providers are accountable.
Quality assurance: requirements and arrangements
To begin, it’s worth reminding ourselves of the two contrasting forms of external quality assurance/inspection associated with DAs. The ‘degree’ element of DA provision is subject to the same OfS conditions of registration as all other forms of higher education – including those relating to quality and standards (the B conditions). To perform its quality and standards oversight role, the OfS seeks advice from the Quality Assurance Agency for Higher Education (QAA) – the current designated body for higher education quality in England. In parallel, the quality of the ‘apprenticeship training’ element of each DA is subject to the expectations of Ofsted’s Education Inspection Framework under the oversight of the UK Government’s Department for Education. Of course, these two separate forms of external QA cannot always be separated in practice. The degree element of the DA often forms the delivery mechanism for some (if not all) of its training element and vice versa.
So why does this matter? The apprenticeship training element of DA courses comprises both ‘off-the-job’ and ‘on-the-job’ elements. ‘On-the-job’ learning, as might be expected, is delivered in or at workplaces and away from a provider’s premises. While at first glance this might seem similar to other work-based elements of academic courses (such as work placements and industry experiences), or other educational partnership provision (where teaching and learning takes place at other sites managed by validated partner institutions), the nuances of DA provision make this very different in practice. In such circumstances, key elements of provision may not be adequately and fully reflected in existing QA systems and processes. For example:
- The bipartite partnership arrangements that underpin mainstream degrees (between a provider and its student fee-payer) are replaced in DAs by a tripartite agreement where the employer is the lead stakeholder and often sees itself as the provider’s client. Academic teams who have established processes for engaging with their student stakeholders may have less experience or understanding of the formal undertakings needed to establish and maintain quality tripartite arrangements. Further, established and integrated student engagement and voice aspects of QA processes are unlikely to be appropriate for DA provision.
- The assurance and monitoring meetings and visits that are used to ensure the quality of provision with mainstream educational partners are not easy to replicate with large numbers of employer partners, which for some DAs can be in double or triple figures. Even if numbers are manageable, engaging in the QA of employer learning settings – particularly when employers see themselves as the ‘client’ – may be difficult to arrange and navigate.
- Senate/academic board and committee members may not be adequately informed about or appropriately skilled to scrutinise this different form of provision without careful briefing and support.
- Due to the distinct monitoring expectations and data requirements associated with ESFA funding compliance rules, providers’ specialist student information and records management systems, built to serve the external reporting requirements of HESA and the Student Loans Company (and which often form the backbone of academic governance recording and reporting arrangements) may not be fit for purpose.
With these differences and associated challenges in mind, many providers are now actively investing in new and adapted quality assurance processes, recording and monitoring systems, staff development programmes, employer-facing apprenticeship teams and specialist legal expertise. How these connect into existing academic governance arrangements, and what changes are needed to ensure that these are both adequate and effective, becomes increasingly important as the number of DAs grows.
Things to consider
With this very different quality assurance landscape in mind, the following questions are offered as a starting point for reflection on and review of the policies, systems, processes, and procedures that underpin the adequacy and effectiveness of the academic governance of DAs.
Do academic and professional services staff, student and senior leaders and governors have an appropriate understanding of the differences between DAs and other forms of mainstream educational provision and how this relates to academic governance?
- Can they adequately judge DA activities in relation to the OfS conditions of registration (overseen by the QAA/OfS), funding compliance rules (overseen by the ESFA) and the expectations of the Education Inspection Framework (overseen by Ofsted)?
- How do you provide development and training in this new form of provision for those involved in academic governance at all levels of your institution?
Are existing quality assurance processes, including discussion, recording and reporting procedures, fit for purpose?
- Can you ensure that DA learning and teaching opportunities and academic decisions – including those associated with the ‘on-the-job’ element of the programme – are appropriately and thoroughly tested wherever academic judgement is applied, and are subject to academic governance?
- How are academic governance arrangements – committees and other deliberative structures – adapting to respond to these differences?
- Can action points and static papers and reports fully identify the extent and effectiveness of these layers of check and challenge?
- How are employer-stakeholders engaged in these processes and how have you adapted your usual stakeholder voice and engagement mechanisms to reflect the tripartite relationship at the heart of DAs?
Are your systems ready to provide accurate and reliable intelligence on the quality and standards of all aspects of your DA provision, and the evidence for effective and appropriate action to maintain standards and enhance its quality?
- How is DA data fed into these systems and how is it drawn on for academic governance purposes?
- Are your recording arrangements configured to identify and record how academic judgements linked to DAs are made, checked and challenged?
As well as offering us some key challenges to address, DAs have the potential to deliver some wider benefits. They offer us a new lens through which to view our academic governance activities and provide opportunities to question, reflect on and renew established practices and assumptions. With comprehensive work-integrated learning becoming a key strategic priority for many providers, alongside a growing interest in ‘practical’ and ‘stackable’ micro-credentials and short awards aligned to the new Lifelong Loan Entitlement, learning about academic governance through the lens of degree apprenticeships can only stand us in good stead.
Professor Liz Cleaver is the Principal Consultant at Elizabeth Cleaver Consulting Limited. She has expertise and detailed experience in leadership, management, governance, quality assurance and academic standards in higher education. She is an Advance HE Principal Fellow and an Advance HE Associate.