Written by Matthew Horton, Evaluation Manager, Strategic Planning Office, University of Wolverhampton

The Office for Students (OfS), TASO and the wider higher education (HE) sector are on a major drive to improve standards of evidence and establish which interventions are most effective in improving student outcomes. There are various obstacles to achieving these aims: professional service staff delivering activities are often left isolated, without the capacity or expertise to conduct robust empirical or causal evaluations, and improving standards of evidence requires a whole institutional approach to embed a robust culture of evaluation.

To establish ‘what works’, it is paramount that institutions harness their expertise by encouraging academics and postgraduate researchers (PGRs) to collaborate with professional service staff and students. Failure to harness institutional expertise will slow the sector’s progress in closing gaps in student outcomes.

At the University of Wolverhampton, we have set out to resolve this issue by employing various mechanisms to encourage academics’ involvement in evaluating interventions across the student lifecycle.

Where are we now and where are we going?

The University of Wolverhampton has a highly diverse student population, with a high proportion of mature students and students who are the first in their family to go to university. Hence, our Access and Participation Plan (APP) strategic priorities focus on closing gaps in continuation, success and progression (by ethnicity and Index of Multiple Deprivation).

Our initial evaluation self-assessment (2019) highlighted that there were very few examples of robust evaluation and only some evidence of theories of change being embedded into practice. The picture was much better for access than for other stages of the student lifecycle. There had been limited emphasis on evaluating interventions and activities in terms of ‘what works’, in what contexts and for whom.

This evidence is critical if we are to meet our ambitions to close gaps and support equity of outcome across the student lifecycle. We are under no illusion that improving evaluation practices will require a major change in approach across the university.

Embedding an evaluation culture

In late 2020 I was employed by the university as an Evaluation Manager to coordinate APP evaluation across the institution. To support the embedding of robust evaluation practices, we have conducted an evaluation skills audit and begun to deliver workshops and develop resources/guidance materials to improve staff understanding of the APP, the OfS’ standards of evidence and evaluation approaches.

We have revised ethical approval processes to ensure that, whilst remaining robust, they are straightforward for staff to navigate. We have recently implemented an APP scoping exercise and have initiated ways of aligning the expertise of academics and PGRs to support evaluation.

The scoping exercise

There are many interventions and process changes (e.g. to assessment and marking) being implemented across various stages of the student lifecycle; however, until recently there was no official log of all the interventions and activities being delivered. We have addressed this by launching a university-wide scoping exercise to improve our understanding of:

  • What is being delivered
  • Where there are gaps in provision in relation to our key APP priorities
  • What is currently being evaluated
  • Which high-cost, resource-intensive interventions should be prioritised for evaluation

The findings of this scoping exercise will help to align our evaluation priorities to the expertise and interests of academics and PGRs.

Building bridges

I have worked within the widening participation (WP) sector for over 14 years. During this time, I have noticed that there is often little collaboration between professional service staff and academics when evaluating WP activities. It can sometimes be hard to engage academics in this process. In part, this is due to a lack of capacity and the small evaluation budgets held by teams to tender this work, while professional service staff’s time and effort are spent on delivering student support services.

Consequently, when evaluation is conducted, it often relies on measuring outcomes only for students who have engaged, with no comparison or control groups. Many staff are not from a social science background and do not have the expertise to design and conduct complex causal studies. We can improve our standards of evidence by increasing capacity and expertise, encouraging academics and PGRs to engage in this process.

We have begun to identify ways to encourage partnerships and build bridges between professional service staff, academics and PGRs. This has been supported by high-level strategic buy-in from management across our academic faculties. To improve academics’ engagement with APP evaluations, we have initiated the following changes:

  • Associate Professors’ roles and responsibilities have been amended to incorporate a set number of hours each year to support APP evaluations.
  • We are seeking to align academics’ workload allowance for self-directed research with the evaluation of projects related to APP priorities.
  • We are creating a community of academics interested in APP evaluations, led by learning and teaching professors, emphasising the potential of these evaluations as evidence in promotion cases.
  • Sixteen school inclusivity leads provide additional resources (100 hours per year) to promote inclusivity among staff and within curriculum design, and to support the monitoring and evaluation of APP interventions.
  • We have set aside £1 million to support institutional research projects that aim to improve outcomes across the student lifecycle.

In addition to closing gaps in student outcomes and improving social mobility, it is important to communicate how involving academics in the evaluation process can benefit their research outputs, and how it can enrich PGRs’ experience within a growing field.

Back to the future…

At the University of Wolverhampton, we are in the early stages of a journey to change the culture of how we measure effectiveness and how we implement interventions based on evidence. This work will play a pivotal role in improving the equity of outcomes and ensuring our students realise their full potential. A whole institutional approach to evaluation, involving collaboration between professional service staff, students, academics and PGRs, will ensure that we meet these ambitions.