In recent years there has been a significant push for higher education providers to evaluate the causal impact of their access and success work. The Office for Students’ standards of evidence refer to this as ‘Type 3 evidence’, which requires the use of specific evaluation designs: randomised controlled trials and quasi-experimental designs.
TASO has been supporting the sector in conducting these forms of evaluation. However, we know that higher education providers continue to face a range of barriers to understanding the causal impact of their programmes. This is why TASO commissioned researchers from The Brilliant Club and the University of Cambridge to conduct a sector consultation to better understand what these barriers are, and to identify ways to support providers to overcome them.
At this interim stage of the project, we are sharing our main takeaways from the consultation so far:
1. Generating Type 3 evidence requires access to technical expertise in research design and statistical analyses
Widening participation staff with evaluation expertise are already in high demand, and few people involved in the consultation reported direct experience of conducting randomised controlled trials or quasi-experimental designs. While there is an appetite for further training in these research methods, providers suggested that access to experts who can provide bespoke support at the design and implementation stages would help them develop more rigorous evaluations.
2. A ‘no-treatment’ control group is not the only option
For both practical and ideological reasons, there is some hesitancy about conducting evaluations where participants (particularly school pupils) are assigned to a ‘no-treatment’ control group. However, there are many variations on the randomised controlled trial design, as well as a whole range of quasi-experimental designs, which may be more palatable. Just a few examples of randomised controlled trial variations include:
- Offering participants in the control group the opportunity to take part in the programme after the trial has ended (known as a waitlist control group).
- A ‘stepped wedge’ trial, where groups of participants start the intervention at different times (waves) and the order in which the groups receive it is randomised. We can then compare outcomes between groups that have started the intervention and those that have not yet.
- Comparing outcomes across participants randomised to receive different versions or intensities of an intervention (for example, three hours of coaching versus six).
- When a programme is oversubscribed, randomisation can be used to determine who takes part (treatment group) and who does not (control group). This can be seen as fairer than a ‘first come, first served’ approach, and establishes the basis for a randomised controlled trial (a short illustrative sketch of this kind of allocation follows below).
It is important that providers are aware of the full range of possible evaluation designs, so they can identify the opportunities to generate causal evidence that are most appropriate for their context.
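To make the oversubscription example concrete, here is a minimal sketch of how random allocation could be carried out, assuming a simple list of applicants and a fixed number of places (both invented for illustration); it is not a prescribed procedure. In practice, providers would typically stratify the draw (for example, by school or year group), agree the procedure in advance, and keep a record of the allocation for analysis.

```python
import random

# Illustrative sketch only: allocating places on an oversubscribed programme at random.
# The applicant identifiers and the number of places are invented for this example.
applicants = ["pupil_01", "pupil_02", "pupil_03", "pupil_04",
              "pupil_05", "pupil_06", "pupil_07", "pupil_08"]
places_available = 4

random.seed(2024)  # fixing the seed makes the allocation reproducible and auditable
shuffled = random.sample(applicants, k=len(applicants))

treatment_group = shuffled[:places_available]   # offered a place on the programme
control_group = shuffled[places_available:]     # comparison (control) group

print("Treatment group:", sorted(treatment_group))
print("Control group:  ", sorted(control_group))
```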
3. We need to address the challenges of recruiting participants and securing buy-in from schools
A concern echoed across providers is that it is particularly challenging to recruit and engage schools in complex evaluations of higher education access interventions that may disrupt the normal delivery cycle or place additional demands on staff and students. As a sector, we need to think carefully about how we communicate with and incentivise schools to participate in programme evaluations and research projects.
4. The need for ‘warts and all’ knowledge sharing
Sharing open and detailed accounts of what has worked, what hasn’t, and why is vital for enabling the sector to generate more ‘Type 3’ causal evidence. Providers who took part in our discussions were at different points in their evaluation journeys and appreciated the opportunity to share their experiences of evaluation and learn from each other. In the next phase of this project, we will explore ways to facilitate this kind of knowledge sharing effectively.
What’s next?
At the time of writing, the project team are conducting roundtable events to explore with providers what approaches would help to overcome the barriers identified in the first phase of the consultation.
Many thanks to everyone who has contributed to the consultation so far. TASO will use the insights generated to identify ways to best support higher education providers to evaluate the causal impact of their access and success work.