Back in 2015, I was a policy analyst crunching numbers on widening participation (WP) and student success. I spent half my time poring over data and the other half speaking to sector colleagues about their work to improve outcomes for disadvantaged students. The scale and scope of this work was – and continues to be – impressive, but time and time again I was asked for evidence of impact. Did the campus visits increase applications? Did mentoring improve attainment? Did bursaries reduce drop-out rates? Unfortunately, the truth was that we didn’t have evidence which could tell us definitively either way. So, in 2016, I decided to make the jump from policy into research with the aim of helping to build the evidence base for WP and student success activities.
Today the call for evidence-informed practice is stronger than ever and the sector now has its own ‘what works’ centre – the Centre for Transforming Access and Student Outcomes, TASO for short. After four years of delivering education research projects and completing a PhD at UCL, I’m delighted to be taking up a new role as TASO’s Deputy Director of Research.
Gaps in the evidence base
TASO launched in Spring 2019 and has achieved a good deal in its first year of operation. Currently, we are focused on two research themes: Theme 1 is widening participation and Theme 2 is gaps in the student experience. Both themes are following a three-phase process: in Phase 1, we analyse the existing evidence base and identify “gaps” which need filling; in Phase 2, we’ll aim to generate new evidence to fill some of these gaps; and in Phase 3, we’ll collate and disseminate our findings before returning to the start of this cycle to continually build our understanding. So far, we have been working through the “gap analysis” process for both themes, and in summer 2019 we conducted a “call for evidence” through which we gathered over 100 examples of evaluation best practice.
These submissions were rolled into a broader review of the existing evidence base led by the Education Policy Institute (Theme 1) and the Bridge Group and Coventry University (Theme 2). Excitingly, we are now at the stage where we can share the results of this analysis for Theme 1 and have recently released our first report with the Education Policy Institute: “The impact of interventions for widening access to higher education”.
The report highlights that we still lack a robust evidence base on the efficacy of activities designed to widen participation in HE. Specifically, although many approaches produce a modest positive effect on individuals’ understanding and attitudes, we know less about the impact on actual behaviour, such as enrolment. The report also identifies sizeable evidence gaps for some of the most popular approaches, including summer schools and multi-intervention outreach schemes, which combine multiple activities into a programme of support. For summer schools, we still have some way to go in establishing that they have a “causal” impact – that is, we do not know whether participants have better outcomes as a result of attending, or whether the most motivated and well-supported students simply attend in the first place. For multi-intervention outreach schemes, although it seems these are among the most effective approaches, further work is required to understand which elements are most impactful.
Generating new evidence
For each research theme, we have established a Theme Working Group – comprising practitioners, evaluators, researchers and administrators – to offer advice to TASO. Based on our first report, our Theme 1 Working Group has made recommendations which have shaped our first commissioning round, and we are currently seeking HE providers to work with us on two projects. Specifically, we are seeking several partners to join a randomised evaluation of pre- and post-16 summer schools, and a smaller pool of providers to help us unpick the best way of evaluating multi-intervention outreach schemes and mentoring. Given the time- and resource-intensive nature of these activities, we hope this will be a good opportunity to provide important new evidence which practitioners can use to inform their work. Simultaneously, it’s vital we use this as an opportunity to support evaluators as we develop new methods of establishing which activities are most effective, in which contexts and for whom.
Evidence isn’t the end of the line
Evidence generation is a big part of TASO’s mission, but it can’t be our sole focus – the most complete evidence base in the world is pointless if no one uses it. Behind every Access and Participation Plan (APP) is a huge team of individuals who collectively shape and deliver WP and student success activities, and it is these individuals we need to reach if we are to meet our aim of improving lives through evidence-based practice. We need engagement from across the sector to help us understand existing practice and build tools and resources which are useful and practical. For example, over the coming months we will be releasing our evidence toolkit, a repository of information on the existing evidence base, accompanied by new guidance on evaluation. All these resources have been developed in close consultation with a number of advisory groups to ensure that the dissemination of evidence is something which is done with, not to, the sector.
My hope for TASO
TASO has a big job ahead of it. If it were easy to produce evidence on the most effective approaches to widening participation and student success, then we wouldn’t exist. In the years ahead I’m sure we’ll face thorny methodological challenges, data-sharing woes and a whole host of unanticipated issues – but do I think this will be worth it? Absolutely. If TASO can help the sector be more effective in tackling the pernicious gaps which exist at every stage of a student’s journey through education, then we must do so. And in a few years, when I get asked those same questions as before – Do campus visits increase applications? Does mentoring improve attainment? Do bursaries reduce drop-out rates? – perhaps we will have the answers.