TASO has been working with six higher education (HE) providers to pilot the use of impact evaluation methodologies that can be implemented with small cohorts of students – often referred to as ‘small n’ evaluations.
We worked with the following providers on the project: City College Norwich (CCN), University Centre Leeds (part of Luminate Education Group), University of Suffolk, Plymouth Marjon University, University of Leeds (Lifelong Learning Centre) and Leeds Arts University.
In June we brought the providers together to discuss their experience of the project – what did they learn from conducting ‘small n’ evaluations, and what were their key takeaways? We distil a few of their reflections below.
Drafting (and re-drafting) a Theory of Change
Considering the change mechanisms as part of developing the Enhanced Theory of Change (ToC) was challenging for programmes that had been built up over time, and determining what was a mechanism and what was an outcome (and which might be both!) was difficult at first. However, these issues were worked through, and the partners reflected that the opportunity to apply the ToC to a high-profile, ‘live’ intervention was a positive experience.
A few lessons learnt
- Using data analysis to disentangle change mechanisms from outcomes was helpful when developing the ToC.
- Getting feedback from colleagues who already knew the programme on how well they understood the ToC was also helpful.
- Looking at examples of Enhanced ToCs also helped.
- This stage took up a good chunk of time – in future, a clearer plan of when to expect a higher workload would help manage the project timeline.
Implementing the evaluation
Implementing the evaluation was more labour-intensive than anticipated, and recruiting enough participants – and retaining them – was a persistent issue. For some providers, this was the first time they had implemented this type of evaluation, so they had lots of ‘firsts’ to work through.
A few lessons learnt
- Attending seminars to support the use of these new methods helped guide providers through this stage.
- Regular check-ins and support from the TASO project team were really valuable.
- Drop-outs should be expected; however, further rounds of evaluation will add numbers to strengthen the analysis. Providers can also expect the ‘diversity’ of their data to improve in subsequent iterations.
And what will our partners be taking away from the project?
- The project was a brilliant opportunity for collaboration between delivery and academic members of the team, and the discussion and debate it generated helped their teams be more critical, reflective and forward-thinking.
- The project has given our partners the opportunity to present the benefits of their work to a broader audience and, most importantly, allowed them to see clearly the impact their interventions are having on students at their university or college.
The project has formed part of TASO’s work to help the HE sector develop more robust impact evaluation. HE providers – particularly small or specialist providers – may struggle to generate causal evidence with smaller cohorts, as existing methods can be difficult to apply in these settings.
We hope to build a community of practice through this work to better understand the opportunities and challenges faced when implementing ‘small n’ evaluations. We want to thank all our partners for being a part of this.
The evaluations from this project are due to be published this autumn, so stay tuned…