What are the unique challenges you face in evaluating programmes as a college in further education (FE)?

Lisa Collyer: College-based Higher Education (CBHE) providers tend to have a small number of staff working in the Widening Participation (WP) space, as a result of their smaller size. Consequently, the same individual(s) often write the Access and Participation Plans (APPs), carry out interventions, and then evaluate the activity. Larger higher education providers (HEPs) will often be in a better position to set up dedicated teams, rather than evaluation work being an add-on to someone’s existing role. We are fortunate in that we have recruited staff who have an active research background. However, the likelihood is that for many CBHE providers, the academic staff are more focussed on teaching, and their research experience may be more limited. This could be a challenge for CBHE providers and is something to be mindful of.

We’re more likely to have smaller cohorts of higher education (HE) students, which poses a challenge for evaluating interventions designed to support disadvantaged or underrepresented groups. For example, we will see very low numbers in particular target groups and intersections of characteristics (e.g., mature female learners with a self-declared learning difficulty and/or disability). This tends to mean that our evaluation to date has focused on Type 1: Narrative (such as pre- and post-intervention surveys with students) and Type 2: Empirical Enquiry (including attempts to build bigger groups by, for example, aggregating cohorts over several years).

What do we (the sector) need to know about supporting WP learners in College-based Higher Education that existing research doesn’t tell us?

Simon Rhodes: Whilst there is a risk in trying to characterise the College-based HE sector as a whole, we typically see higher proportions of WP learners than the wider HE sector. These learners are more likely to be mature and often juggle childcare and work around their studies, either through part-time work or because they are on an apprenticeship programme through their employer. They are also more likely to have been out of education for a significant period, meaning that they may need support in a wide variety of areas.

Alex Guy: We have a different learning environment to traditional universities. We have smaller classrooms and smaller class sizes, so we plan and deliver our learning in a more individualised way. We don’t have large lecture halls, nor do we lecture for extended periods of time. Our teaching is planned to include differentiation to meet individual student needs.

For College-based HE students, it is important to know how to foster a sense of belonging, given that students are more likely to commute to the college campus than to live on site in halls.

How could working with TASO – and the small n pilot – help you achieve your evaluation goals?

Lisa Collyer: We hope that the small n pilot project will help us to uncover alternative methodologies that can be applied to the CBHE sector and provide something close to Type 3: Causality levels of evaluation. Importantly, we hope that the pilot programmes will develop a range of methodologies that offer robust evaluation but can be readily applied by smaller providers.

In time, we envisage that the small n evaluations will give providers a “what works” matrix of interventions to deal with the common problems that the sector faces, so that they can apply proven approaches where they have similar student populations and gaps to close.

Being part of an evaluation project with TASO, we are also part of a community of practice. We are learning from others, whilst also developing and strengthening evaluation skills that we can use in the future.

What have you learnt so far from your involvement in the small-n pilot?

Alex Guy: We are using the Contribution Analysis methodology to evaluate the impact of one of our interventions (the development of a dedicated role that supports underrepresented groups, specifically students with learning difficulties and disabilities) on the continuation rates and attainment of those students.

Working with a new methodology, we have observed that this is not a linear process, nor quite as clear-cut as we thought it might be. We have engaged in a process of drafting and refining our Theory of Change. Through this pilot, we have learnt that it is not always about the end output (whilst this is important); it is crucially also about the experience of being involved in the process and reflecting on that. It is learning from the “doing” and enjoying the iterative process.

Lisa Collyer: The pilot itself has felt like a bit of a journey, if that is not too hackneyed a term. Our team has worked with colleagues at TASO and Manchester Metropolitan University to build a detailed Theory of Change model. This focuses as much on the change mechanisms at play as on the actual interventions and expected outcomes.

This is interesting work from a theoretical and research perspective, but it also clearly needs to provide workable approaches that small providers can adopt more widely.

How do you hope to benefit from the experience?

Simon Rhodes: We hope to gain a greater awareness of alternative methodologies, whether Contribution Analysis or the other methodologies being piloted at other institutions, that can provide a higher level of evaluation.

We also hope to embed more high-level evaluation into our APP work, which will be timely given the increasing emphasis on evaluation in the latest Office for Students (OfS) guidance.