Multi-intervention outreach

Multi-intervention outreach combines two or more activities into an ongoing programme of support for students at different stages of their education.
  • Cost

    High cost

  • Impact on aspirations / attitudes

    Small positive impact

  • Impact on behaviour / outcomes

    Mixed impact

  • Strength of evidence

    Emerging evidence

Pre-entry to HE • Attainment raising (pre-entry) • Belonging in HE (pre-entry) • Progression to ‘high tariff’ institution • Progression to HE • Progression to own institution • Skills development • Social / cultural capital

About the intervention

What is it? Multi-intervention outreach combines two or more activities into an ongoing programme of support for students at different stages of their education.

Evidence? Most of the evidence for multi-intervention outreach focuses on whether students perceive that an activity has been beneficial and how it has changed their aspirations/attitudes towards higher education (HE). There is little conclusive evidence of the impact these programmes have on HE participation, particularly in the UK.

Should HE providers use multi-intervention outreach to widen participation? Evidence suggests that these programmes are likely to have a bigger impact than other approaches used in isolation. However, since they tend to be large-scale, high-cost interventions, providers should seek to embed evaluation to understand whether they impact actual HE participation – see the TASO evaluation guidance for more advice on how to do this. Providers should also seek to build understanding of which elements are most effective – TASO is running a research project to help providers explore the features of successful multi-intervention outreach.

What is this intervention?

Multi-intervention outreach refers to programmes of support for students which include multiple components, such as: mentoring, counselling, coaching and role models; information, advice and guidance (IAG); summer schools, financial support, campus visits and subject tasters; and workshops. These programmes are usually large-scale, high-cost interventions spanning a year or more, and they therefore support students at different life-cycle stages.

What is the target group?

The advice on this page is based on outreach designed to address issues faced by students from disadvantaged and under-represented groups. The evidence we’ve used is drawn from studies which focus on groups including:

How effective is it?

Most of the existing evidence is focused on whether these programmes impact student aspirations/attitudes. These studies find that participating in multi-intervention outreach seems to be associated with positive outcomes for students. However, the research methods used in the studies don’t produce causal evidence, meaning that they do not prove that multi-intervention outreach directly impacts students’ outcomes.

There are a small number of studies that look at whether multi-intervention outreach is associated with HE participation. These involve a simple comparison of students who take part versus those who do not, and the results show a positive association. However, where methods seek to take into account selection bias (i.e. the fact that certain types of students might be more or less likely to take part in these programmes), the evidence on the efficacy of these programmes is more mixed (see the list of causal studies below).
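
To make the selection-bias point concrete, the sketch below uses simulated data in Python; none of the figures are drawn from the studies cited on this page, and the variables (prior attainment, participation, HE entry) are purely illustrative. It shows how a naive comparison of participants and non-participants can overstate a programme’s effect when higher-attaining students are more likely to take part, and how adjusting for prior attainment moves the estimate back towards the assumed true effect.

```python
# Illustrative only: simulated data, not drawn from any study cited on this page.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2024)
n = 5000

# Hypothetical standardised prior-attainment score.
prior_attainment = rng.normal(0.0, 1.0, n)

# Higher-attaining students are more likely to join the programme (selection).
p_join = 1.0 / (1.0 + np.exp(-(-0.5 + 1.0 * prior_attainment)))
participated = rng.binomial(1, p_join)

# HE entry depends on prior attainment plus a modest assumed programme effect.
TRUE_EFFECT = 0.3  # log-odds scale
p_enter = 1.0 / (1.0 + np.exp(-(-0.2 + 1.2 * prior_attainment + TRUE_EFFECT * participated)))
entered_he = rng.binomial(1, p_enter)

# Naive comparison: raw gap in HE entry rates between participants and others.
naive_gap = entered_he[participated == 1].mean() - entered_he[participated == 0].mean()
print(f"Naive gap in entry rates: {naive_gap:.3f}")

# Adjusted comparison: logistic regression controlling for prior attainment.
X = sm.add_constant(np.column_stack([participated, prior_attainment]))
fit = sm.Logit(entered_he, X).fit(disp=False)
print(f"Adjusted participation effect (log-odds): {fit.params[1]:.3f} "
      f"vs assumed true effect {TRUE_EFFECT}")
```

In this simulation the raw gap in entry rates is much larger than the assumed programme effect, because it partly reflects who chooses to take part; this mirrors why the evidence looks more mixed once selection into the programmes is taken into account.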

A quasi-experimental evaluation of the Uni Connect multi-intervention outreach programme also showed that engagement with the intervention was associated with a greater likelihood of achieving a place at a higher education provider (Burgess et al., 2021). The sample comprised 1,386 learners aged 18-19 completing Level 3 qualifications, selected from the Uni Connect West Midlands database. However, the type, extent and combination of engagement were mediating factors. For instance, the marginal benefit of each additional engagement appeared to decrease after five or six engagements, and summer schools and combinations of information, campus visits and master classes were most effective for progression to HE. Despite its quasi-experimental design, the study does not support causal claims, because learners’ degree of engagement with Uni Connect was not random but was determined by a combination of the learners’ and the schools’ choices.

What features seem to be important?

The existing evidence focuses on the overall efficacy of these programmes, treating them as ‘black box’ interventions. Therefore, it is not possible to identify which elements of the programmes may be most effective.

Evaluations of standalone initiatives that may be incorporated in a multi-intervention outreach programme (such as summer schools, financial support, IAG and mentoring/counselling) can provide some insight, but the context of those interventions is different when they are part of a larger programme, so we must be cautious when using this evidence.

TASO is running a research project to help providers explore the features of successful multi-intervention outreach. We are working with three higher education providers (HEPs) for this project: Aston University, King’s College London and the University of Birmingham. The project involves a combination of local evaluation work at each provider and collaborative work to map out commonalities and differences between the programmes and develop common evaluation and measurement frameworks. The local evaluation work will involve testing several causal impact evaluation methods, including randomised controlled trials (RCTs) and quasi-experimental designs where possible.
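
As a rough illustration of the kind of analysis an RCT of this sort could support, the sketch below compares HE entry rates between a randomised ‘offer’ group and a control group using a two-proportion z-test. All sample sizes and entry rates are hypothetical assumptions, not figures from the TASO project or its partner providers.

```python
# Illustrative only: hypothetical two-arm trial, not data from the TASO project.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(7)

# Assumed design: eligible students randomised to receive the outreach offer or not.
n_offer, n_control = 1000, 1000

# Assumed underlying HE entry rates (purely hypothetical).
entered_offer = rng.binomial(n_offer, 0.42)
entered_control = rng.binomial(n_control, 0.36)

# Estimated effect on the entry rate and a two-proportion z-test.
diff = entered_offer / n_offer - entered_control / n_control
z_stat, p_value = proportions_ztest(
    count=[entered_offer, entered_control],
    nobs=[n_offer, n_control],
)
print(f"Estimated difference in entry rates: {diff:.3f} (z = {z_stat:.2f}, p = {p_value:.3f})")
```

Because allocation to the offer is random, a difference of this kind can be interpreted causally in a way that the associational studies described above cannot; quasi-experimental designs aim to approximate this by constructing a comparison group statistically.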

What don’t we know?

While the existing research tends to focus on identifying an association between participation and student outcomes, the methods used don’t allow us to conclusively state that the interventions have caused a change in students’ attitudes/aspirations. We are also lacking evidence of the impact of these activities on actual HE participation, particularly in a UK context.

Most of the existing evidence seeks to examine the effect of a bundle of components; therefore, our understanding of which elements of multi-intervention outreach are most effective is still emerging.

Given that these are large-scale, high-cost and resource-intensive interventions, we should expect these programmes to have a bigger impact than less intensive outreach approaches. We are aware of only one study that compares the quantitative impact of multi-intervention outreach with other approaches; it finds that the impact of combined outreach and financial support is large compared with either approach in isolation (Herbaut & Geven, 2019). More evidence on the relative scale of the impact of these programmes versus other approaches would help HE providers understand how best to structure their overall outreach offering.

Where does the evidence come from?

TASO’s advice on the efficacy of multi-intervention outreach in widening participation is based on evidence from four causal studies, one of which took place in the UK.

This advice is also supported by 18 empirical studies which use data to show that participation in these programmes seems to be associated with positive student outcomes. Of these studies, 17 took place in the UK, including three evaluation reports shared confidentially with TASO by HE providers. The advice is further supported by three reviews.

We have focused on evidence produced in the last 10 years and, in the case of UK-based evidence, since the student finance reforms were introduced in 2012. Older evidence has been included if it is exceptionally relevant.

Some key references are given below.

Key references

Causal studies on the impact of multi-intervention outreach

Bergin, D., Cooks, H., & Bergin, C. (2007). Effects of a college access program for youth underrepresented in higher education: A randomized experiment. Research in Higher Education, 48(6), 727-750. https://doi.org/10.1007/s11162-006-9049-9

Bowman, N., Kim, S., Ingleby, L., Ford, D., & Sibaouih, C. (2018). Improving College Access at Low-Income High Schools? The Impact of GEAR UP Iowa on Postsecondary Enrollment and Persistence. Educational Evaluation and Policy Analysis, 40(3), 399-419. https://doi.org/10.3102/0162373718778133

Emmerson, C., Frayne, C., McNally, S., & Silva, O. (2006). Aimhigher: Excellence Challenge: a policy evaluation using the Labour Force Survey. Department for Education and Skills. https://www.ifs.org.uk/publications/3801

Page, L., Kehoe, S., Castleman, B., & Sahadewo, G. (2017). More than Dollars for Scholars. Journal of Human Resources, 54(3), 683-725. https://doi.org/10.3368/jhr.54.3.0516.7935r1

Other selected references

Bainham, K. (2019). ‘The impacts and benefits of employing a progressive and sustained approach to outreach programmes for universities: a case study – the progress to success framework’ in Broadhead, S., Butcher, J., Hill, M., Mckendry, S., Raven, N., Renton, R., Sanderson, B., Ward, T. and Williams, S. W. (eds.), Transformative Higher Education – Access, Inclusion & Lifelong Learning. London: FACE: Forum for Access and Continuing Education, pp. 193-213. http://hdl.handle.net/10545/624301

Burgess, A. P., Horton, M. S., & Moores, E. (2021). Optimising the impact of a multi-intervention outreach programme on progression to higher education: recommendations for future practice and research. Heliyon, 7(7), e07518. https://doi.org/10.1016/j.heliyon.2021.e07518

Chilosi, D., Noble, M., Broadhead, P., & Wilkinson, M. (2010). Measuring the effect of Aimhigher on schooling attainment and higher education applications and entries. Journal of Further and Higher Education, 34(1), 1-10. https://doi.org/10.1080/03098770903477052

Emmerson, C., Frayne, C., McNally, S., & Silva, O. (2005). Evaluation of Aimhigher: Excellence Challenge. The Early Impact of Aimhigher: Excellence Challenge on Pre-16 Outcomes: An Economic Evaluation. Department for Education and Skills. Linked here.

Harding, S., and Bowes, L. (2022). Fourth independent review of impact evaluation evidence submitted by Uni Connect partnerships: A summary of the local impact evidence to date for the Office for Students, January 2022. Office for Students (officeforstudents.org.uk).

Hatt, S., Baxter, A., & Tate, J. (2007). Measuring Progress: an Evaluative Study of Aimhigher South West 2003–2006. Higher Education Quarterly, 61(3), 284-305. https://doi.org/10.1111/j.1468-2273.2007.00356.x

Kettlewell, K. and Aston, H. (2014). Realising Opportunities Evaluation: Cohort 2 Final Report – July 2012. Slough: NFER. https://www.nfer.ac.uk/publications/ROEE02/ROEE02.pdf

Kettlewell, K. and Aston, H. (2014). Realising Opportunities Evaluation: Cohort 4 Final Report – July 2012. Slough: NFER.

Lamont, E., Kettlewell, K. and Aston, H. (2014). Realising Opportunities Evaluation: Cohort 1 Final Report – July 2011. Slough: NFER. https://www.nfer.ac.uk/media/2139/roee01.pdf

Mazzoli Smith, L. and Laing, K. (2015) Supporting the Progression of Looked After Young People to University: Evaluation of the Choices Together Programme. Newcastle: Research Centre for Learning and Teaching, Newcastle University. https://dro.dur.ac.uk/27474/1/27474.pdf

Morris, M., & Rutt, S. (2005). Evaluation of Aimhigher: Excellence Challenge. Department for Education and Skills. https://core.ac.uk/download/pdf/9063656.pdf

Morris, M., Rutt, S., & Mehta, P. (2009). The longer term impact of Aimhigher: Tracking individuals. National Foundation for Educational Research.

Pluhta, E., & Penny, G. (2013). The Effect of a Community College Promise Scholarship on Access and Success. Community College Journal of Research and Practice, 37(10), 723-734. https://doi.org/10.1080/10668926.2011.592412

Sandhu, J., Bowes, L., Hansel, M. and Tazzyman, S. (2020). An independent review of evaluation evidence submitted by Uni Connect partnerships: A report for the Office for Students on the findings from the second call for local evaluation evidence, October 2020. CFE – Uni Connect evaluation evidence (officeforstudents.org.uk).

Simms, K. (2015). The ‘Heads Up’ Scheme Evaluation Outcomes. University of Sheffield, Widening Participation Research & Evaluation Unit. https://www.sheffield.ac.uk/polopoly_fs/1.783971!/file/5_Years_of_WPREU.pdf

Thompson, J., Askew, J., Crockford, J., and Donnelly, M. (2017). The Impact of a Widening Participation (WP) Scheme on the Learning Experience of Medical Students: A Pilot Study. University of Sheffield, Widening Participation Research and Evaluation Unit. Linked here.

Williams, M., & Mellors-Bourne, R. (2019). Improving access for the most able but least likely: Evaluation of the Realising Opportunities programme. Institute for Employment Studies. Linked here.

Literature reviews

Herbaut, E., & Geven, K. M. (2019). What Works to Reduce Inequalities in Higher Education? A Systematic Review of the (Quasi-) Experimental Literature on Outreach and Financial Aid (No. 8802). The World Bank. Linked here. 

Moore, J., Sanders, J., & Higham, L. (2013). Literature review of research into widening participation to higher education. AimHigher Research & Consultancy Network. Linked here.

Younger, K., Gascoine, L., Menzies, V., & Torgerson, C. (2018). A systematic review of evidence on the effectiveness of interventions and strategies for widening participation in higher education. Journal of Further and Higher Education, 43(6), 742-773. https://doi.org/10.1080/0309877x.2017.1404558