Pedagogy and Professional Training
Key information
- Impact on mental health: More evidence needed
- Impact on student outcomes: More evidence needed
- Strength of evidence: Weak evidence
What is it? Pedagogical interventions seek to improve student mental health by making changes to the curriculum and assessment structures, and through professional training.
Evidence? Currently, there is very limited evidence on the impact of pedagogical interventions within HE. The studies that do exist rarely provide robust evidence of effectiveness, and most concern interventions targeted at medical and nursing students outside the UK.
What is the intervention?
This intervention aims to improve student mental health through the academic aspects of the student life cycle, changing teaching practices, assessment or the curriculum. Professional training can be aimed at any staff working with students and might cover topics such as listening skills or signposting. While this intervention is usually non-targeted in its approach, it may also provide targeted support, such as training focused on specific student groups (those living with autism, for example). A pedagogical intervention may also include new systems that provide tailored support or reasonable adjustments for students living with specific mental health difficulties.
How effective is it?
When we look at the full range of studies uncovered by our review, they tend to suggest that pedagogical and professional training programmes may have a small positive or mixed impact on student mental health. However, when we restrict this to medium/high-quality studies, there is very little robust evidence on whether, and to what extent, these interventions are effective. In sum, our review found no strong evidence at this time to support the impact of pedagogy/professional training interventions on student mental health.
The best available evidence tends to focus on interventions for medical/nursing students and provides some support for the case that changes to the curriculum can improve mental health outcomes, including stress and anxiety (Delaney et al., 2016; Halloran, 2017). While this provides evidence of promise for the efficacy of specific interventions, there is insufficient evidence to make generalisable claims about this sort of approach, even for this specific group.
How secure is the evidence?
The current evidence base for pedagogical and professional training interventions is weak.
Though there is a sizeable evidence base of international studies, our evidence review found 18 causal studies and 13 empirical studies; however, the majority of the causal studies were of lower quality and are therefore not discussed on this page.
It is important to note that only one study in our evidence review comes from the UK, and the evidence therefore relies predominantly on curricula and assessment structures in an international context.
To build the evidence base, more robust UK-based studies are needed that measure both mental health and student outcomes.
How do I evaluate this intervention?
Because there is so little evidence on the impact of pedagogy and professional training interventions, it is particularly important for higher education providers (HEPs) to evaluate interventions in their own context.
Most of the existing studies cannot help us understand impact because they lack control/comparator groups, which fundamentally limits how usefully they can inform decision-making. Future evaluations should therefore include control/comparator groups, ideally using randomised controlled trials (RCTs), which provide a strong way of minimising bias and systematic differences between groups that might otherwise undermine a study's findings.
The broader literature on mental health interventions (covered in this toolkit) provides many examples of possible RCT designs, including waitlist designs; the key benefit of a waitlist design is that the control group still receives the intervention, just at a later date once outcomes have been measured in both groups. However, this approach is less appropriate if we expect the effect of an intervention to materialise over the longer term, and so it may be less suitable for some pedagogy and professional training interventions. It should be noted that, in many existing designs, the control/comparator group receives ‘business as usual’ or a less intensive intervention rather than no support at all, potentially improving the feasibility of more standard RCT designs; a minimal sketch of a waitlist allocation follows below.
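As a minimal illustration of how a waitlist allocation might be implemented in practice, the following Python sketch randomly splits a cohort into an immediate-intervention group and a waitlist control group. All names (the helper function, the placeholder student IDs) are hypothetical, and a real trial should follow a pre-registered randomisation procedure rather than this sketch.

import random

def assign_waitlist_groups(student_ids, seed=2024):
    # Fixed seed so the allocation is reproducible and auditable.
    rng = random.Random(seed)
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {
        "immediate": shuffled[:midpoint],  # receives the intervention now
        "waitlist": shuffled[midpoint:],   # receives it after outcomes are measured
    }

# Example usage with placeholder IDs for a cohort of 100 students.
groups = assign_waitlist_groups([f"S{i:03d}" for i in range(1, 101)])
print(len(groups["immediate"]), len(groups["waitlist"]))  # 50 50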
A common weakness of existing studies is insufficient sample size, which makes robust quantitative analysis difficult; this is a particular challenge when working with specific cohorts of students that may be limited in size. Effective evaluations may cover interventions running across multiple programmes or include inter-institutional collaboration to address this issue (Upsher et al., 2022).
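To make the sample-size point concrete, a rough power calculation for a two-arm trial can be run in Python with the statsmodels library. The inputs below (a standardised effect size of 0.3, conventional alpha and power) are illustrative assumptions, not values drawn from the studies discussed above.

from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.3,  # assumed small-to-moderate standardised effect (Cohen's d)
    alpha=0.05,       # conventional significance threshold
    power=0.8,        # conventional target power
)
print(f"Approx. students needed per arm: {n_per_group:.0f}")  # roughly 176

Even under these fairly generous assumptions, over 350 students are needed in total, which illustrates why single small cohorts rarely support robust quantitative analysis.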
Outcomes should be measured in both treatment and control groups using validated scales, before and after the intervention and at multiple time points (e.g. three-, six- and twelve-month follow-ups). These outcomes should cover not only psychological measures but also, where appropriate, student outcomes such as attainment, retention and progression.
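Where validated scales yield numeric scores, one common way to compare groups is an ANCOVA-style regression of post-intervention scores on group membership, adjusting for baseline scores. The sketch below assumes hypothetical column names ('pre_score', 'post_score', 'treated') and made-up illustrative data; it is one possible analysis, not a prescribed method.

import pandas as pd
import statsmodels.formula.api as smf

# Illustrative data: baseline and follow-up scale scores for eight students.
df = pd.DataFrame({
    "pre_score":  [14, 18, 11, 20, 16, 13, 19, 15],
    "post_score": [12, 15, 10, 19, 11, 9, 14, 12],
    "treated":    [0, 0, 0, 0, 1, 1, 1, 1],  # 1 = received the intervention
})

# Regress follow-up scores on treatment status, adjusting for baseline.
model = smf.ols("post_score ~ pre_score + treated", data=df).fit()
print(model.params["treated"])          # adjusted treatment effect estimate
print(model.conf_int().loc["treated"])  # its confidence interval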
Finally, pedagogical interventions may entail quite substantial changes to the way courses are delivered and require considerable time and resources to implement. Given the centrality of pedagogy to the HE experience, and the fact that interventions may be rolled out to whole cohorts of students (rather than experienced on an opt-in basis), it is particularly important that this sort of intervention is piloted with those students who may be affected, so that HE providers can ensure the intervention is acceptable and feasible before it is implemented at scale (for an example of a pilot, see Delaney et al., 2016).
See our evaluation guidance for more support.
Where can I find more information and guidance?
For guidance from the Mental Health Charter, please see the themes under which pedagogy and professional training interventions fall.
Where does the evidence come from?
The evidence in the Toolkit was gathered via an evidence review undertaken as part of the Student Mental Health Project. For full details of this review, please see our Methodology document.
It is important to note that our review, and therefore this Toolkit, only relates to student mental health. The review did not cover other populations (e.g. school children, other adult populations) or non-HE settings. The review was also subject to other inclusion/exclusion criteria, outlined in the Methodology document.
Please also note that this Toolkit page only includes Type 3 (causal) studies which have been rated as providing medium/high-quality evidence according to our evidence strength ratings. These studies are outlined in the page above and referenced below. A full list of studies collated via our evidence review, including Type 1/Type 2 studies, and those rated as providing weak/emerging evidence, can be found in our Evidence Review Spreadsheet. A breakdown of these studies by type and strength of evidence is available to download.
Main references
Delaney, C., Barrere, C., Robertson, S., Zahourek, R., Diaz, D. & Lachapelle, L. (2016) Pilot Testing of the NURSE Stress Management Intervention. Journal of Holistic Nursing. 34 (4), 369–389. doi:10.1177/0898010115622295.
Halloran, D.A. (2017) Examining the effect of virtual simulation on anxiety experienced by pediatric nursing students. Doctoral dissertation, Capella University. Available at: https://www.proquest.com/openview/376b0e0ca9359f05ab705adb5fe45f97/1?pq-origsite=gscholar&cbl=18750
Other references
Upsher, R., Nobili, A., Hughes, G. & Byrom, N. (2022) A systematic review of interventions embedded in curriculum to improve university student wellbeing. Educational Research Review. 37, 100464. doi:10.1016/j.edurev.2022.100464.