Learning analytics (post-entry)
Key information
- Cost: Medium
- Impact on aspirations / attitudes: Mixed impact
- Impact on behaviour / outcomes: Small positive impact
- Strength of evidence: Emerging evidence
What is it? Learning analytics is an umbrella term for the measurement, collection, analysis, and reporting of data about learners, for the purpose of understanding and optimising their learning and the environments in which it occurs.
Learner analytics is an area of learning analytics which focuses solely on the learners and attempts to identify targeted interventions that help mitigate the risks individual learners may experience while studying.
Evidence? Since the early 2010s, interest in learning analytics has increased rapidly and prompted the publication of many papers documenting small-scale experiments in the areas of education, psychology, computing and data science. Yet, much of what has been published lacks empirical rigour or peer review. Most literature reviews do not reflect upon the research design or Theory of Change of the studies included.
The existing evidence base suggests that well-designed learning analytics interventions tend to improve students’ outcomes. Whilst we can assume that students’ aspirations/attitudes are also positively affected, more causal research is needed to confirm this assumption.
Should HEPs adopt learning analytics? The existing evidence suggests that learning analytics can be beneficial for students. However, currently, there is no comprehensive model supported by a strong evidence base for instructors to make effective learning analytics interventions. There are also a number of challenges and limitations to the implementation of learning analytics that should be addressed in each context.
What is this intervention?
Learning analytics relates to the measurement, collection, analysis, and reporting of data about learners, for the purpose of understanding and optimising their learning and the environments in which it occurs.
The main aim of learning analytics is to utilise educational data to improve student outcomes. It also aims to identify at-risk learners, allowing academic staff to offer them meaningful and targeted feedback.
In practice, most interventions aim to improve academic performance, retention and engagement (Wong and Li, 2020).
These interventions use data to identify the challenges learners may be experiencing and provide timely and personalised support to maximise success and progression in higher education.
Learning analytics can be conceived as a four-step process (Khalil and Ebner, 2015):
- Data generation: where data is generated on learning platforms;
- Tracking: where indicators are monitored by analysts;
- Analysis: where patterns are generated and information is retrieved from the data;
- Action: such as prediction, intervention or personalisation.
Data sources most commonly used in these interventions include Virtual Learning Environments (VLEs), which tend to record login details, attendance, grades and other coursework activity, and Student Information Systems (SIS), which contain personal details about students, such as their prior qualifications, socio-economic status, ethnic group, module selections and grades obtained. Other data sources, such as library systems, can also be used in learner analytics (Sclater et al., 2016).
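As a minimal sketch of how these steps and data sources fit together (not any provider's actual system), the example below combines a hypothetical VLE activity export with a hypothetical SIS export to produce a simple per-student engagement indicator; all column names, values and the flagging threshold are assumptions made for illustration.

```python
# Illustrative sketch of a learning analytics pipeline: VLE + SIS data -> indicator -> flag.
# All column names, values and thresholds are assumptions, not a real provider's data.
import pandas as pd

# Data generation / tracking: a VLE export with one row per activity event
vle = pd.DataFrame({
    "student_id": [1, 1, 2, 3, 3, 3],
    "week": [1, 2, 1, 1, 2, 3],
    "minutes_online": [40, 25, 5, 60, 55, 70],
})

# SIS export: one row per student with background information
sis = pd.DataFrame({
    "student_id": [1, 2, 3],
    "prior_qualification_points": [112, 96, 128],
    "module": ["CS101", "CS101", "CS101"],
})

# Analysis: aggregate activity into a per-student indicator
activity = (
    vle.groupby("student_id", as_index=False)["minutes_online"]
    .sum()
    .rename(columns={"minutes_online": "total_minutes"})
)

# Join the indicator with SIS context so an adviser sees both together
indicators = sis.merge(activity, on="student_id", how="left")
indicators["total_minutes"] = indicators["total_minutes"].fillna(0)

# Action: a crude low-engagement flag (the 60-minute threshold is an assumption)
indicators["low_engagement"] = indicators["total_minutes"] < 60
print(indicators)
```

In practice, the "action" step would feed into one of the intervention channels described below rather than a simple print-out.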
The indicators generated using this data can be used by different ‘agents of change’ (e.g. students, tutors or study advisors). Likewise, the interventions can take a variety of forms: some are delivered to users directly (e.g. dashboards, visualisations, automated alerts), while others require staff mediation (e.g. phone calls, tutoring sessions or additional assessment).
Each design decision reflects a certain vision of learning analytics and influences the impact of the intervention.
The emergency shift to online teaching and learning due to COVID-19-related lockdowns reinforced the need for learning analytics and opened new opportunities for the implementation of interventions. Indeed, the quantity of available data increased, as did the need to monitor students’ engagement.
What is the target group?
The studies and methods discussed on this page are focused on HE students, although learning analytics has started to be used on pre-entry students as school teaching moved online during lockdowns due to COVID-19.
Some interventions cover students regardless of their initial achievement levels, while others attempt to identify and target those who are underachieving and thus likely to encounter learning difficulties, or even fail, in their studies (Fuchs et al., 2003). Indeed, learning analytics appears to be more effective at identifying at-risk students than demographic data, and it can be used to indirectly target support at disadvantaged student groups (Foster and Siddle, 2020).
How effective is it?
The strength of evidence on the impact of learning analytics interventions is emerging. Currently, there is enough evidence to suggest that learning analytics interventions can be effective, but the impact is highly dependent on context and design choices and there is no comprehensive model supported by a strong evidence base to guide instructors in their work.
There is causal evidence to suggest that learning analytics interventions can improve students’ outcomes by allowing for the early identification of the challenges students are experiencing.
Cambruzzi et al. (2015) tested a learning analytics system providing dropout predictions and allowing personalised pedagogic action to be undertaken by teachers. The system predicted student dropout rates with an average accuracy of 87% and led to an 11% reduction in dropout rates.
Krumm et al. (2014) evaluated ‘Student Explorer’, a system that alerts students and teachers to student progress and performance using a traffic-light system. Teachers were then encouraged to congratulate green-light students for their progress and to engage red-light students in a consultation. The quasi-experiment recorded significant increases in test scores for all participating students after the implementation of Student Explorer.
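To illustrate the general idea, a traffic-light flag can be as simple as a few thresholds applied to a current score and an engagement measure; the metrics and cut-offs below are assumptions for illustration, not the published Student Explorer rules.

```python
# Illustrative traffic-light flag in the spirit of early-warning systems such as
# Student Explorer. The metrics and cut-offs are assumptions, not the published rules.
def traffic_light(current_score: float, engagement_ratio: float) -> str:
    """Return 'green', 'amber' or 'red' from a 0-100 score and a 0-1 engagement ratio."""
    if current_score >= 70 and engagement_ratio >= 0.6:
        return "green"   # on track: congratulate and encourage
    if current_score >= 50 or engagement_ratio >= 0.4:
        return "amber"   # some concern: monitor and nudge
    return "red"         # at risk: invite to a consultation

print(traffic_light(82, 0.8))  # green
print(traffic_light(45, 0.2))  # red
```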
Jayaprakash et al. (2014) evaluated the impact of the Open Academic Analytics Initiative (OAAI), which provided notifications to at-risk students and encouraged them to join a peer-to-peer support community. The randomised controlled trial found a large positive effect for this learning analytics intervention, as the intervention group achieved 6% higher grades than the control group. It also found that, amongst students at risk of failing, the treatment group had a higher rate of withdrawal from the course than the control group. The authors speculated that this may be because these students preferred to withdraw rather than attempt the course and fail.
Davis et al. (2016) also used a randomised controlled trial to evaluate ‘Learning Tracker’, a tool allowing students to visualise various engagement metrics. The tool also compared each student to previous successful cohorts, providing relative feedback on engagement. They found that academic performance increased by 8.5% and that two out of six engagement proxies improved.
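A rough sketch of how such relative feedback might be computed is shown below; the benchmark values and message wording are assumptions for illustration, not the Learning Tracker implementation.

```python
# Sketch of relative feedback: comparing a learner's weekly study time with the
# average of a previously successful cohort. Values and wording are illustrative.
from statistics import mean

successful_cohort_minutes = [55, 60, 48, 70, 65]  # hypothetical weekly averages
benchmark = mean(successful_cohort_minutes)

def relative_feedback(my_minutes: float) -> str:
    gap = my_minutes - benchmark
    if gap >= 0:
        return f"You studied {gap:.0f} more minutes than successful past learners this week."
    return f"Successful past learners studied about {-gap:.0f} more minutes than you this week."

print(relative_feedback(40))
```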
In another randomised controlled trial, Labarthe et al. (2016) evaluated the effect of integrating a peer recommender into Massive Open Online Courses, providing each student with an individual list of potential peer contacts. Results show that the recommender improved learners’ persistence on four factors: attendance, completion, success and participation. Students were much more likely to persist and engage in the MOOC if they received recommendations than if they did not.
Yet, not all interventions had the same results. In a randomised controlled trial, Dodge et al. (2015) evaluated a model utilising engagement data, academic data and trigger events to send email nudges to at-risk students on the VLE, suggesting ways to improve their performance. The study found no overall significant difference between the treatment and control groups in course achievement and concluded that stronger interventions are needed.
Similarly, in another randomised controlled trial, Hellings and Haelermans (2020) evaluated the impact of providing students in a Java programming course with a learning analytics dashboard and a weekly personalised email about their study progress and chances of success. They found that the intervention positively affected student performance on quizzes and online mastery exercises, but not on the final exam of the programming course. It also had no positive effect on student behaviour in the online environment. However, they observed differential effects by specialisation and student characteristics.
Overall, the existing body of evidence encourages the wider adoption of learning analytics systems, as most studies report some effect in improving students’ outcomes.
What features seem to be important?
Although the field of learning analytics is growing rapidly, in practice most HE providers have limited to no experience of using learning analytics (Viberg et al., 2018). To widen the use of learning analytics, it is necessary to identify the existing barriers and address them, as well as promote the features that seem important.
There is evidence that learning analytics can identify which students are experiencing challenges. However, once this has been achieved, it can be difficult for instructors to effectively address students’ difficulties. Indeed, they have to deal with competing priorities (i.e. research, teaching and tutoring), a complex balance exacerbated by limited training and a lack of emotional capacity to deal with the problems faced by students (McFarlane, 2016).
Furthermore, staff and students often lack the skills needed to use learning analytics tools effectively. To address this limitation, HE providers that use learner analytics should actively support staff to acquire the data literacy, technological ability and interpersonal skills required.
Choi et al. (2018) also suggest using different intervention methods depending on the significance of students’ problems. Hence, less costly but scalable methods can be used for general problems, whilst more costly and less scalable methods (e.g. face-to-face meetings) should be reserved for the students most in need.
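A minimal sketch of this tiered logic is shown below; the severity scale, bands and channel names are assumptions made for illustration, not Choi et al.'s model.

```python
# Tiered intervention selection in the spirit of Choi et al. (2018): cheap,
# scalable responses for general problems; costly, personal contact reserved for
# the most serious cases. Bands and channels are illustrative assumptions.
def choose_intervention(severity: int) -> str:
    """Map a 0-10 severity score to an intervention channel."""
    if severity <= 3:
        return "automated dashboard prompt"      # low cost, fully scalable
    if severity <= 6:
        return "personalised email from tutor"   # moderate cost
    return "face-to-face meeting"                # high cost, least scalable

for s in (2, 5, 9):
    print(s, "->", choose_intervention(s))
```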
The papers reviewed also highlight some important elements to account for when designing a learning analytics intervention. First, students and staff should be involved in the design and evaluation of these interventions to ensure that they are designed in accordance with the specific problems identified in the setting. Second, Foster and Francis (2020) recommend extending the scope of learning analytics interventions to all students, not only those experiencing difficulties, in order to improve outcomes overall.
What don’t we know
Much of the evidence published on learning analytics lacks empirical rigour or peer review. It also mainly focuses on testing the predictive power of learning analytics in a particular context rather than understanding the causal links and underlying mechanisms between the interventions and outcomes. In practice, there are very few empirically tested learning analytics programmes and most commercially available learning analytics tools, incorporated in most learning management systems, have not been evaluated.
As for evidence reviews, Foster and Francis (2020) have identified multiple factors contributing to the complexity of these impact evaluations, namely:
- Lack of consistent definition for student outcomes;
- Disparate measurement methods;
- Over-reliance on quantitative methods;
- Underdevelopment of Theories of Change.
More generally, there is a lack of empirical evidence to support the impact of learning analytics interventions and a need for more evaluations, even with null or negative results (Wong and Li, 2020). Indeed, as in other fields, there seems to be an underreporting of negative effects (Ferguson and Clow, 2017).
Furthermore, despite the dominance of quantitative methodologies in the existing evidence base, few studies, particularly in the UK, adopt experimental techniques such as randomised controlled trials (Foster and Francis, 2020). Without the presence of a randomised control group, it is difficult to understand whether the intervention is causing the change observed.
It should also be noted that evaluating the causal impact of learning analytics interventions on learners’ attitude, behaviour and cognition can be challenging (Rienties et al. 2017). For example, some virtual learning environments do not make it technically possible for teachers to randomly assign learners to different groups. Also, the ethics of learning analytics is an emerging sub-field and there are important ethical challenges around the potential biases in the algorithms used by HE providers.
It is important to highlight that learning analytics is a relatively new field of research that has recently made substantial progress by applying advanced computational techniques such as predictive modelling, machine learning, Bayesian modelling, social network analysis or cluster analysis (Rienties et al., 2017). We should thus expect more innovation in this field in the near future.
It is also worth noting that, initially, most large-scale learning analytics interventions evaluated were implemented in the US. Learning analytics has also been used in Australia, Canada and the United Kingdom (Sclater et al., 2016). More recently, Europe and Asia have produced an increasing number of publications in this domain (Foster and Francis, 2020).
Yet, there is almost no evidence from Asia or Africa (Ferguson and Clow, 2017).
Learning analytics takes place both in traditional in-person settings and in blended environments (Foster and Francis, 2020).
Where does the evidence come from?
The evidence used to develop this page comes from three recent systematic reviews: one undertaken by Foster and Francis (2020), another by Larrabee Sønderlund et al. (2019) and a third by Ifenthaler and Yau (2020).
We have given particular attention to six studies providing causal results on the effectiveness of learning analytics interventions, with a sample size of over 100 participants. We also included an individual study that fulfils the same criteria. Three of these studies come from the US, two from the Netherlands, one from France and one from Brazil. They were all published in the 2010s.
Beyond causal evidence, many other reviews and individual studies are available but do not meet the criteria for inclusion on this page as they are not causal or the sample size is too small. Beyond intervention evaluations, general studies on learning analytics have provided contextual information.
Key references
Causal studies
Cambruzzi, W. L., Rigo, S. J., & Barbosa, J. L. (2015). Dropout prediction and reduction in distance education courses with the learning analytics multitrail approach. J. UCS, 21(1), 23-47.
Davis, D., Chen, G., Jivet, I., Hauff, C., & Houben, G. J. (2016, April). Encouraging Metacognition & Self-Regulation in MOOCs through Increased Learner Feedback. In LAL@LAK (pp. 17-22). http://ceur-ws.org/Vol-1596/paper3.pdf
Dodge, B., Whitmer, J., & Frazee, J. P. (2015, March). Improving undergraduate student achievement in large blended courses through data-driven interventions. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (pp. 412-413). doi: 10.1145/2723576.2723657
Hellings, J., & Haelermans, C. (2020). The effect of providing learning analytics on student behaviour and performance in programming: a randomised controlled experiment. Higher Education, 1-18. doi: 10.1007/s10734-020-00560-z
Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R., & Baron, J. D. (2014). Early alert of academically at-risk students: An open source analytics initiative. Journal of Learning Analytics, 1(1), 6-47. doi: 10.18608/jla.2014.11.3
Krumm, A. E., Waddington, R. J., Teasley, S. D., & Lonn, S. (2014). A learning management system-based early warning system for academic advising in undergraduate engineering. In Learning analytics (pp. 103-119). Springer, New York, NY. doi: 10.1007/978-1-4614-3305-7_6
Labarthe, H., Bouchet, F., Bachelet, R., & Yacef, K. (2016). Does a Peer Recommender Foster Students’ Engagement in MOOCs?. International Educational Data Mining Society. https://files.eric.ed.gov/fulltext/ED592665.pdf
Other studies on learner analytics
Choi, S. P., Lam, S. S., Li, K. C., & Wong, B. T. (2018). Learning analytics at low cost: At-risk student prediction with clicker data and systematic proactive interventions. Journal of Educational Technology & Society, 21(2), 273-290. https://www.jstor.org/stable/26388407
Ferguson, R., & Clow, D. (2017, March). Where is the evidence? A call to action for learning analytics. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 56-65). doi: 10.1145/3027385.3027396
Foster, C., & Francis, P. (2020). A systematic review on the deployment and effectiveness of data analytics in higher education to improve student outcomes. Assessment & Evaluation in Higher Education, 45(6), 822-841. doi: 10.1080/02602938.2019.1696945
Foster, E., & Siddle, R. (2020). The effectiveness of learning analytics for identifying at-risk students in higher education. Assessment & Evaluation in Higher Education, 45(6), 842-854. doi: 10.1080/02602938.2019.1682118
Fuchs, D., Mock, D., Morgan, P., & Young, C. (2003). Responsiveness-to-intervention: Definitions, evidence, and implications for the learning disabilities construct. Learning Disabilities Research and Practice, 18(3), 157–171. doi: 10.1111/1540-5826.00072
Ifenthaler, D., & Yau, J. Y. K. (2020). Utilising learning analytics to support study success in higher education: a systematic review. Educational Technology Research and Development, 68(4), 1961-1990. doi: 10.1007/s11423-020-09788-z
Jones, K. M. (2019). Learning analytics and higher education: a proposed model for establishing informed consent mechanisms to promote student privacy and autonomy. International Journal of Educational Technology in Higher Education, 16(1), 1-22. doi: 10.1186/s41239-019-0155-0
Khalil, M., & Ebner, M. (2015, June). Learning analytics: principles and constraints. In EdMedia+ Innovate Learning (pp. 1789-1799). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/p/151455/
Larrabee Sønderlund, A., Hughes, E., & Smith, J. (2019). The efficacy of learning analytics interventions in higher education: A systematic review. British Journal of Educational Technology, 50(5), 2594-2618. doi: 10.1111/bjet.12720
McFarlane, K. J. (2016). Tutoring the tutors: Supporting effective personal tutoring. Active Learning in Higher Education, 17(1), 77-88. doi: 10.1177/1469787415616720
Pistilli, M. D., & Heileman, G. L. (2017). Guiding early and often: Using curricular and learning analytics to shape teaching, learning, and student success in gateway courses. New Directions for Higher Education, 2017(180), 21-30. doi: 10.1002/he.20258
Prieto, L. P., Rodríguez-Triana, M. J., Martínez-Maldonado, R., Dimitriadis, Y., & Gašević, D. (2019). Orchestrating learning analytics (OrLA): Supporting inter-stakeholder communication about adoption of learning analytics at the classroom level. Australasian Journal of Educational Technology, 35(4), 14–33. doi: 10.14742/ajet.4314
Rienties, B., Cross, S., & Zdrahal, Z. (2017). Implementing a learning analytics intervention and evaluation framework: What works? In B. K. Daniel (Ed.), Big data and learning analytics in higher education (pp. 147–166). Heidelberg: Springer. doi: 10.1007/978-3-319-06520-5_10
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. London: Jisc.
Siemens, G. (2012). Learning analytics: envisioning a research discipline and a domain of practice. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 4-8). doi: 10.1145/2330601.2330605
Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The current landscape of learning analytics in higher education. Computers in Human Behavior, 89, 98-110. doi: 10.1016/j.chb.2018.07.027
Wong, B. T. M., & Li, K. C. (2020). A review of learning analytics intervention in higher education (2011–2018). Journal of Computers in Education, 7(1), 7-28. doi: 10.1007/s40692-019-00143-7