Online teaching and learning (post-entry)
Key information
- Cost: Medium cost
- Impact on aspirations / attitudes: More evidence needed
- Impact on behaviour / outcomes: Mixed impact
- Strength of evidence: Emerging evidence
What is it? Online remote learning is learning, teaching and support carried out in the absence of face-to-face contact. It relies on digital tools and platforms, and often requires the use of an internet connection.
Evidence? Findings on remote learning are generally not based on a large body of literature. There is a lack of rigorous, causal evidence, particularly in the context of COVID-19.
Should HEPs adopt remote learning? The existing evidence suggests that remote learning can be as effective as, if not more effective than, in-person learning. However, its effectiveness depends heavily on a number of design choices, and effective online teaching must be carefully planned. This planning is not possible with emergency remote teaching, where the priority is to adapt promptly to unforeseen crisis circumstances.
What is this intervention?
Online remote learning is learning, teaching and support carried out in the absence of face-to-face contact (Jarrett et al., 2020). Remote teaching and learning comprises a wide range of approaches, from blended programmes – where remote and classroom learning are combined – to fully remote learning.
Beyond this fundamental distinction, remote learning programmes vary greatly in their design features. For instance, remote learning can be either synchronous (i.e., resources are released at set intervals, following the timing of a traditional face-to-face course) or asynchronous (i.e., all resources are available immediately and students decide their own study schedule). Each of these design decisions affects the effectiveness of the intervention.
The main aim of online teaching and learning post-entry to HE is to enable effective remote learning, using digital tools to replace the in-person teaching environment.
What is the target group?
The studies and methods discussed on this page focus on HE students, most often first-year undergraduate students at US universities.
Before the COVID-19 pandemic, remote learning was chosen by students who were unable to attend in-person classes, or who preferred not to. This online educational ecosystem allowed them to receive a quality education without the constraints of in-person learning. Blended approaches, mixing online and in-person learning, have also been an accelerating trend in the HE sector.
But as countries across the globe experienced national lockdowns, schools and universities were required to cancel all face-to-face classes and shift to online instruction. Overnight, most students became online learners. This emergency online learning required HE providers to rapidly provide widely accessible, appropriate and temporary access to instruction, although few of them were prepared for such a shift.
As of now, online learning is predicted to be a lasting trend beyond the end of the pandemic, for a variety of reasons. For instance, the New York City Department of Education has decided that distance learning will be used on snow days and other traditional days off, such as Election Day (Cramer, 2020). It is thus essential to understand how to maximise its effectiveness for all students.
How effective is it?
The strength of evidence on the impact of this intervention is limited. Currently, we do not have enough evidence to make a definitive call on the effectiveness of online teaching and learning, especially in the context of the global pandemic.
There is some causal evidence to suggest that online teaching and learning can be effective in maintaining or even improving students’ outcomes compared to in-person learning, as long as the programme is designed carefully and the elements of effective teaching are present (e.g., clear explanations, scaffolding, feedback).
Indeed, two randomised controlled trials (RCTs) and one quasi-experiment comparing an interactive online statistics course with an in-person equivalent found no difference in attainment outcomes between the two groups (Bowen et al., 2014; Lovett et al., 2008; Schunn and Patchan, 2009).
Furthermore, Lovett et al. (2008) found that those taking the online course showed a significantly larger percentage-point increase on a Statistics Knowledge Assessment designed to measure students’ basic statistical reasoning, compared to in-person students. Schunn and Patchan (2009) observed a significantly lower drop-out rate for the online version of the course. However, Bowen et al. (2014) noted that those studying online enjoyed the course less than those learning face-to-face.
Setting up an effective online course requires time and practice, and its quality generally improves as teaching staff gain more experience with remote teaching. Schunn and Patchan (2009) speculate that the limited effectiveness of the blended approach they tested may have been due to the teaching assistant’s low familiarity with the online system, which left them unable to make use of the feedback the system provided when supporting students.
Although new research on online teaching and learning during the COVID-19 pandemic is currently being produced and peer-reviewed, the existing research papers are, for now, narrative (Guo, 2020; Orlov et al., 2021) and thus not as robust as the causal evidence described above.
Beyond causal evidence, a number of other research papers are available, but they do not meet the criteria for inclusion on this page because they are not causal or do not provide statistically significant results. Notably, Baker et al. (2019) carried out an RCT to examine the effectiveness of a scheduling intervention aimed at improving students’ time management when learning online. They found that whilst the intervention improved achievement early in the course, these positive effects were concentrated amongst students with poor time management skills and faded in subsequent weeks of the course. However, the study’s sample size is too small to generate reliable results.
What features seem to be important?
The papers reviewed highlight some elements of effective practice, indicating how HE providers can contribute to the successful implementation of online teaching and learning:
First, HE providers should ensure that both students and teaching staff have access to online teaching and learning tools. They should pay close attention to disadvantaged students, who might not have the financial resources to acquire the required materials. Furthermore, HE providers should offer support and guidance on using specific technologies and platforms.
Online courses should also be designed with learners’ characteristics in mind, to fit their specific needs and encourage beneficial behaviours. For instance, online students should interact with their peers (e.g., through peer marking and feedback, or opportunities for live discussion of content), as this increases motivation. Students should also be supported when working independently, and encouraged to reflect on their own learning and consider strategies to use when they get stuck.
Finally, the evidence suggests no significant difference in outcomes between synchronous and asynchronous learning.
What don’t we know
There are not enough high-quality programme evaluations, particularly RCTs, of remote teaching and learning. This is true across all programme types, age groups, and outcomes examined.
Furthermore, most existing causal evidence on online teaching and learning comes from the US, and there is little to no evidence from the UK. It is also relatively old (mostly from the late 2000s and early 2010s), and thus may not reflect the latest technological changes.
Moreover, the context of the studies included in TASO’s analysis does not parallel the circumstances facing education providers responding to national lockdowns during the COVID-19 pandemic. Although there are similarities in the online teaching methods employed, emergency remote teaching appears to differ greatly from ‘regular’ online learning (Hodges et al., 2020). Indeed, few education stakeholders could realistically have been prepared for the rapid shift to distance learning in March 2020.
One year on from the start of the pandemic, there is still no causal evidence on the impact of COVID-19 on teaching and learning. The existing evidence is narrative only, of low quality and based on small sample sizes. This highlights the need for more comprehensive and rigorous studies.
We also lack a large enough evidence base to make claims about the relative effectiveness of each programme design feature, or about the relative impact of distance learning techniques on specific types of students. Notably, the existing studies do not specifically focus on certain disadvantaged or underrepresented groups, including:
- BAME students
- Mature students
- Students from lower-socioeconomic status groups
More generally, it would be useful for research to continue to explore how online courses can be designed more effectively.
Where does the evidence come from?
The evidence used to develop this page comes from two reviews – a rapid evidence assessment undertaken by the Education Endowment Foundation (EEF) and a review undertaken by the What Works Clearinghouse (WWC) in the US – and three individual studies. These three individual studies focus on post-entry student outcomes and meet the What Works Clearinghouse’s standards for best evidence: two randomised controlled trials and one quasi-experiment. None of these studies took place in the context of the global pandemic.
Key references
Causal studies
Baker, R., Evans, B., Li, Q., & Cung, B. (2019). Does inducing students to schedule lecture watching in online classes improve their academic performance? An experimental analysis of a time management intervention. Research in Higher Education, 60(4), 521-552. doi: 10.1007/s11162-018-9521-3
Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2014). Interactive learning online at public universities: Evidence from a six‐campus randomized trial. Journal of Policy Analysis and Management, 33(1), 94-111. doi: 10.1002/pam.21728
Lovett, M., Meyer, O., & Thille, C. (2008). The Open Learning Initiative: Measuring the Effectiveness of the OLI Statistics Course in Accelerating Student Learning. Journal of Interactive Media in Education. doi: 10.5334/2008-14
Schunn, C. D., & Patchan, M. (2009). An evaluation of accelerated learning in the CMU Open Learning Initiative course Logic & Proofs. Report, Learning Research and Development Center, University of Pittsburgh.
Stanley, D., & Zhang, Y. (2018). Student-Produced Videos Can Enhance Engagement and Learning in the Online Environment. Online Learning, 22(2), 5-26. https://files.eric.ed.gov/fulltext/EJ1181370.pdf
Other studies on teaching and learning online
Alexander, S. (2001). E-learning developments and experiences. Education + Training, 43, 240-248. doi: 10.1108/00400910110399247
Cramer, M. (2020, September 25). Sorry, kids. Snow days are probably over. The New York Times, p. 6.
Education Endowment Foundation (2020). Remote Learning: Rapid Evidence Assessment. London: Education Endowment Foundation.
Guo, S. (2020). Synchronous versus asynchronous online teaching of physics during the COVID-19 pandemic. Physics Education, 55(6), 065007. doi: 10.1088/1361-6552/aba1c5
Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (2020). The difference between emergency remote teaching and online learning. EDUCAUSE Review, 27, 1-12.
Jarrett, K., Hopkins, D., & Foote, G. (2020). Enhancing Social Mobility in the Digital Learning Age. Rise Think Tank.
Orlov, G., McKee, D., Berry, J., Boyle, A., DiCiccio, T., Ransom, T., … & Stoye, J. (2021). Learning during the COVID-19 pandemic: It is not who you teach, but how you teach. Economics Letters, 109812. doi: 10.1016/j.econlet.2021.109812
Sahni, S. D., Polanin, J. R., Zhang, Q., Michaelson, L. E., Caverly, S., Polese, M. L., & Yang, J. (2021). A What Works Clearinghouse rapid evidence review of distance learning programs. https://files.eric.ed.gov/fulltext/ED610886.pdf
Smith, D., & Hardaker, G. (2000). e-Learning innovation through the implementation of an Internet supported learning environment. Educational Technology & Society, 3, 1-16. http://www.jstor.org/stable/jeductechsoci.3.3.422