Lessons from learning analytics

by Liz Moores and Rob Summers

Why bother collecting learning analytics data?

Some of the reported benefits of using learning analytics data include enabling personalised learning and narrowing attainment gaps. Indeed, a quick dip into some of the recent TEF feedback summaries to higher education institutions suggests that use of learning analytics is valued by TEF panels. But can we learn more from the data to influence teaching practice? Aside from the potential benefits for a more personalised learning experience, we think it's a good way of understanding the learning process more generally. Over the past few years, we've been analysing some of the data generated at Aston University.

Last-minute cramming is not effective in improving attainment

Yes, your parents were correct – it's much better to work consistently! Early engagement with studies really appears to matter. In fact, the average attainment levels of those first-year students whose engagement remained at the lowest relative levels throughout the year were very similar to those of students whose engagement was lowest in the first three weeks but became the very highest in the last three weeks. In contrast, those who started off enthusiastically but then lost interest were awarded higher average marks than any of the groups that started off slowly, regardless of whether or how much their engagement peaked later. The consistency of the data – in that those who started off with high engagement tended to finish with high engagement – was remarkable.

Also noteworthy were the effects of early engagement on attainment. For the chart below, we divided students into activity quintiles based only on their first three weeks of engagement (Q5 being the highest engagement) and into end-of-year mark quintiles (Q5 being the highest attainment). The width of the lines connecting each engagement quintile to each mark quintile indicates the proportion of students flowing between the two. The results highlight how few students pass from higher activity quintiles to lower mark quintiles, and vice versa.
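To make the quintile analysis concrete, here is a minimal sketch of the kind of cross-tabulation that could sit behind such a chart. It assumes a hypothetical dataframe with one row per student and made-up column names ('early_engagement', 'final_mark'); it illustrates the general technique, not the actual code used in our study.

```python
import pandas as pd

def quintile_flows(df: pd.DataFrame) -> pd.DataFrame:
    """Cross-tabulate early-engagement quintiles against end-of-year mark quintiles."""
    labels = ["Q1", "Q2", "Q3", "Q4", "Q5"]
    # rank(method="first") breaks ties so qcut can always form five equal bins
    engagement_q = pd.qcut(df["early_engagement"].rank(method="first"),
                           q=5, labels=labels)
    mark_q = pd.qcut(df["final_mark"].rank(method="first"),
                     q=5, labels=labels)
    # Each cell is the proportion of all students flowing from an engagement
    # quintile (rows) into a mark quintile (columns); in a Sankey-style chart
    # these proportions would set the widths of the connecting lines.
    return pd.crosstab(engagement_q, mark_q, normalize="all")
```

In a chart like the one described, a near-diagonal table (large values where the engagement and mark quintiles match) is what produces the striking pattern of thick lines running straight across and few crossing over.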

Of course, these results come with the usual caveats: we cannot infer cause and effect (it could be that the lower engagers in the first three weeks were simply lower-achieving students). However, for us, this highlights the importance of a good induction into academic life – possibly enhanced by some structured engagement exercises to help get first years into good habits (ie telling them how they should be engaging, and the different ways that they can, not just that they should be doing so). There were probably a fair few students represented in this figure who were not even sure what they were supposed to be doing with all their 'spare' time.

Behaviour outweighs demographics when predicting attainment

The recent pandemic generated much discussion about digital poverty, suggesting that who we teach might be important – at the very least in terms of access to technology. Our recent evidence suggests that both how you teach and who you teach mattered. However, it is important to note that behaviour outweighed demographics in predicting attainment, although in this case behaviour was probably itself influenced by demographics. The gap between the attainment of disadvantaged students and that of their peers widened under online teaching and assessment conditions, and disadvantaged students were also less likely to obtain all 120 module credits at the first attempt. We also observed changes in their patterns of engagement, although less so for synchronously delivered teaching than for recorded lectures. Students with the lowest engagement were the ones driving the widened gap; those who engaged well with synchronously delivered teaching (even if online) fared much better.
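For readers who want a feel for how such a comparison might be run, here is a minimal sketch that pits a demographics-only model against a behaviour-only model on the same outcome. All column names are hypothetical, and the actual models and features used in the underlying analysis may well differ.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def compare_feature_sets(df: pd.DataFrame) -> dict:
    """Cross-validated AUC for a demographics-only vs a behaviour-only model."""
    y = df["gained_all_120_credits"]  # hypothetical binary outcome (1 = passed first time)
    feature_sets = {
        "demographics": ["disadvantage_flag", "age_on_entry", "entry_tariff"],
        "behaviour": ["vle_logins", "live_session_attendance", "recording_views"],
    }
    scores = {}
    for name, cols in feature_sets.items():
        model = LogisticRegression(max_iter=1000)
        scores[name] = cross_val_score(model, df[cols], y,
                                       cv=5, scoring="roc_auc").mean()
    return scores
```

A claim that behaviour 'outweighs' demographics then amounts to the behaviour-only model scoring consistently higher – with the caveat, noted above, that the behavioural features may themselves partly reflect demographic circumstances.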

So, we should stop teaching online and get people into the classroom early?

No – not necessarily. We don't want to claim that all online teaching is bad – instead, we need to understand which forms of online teaching work, what good looks like, and how our various teaching strategies affect different groups. Anecdotally, many students have appreciated the flexibility of online teaching, particularly where it has included facilities such as the ability to ask questions anonymously. And if you want to reuse those pre-recorded videos, there has been some interesting research from other groups on 'watch parties'. With the cost-of-living crisis, many students will appreciate being able to log into a lecture from home rather than forking out for a bus fare or missing out on some part-time work. What is important is to understand what works – and for whom.

Professor Liz Moores is Deputy Dean in the College of Health and Life Sciences at Aston University and has research interests in the evaluation of higher education, particularly as applied to widening participation issues.

Dr Rob Summers is research manager at the Centre for Transforming Access and Student Outcomes (TASO). Before joining TASO, Rob worked in the student outreach team at Aston University managing a randomised controlled trial of two post-16 outreach programmes as part of the TASO MIOM (Multi-intervention, Outreach and Mentoring) project.

2 thoughts on "Lessons from learning analytics"

  1. The reported benefits of using learning analytics are greatly exaggerated. Personalised learning doesn’t need analytics: you just need the data from that student, not an analysis of all the students. Data is not going to greatly influence teaching practice at a larger scale. For that you need to train, or retrain, teachers. Just presenting them with the data will do little.

    When using data it is important not to confuse correlation with causal relationships. Successful students tend to work consistently, but there may be a third factor which allows students to do both. Engagement may be an outcome, not the cause of success.

  2. Hi Tom. Thanks for your comment on our blog. Of course, I agree with you on cause and effect. Personalised learning does only need an individual’s data, but that may be most easily provided at scale (for several individuals) by an analytics system. Any retraining of teachers should be evidence-based, in my view – and that does require data and an understanding of the learning processes. The pandemic changed teaching practice at scale and pace, so I think it’s important to evaluate that practice before deciding on future practice.
