Using Learning Analytics to Identify Poor Performance and Engagement: Case Study from IT Students

Guy Wood-Bradley1, Sophie McKenzie2, Nick Patterson3, David Tay4, Elicia Lanham5

1–5 Deakin University, Locked Bag 20000, Geelong, VIC 3220, Australia

Learning analytics (LA), which draws on data produced by computer-based educational systems, has been shown to provide an opportunity to improve the student experience (Peña-Ayala, 2018; Lacave, Molina & Cruz-Lemus, 2018). This poster explores how LA is being used to better understand the Information Technology (IT) student experience at an Australian university. Current approaches to data analysis, as well as contextual factors, are presented to describe how sense and value can be made from learner data to create impactful results.

LA can be used as a transdisciplinary paradigm to explore areas such as learner behaviour and performance, social and discourse interaction in learning, prediction of student success and attrition rates, assessment and feedback, and learners’ emotions and engagement (Peña-Ayala, 2018). LA is the analysis and reporting of learner-related data from diverse fine-grained viewpoints, such as logins, views, time, and communications. Peña-Ayala (2018) constructed a learning analytics taxonomy to define the current profile and the underlying theoretical and contextual factors relevant to LA in higher education. Factors aiding retention include effective orientation and induction, authentic curricula, integrated study skills, and teachers knowing who their students are (Crosling et al., 2009). Higher education organisations experience a percentage of dropouts in their cohorts, reported in the university sector to reach 20% or higher (Strategic Intelligence and Planning Unit, 2017). Existing research also suggests factors impacting progression and student non-completion, such as balancing social and academic activity in university life, as well as the integration of students from more diverse backgrounds (Cartney and Rouse, 2006).


With approximately 22,000 students enrolled in an IT course in Victoria in 2016 (Department of Education and Training, 2017), student retention is a key challenge within Australia, with rates varying as institutions strive to keep these losses to a minimum. Notably, the Deakin Statistics Summary 2017 (Strategic Intelligence and Planning Unit, 2017) indicates that retention rates for domestic higher education students from 2015-2016 ranged from 75% (Swinburne) to 90% (University of Melbourne).


Our results from the IT subject showed that across 2016 and 2017 non-participation (XN) decreased from 21 to 16 students, while the number who unenrolled early from the subject remained relatively stable at 80 students in 2016 and 78 in 2017. We found that students who received a fail (N) on their initial attempt were largely successful on their second attempt, with smaller numbers of students failing again or withdrawing early. Further, we examined patterns in students’ assessment results that could lead to an N. This led us to look for a relationship between not submitting assignments and failing a unit, by counting non-submitted assignments for N students across two study units. Outcomes indicated that failing students did not submit the later assignments in the subject.
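The non-submission count described above can be sketched as follows. This is a minimal illustration only: the record layout, student identifiers, and grade values are assumptions for the example, not the actual Deakin dataset or analysis code.

```python
# Illustrative sketch: count non-submitted later assignments for students
# who received a fail (N) grade. All records below are made up.
from collections import Counter

# Each record: (student_id, final_grade, per-assignment submission flags)
records = [
    ("s001", "N", [True, True, False, False]),  # stopped submitting late on
    ("s002", "P", [True, True, True, True]),
    ("s003", "N", [True, False, False, False]),
]

def missed_late_assignments(records, late_from=2):
    """For each failing (N) student, count non-submissions among the
    later assignments (index >= late_from)."""
    counts = Counter()
    for student_id, grade, submitted in records:
        if grade == "N":
            counts[student_id] = sum(1 for flag in submitted[late_from:] if not flag)
    return counts

print(missed_late_assignments(records))  # Counter({'s001': 2, 's003': 2})
```

A pattern of missed later assignments among N students, as in this toy data, is the kind of signal the analysis above looked for.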

Our Solution

Work is currently underway on an application that uses data analytics to review historical activity in the learner data held in the learning management system, which will feed into a model of student data that can inform real-time assessment. The application will employ a dedicated algorithm to determine when a student is on the pathway to an XN. In addition, it will focus on early disengagement leading to un-enrolment from the unit and on identifying students who are likely to leave (WE). For IT students, results from the Student Experience Survey 2016 and 2017 (QILT, 2017) indicate lower scores in learner engagement (55.1% vs 60.7% nationally) and skills development (69.2% vs 73.8% nationally). Our model should help address these weak points to benefit Computing and Information Systems students.
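One simple form such an early-warning rule could take is a consecutive-quiet-weeks check over LMS activity. The thresholds, field names, and activity data below are assumptions for illustration; the planned application's actual algorithm is not specified here.

```python
# Illustrative early-warning sketch: flag a student as at risk of
# non-participation (XN) when LMS logins fall below a threshold for
# several consecutive weeks with no submissions. Thresholds are assumed.
from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    logins: int
    content_views: int
    submissions: int

def risk_flag(history, login_threshold=1, quiet_weeks=2):
    """Return True once `quiet_weeks` consecutive weeks show fewer than
    `login_threshold` logins and no submissions."""
    quiet = 0
    for week in history:
        if week.logins < login_threshold and week.submissions == 0:
            quiet += 1
            if quiet >= quiet_weeks:
                return True
        else:
            quiet = 0  # any active week resets the streak
    return False

engaged = [WeeklyActivity(5, 20, 1), WeeklyActivity(4, 15, 0)]
fading = [WeeklyActivity(3, 10, 1), WeeklyActivity(0, 0, 0), WeeklyActivity(0, 0, 0)]
print(risk_flag(engaged), risk_flag(fading))  # False True
```

In practice a model of this kind would be trained and tuned against the historical learner data described above rather than fixed thresholds.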


References

Cartney, P., & Rouse, A. (2006). The emotional impact of learning in small groups: highlighting the impact on student progression and retention. Teaching in Higher Education, 11(1), 79-91.

Crosling, G., Heagney, M., & Thomas, L. (2009). Improving student retention in higher education: improving teaching and learning. Australian Universities’ Review, 51(2), 9-18. ISSN 0818-8068.

Department of Education and Training. (2017). 2016 All Students.

Lacave, C., Molina, A., & Cruz-Lemus, J. (2018). Learning analytics to identify dropout factors of Computer Science studies through Bayesian networks. Behaviour and Information Technology. DOI: 10.1080/0144929X.2018.1485053

Peña-Ayala, A. (2018). Learning analytics: a glance of evolution, status, and trends according to a proposed taxonomy. WIREs Data Mining and Knowledge Discovery, 8, 1-29.

QILT. (2017). Deakin University – Student Experience – Undergraduate. Retrieved 27 August 2018.

Saqr, M., Fors, U., & Tedre, M. (2017). How learning analytics can early predict under-achieving students in a blended medical education course. Medical Teacher, 39(7), 757-767. DOI: 10.1080/0142159X.2017.1309376

Strategic Intelligence and Planning Unit. (2017). Deakin Statistics Summary 2017. Retrieved 27 August 2018.