Moodle Analytics – do they tell us anything about student assessment?
Abstract
Following on from their success in many areas of business, learning analytics are becoming very popular in education. Some projects in this area require expensive software and data-mining expertise. However, VLEs such as Moodle already record a wealth of student activity data that can reveal interesting insights. Grades and assessment are the most tangible yardsticks by which educators have traditionally measured student success. It is therefore worth asking what the metrics held in Moodle logs tell us about student activity vis-à-vis success in assignments and exams.
The researcher evaluated the Moodle logs for three successive cohorts (all taught within one academic year) of a Master’s-level, classroom-based, 12-week cloud computing module. The Moodle pages were very similar for each cohort, containing details of assessment and 8 blocks of content covering the module’s topic areas. The first component of each block was a file of the lecture slides for that topic. Access patterns were evaluated after each cohort completed the module, in particular when students logged in and what content they accessed.
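As an illustration of the kind of aggregation described above (a sketch, not the author’s actual analysis), the snippet below tallies per-student view counts per content item from a Moodle standard-log CSV export. The column names ("Time", "User full name", "Event context", "Event name") and the sample rows are assumptions based on a typical Moodle log export, not data from the study.

```python
import csv
import io
from collections import Counter, defaultdict

# Hypothetical fragment of a Moodle standard-log CSV export.
SAMPLE_LOG = """Time,User full name,Event context,Event name
"12/01/16, 09:14",Student A,File: Week 1 lecture slides,Course module viewed
"12/01/16, 09:20",Student A,File: Week 2 lecture slides,Course module viewed
"12/01/16, 10:02",Student B,File: Week 1 lecture slides,Course module viewed
"""

def views_per_student(log_text):
    """Return {student: Counter({content item: view count})} for 'viewed' events."""
    views = defaultdict(Counter)
    for row in csv.DictReader(io.StringIO(log_text)):
        # Keep only view events; other log rows (submissions, etc.) are skipped.
        if "viewed" in row["Event name"].lower():
            views[row["User full name"]][row["Event context"]] += 1
    return views

views = views_per_student(SAMPLE_LOG)
```

From a table like this, the percentage of lecture files each student opened follows by dividing the number of distinct lecture items viewed by the number of blocks.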
Findings painted a disappointing picture of student engagement. It is reasonable to expect all students to access all lecture slides; however, on average only 62% of the lecture slide files were viewed. Average viewing of the other content items in each block stood at just 24%. It is possible that students see these items as secondary and not worth their time.
Correlating grades with the percentage of the 8 lecture files viewed, and with the total number of views of these files, revealed a most disconcerting picture. For example, in the second cohort a correlation of -0.64 was found between the overall module grade and the percentage of the 8 lecture slide files viewed on Moodle. Taken at face value, this suggests that accessing lecture content on Moodle might be bad for student performance, and similar correlations were found for the other cohorts. It would be easy for a school’s management team to respond by suggesting that lecture notes no longer be put on Moodle. However, that would be a shallow, surface-level reaction.

A key conclusion from the research is that numbers on their own can hide a wealth of explanatory detail that is not always visible to those outside the classroom in which the numbers are generated. For example, breaking the figures down into week-by-week access frequencies revealed a huge surge in access in the week before the exam. Given the nature of the module, this is too little too late, and it accounts to some extent for the poor academic performance of some students. The output can be used to inform changes to the assessment structure for future cohorts of the module.
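The correlation statistic reported above is a standard Pearson coefficient. The sketch below shows how such a figure can be computed; the grade and viewing-percentage lists are purely hypothetical illustrations, not the study’s data, and are chosen only to show how a negative coefficient like the reported -0.64 can arise.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical figures for illustration only – NOT the cohorts' actual data.
pct_slides_viewed = [100, 88, 75, 75, 50, 38]
module_grade      = [52, 55, 60, 68, 70, 74]

r = pearson(pct_slides_viewed, module_grade)  # negative, as in the abstract
```

The point the abstract goes on to make is precisely that a single coefficient like `r` compresses away the week-by-week timing detail that explains the result.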
A further key aspect highlighted by the data is the timeframe within which disengaging students dropped out of the module. The analysis was carried out after the event, which was too late to focus on such students. However, future research could develop appropriate predictive analytics, based on the data that Moodle captures, to identify such students in time.
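One simple form such in-time identification could take (a heuristic assumed here for illustration, not a method from the study) is a rule that flags a student when their Moodle log shows no activity for a number of consecutive weeks during the module.

```python
from datetime import date, timedelta

def at_risk(login_dates, module_start, current, gap_weeks=2):
    """Flag a student as a possible drop-out if their most recent Moodle
    activity is more than `gap_weeks` weeks before `current`.

    login_dates: iterable of datetime.date taken from the student's log rows.
    """
    # Consider only activity that falls within the running module.
    relevant = [d for d in login_dates if module_start <= d <= current]
    if not relevant:
        return True  # no activity at all since the module started
    return (current - max(relevant)) > timedelta(weeks=gap_weeks)
```

Run weekly against the live logs, a rule like this would have surfaced the disengaging students while intervention was still possible, rather than after the event.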
Keywords: learning analytics, Moodle, assessment
Authors
Brid Lane
(Dublin Business School (DBS))
Topic Areas
Learning trends & technologies, Data analytics for learning
Session
RP - 7 » Data Analytics for Learning | Trends & Technologies III (12:25 - Friday, 27th May, Dominic Dowling Room (Basement): Video recording)