The advent of systems that gather large amounts of micro-transactional data has ushered in the era of big data (Jacobs, 2009). Concurrent with the advent of big data has come the proposition of data analytics – the ability to mine this big data for actionable insights (LaValle et al., 2011). In higher education, the key systems that provide micro-transactional data at module level are Learning Management Systems (LMSs); for example, Blackboard stores details of each student's activity within a module, which are accessible to the instructor of that module. The availability of such micro-level data on each student's interactions with the LMS, along with data from other systems such as Audience Response Systems (ARSs), has led to the emergence of the field of learning analytics (Siemens, 2013; de Freitas et al., 2015) and the publication of special journal issues on the topic (Conde and Hernandez-Garcia, 2015).
Early uses of learning analytics focused on predictive modelling – the creation of a mathematical model of the likelihood that a student will complete a module (Clow, 2013). In a more recent application, Agudo-Peregrina et al. (2014) used student activity data from an LMS to examine whether the extent of student activity on the LMS is a predictor of academic achievement. They found a relationship between specific types of student activity on the LMS and academic performance for an online course, but not for a face-to-face course. Alongside the development of LMSs, which support students outside lectures, audience response systems have developed to support student learning and engagement during lectures. These systems have been in place for decades (Hunsu et al., 2016), but in recent years they have moved to being app-based and thus accessible to students via their mobile phones (Stowell, 2015).
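To make the idea of predictive modelling concrete, the following sketch shows one common approach: a logistic regression that predicts module completion from per-student LMS activity counts. It is illustrative only; neither Clow (2013) nor Agudo-Peregrina et al. (2014) prescribe this particular implementation, and the file name and column names (logins, content_views, forum_posts, completed) are hypothetical.

# Minimal sketch, assuming a per-student export of LMS activity counts
# with a completion flag. All names below are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

activity = pd.read_csv("lms_activity.csv")
X = activity[["logins", "content_views", "forum_posts"]]
y = activity["completed"]  # 1 = completed the module, 0 = did not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# AUC on held-out students indicates how well LMS activity alone
# separates completers from non-completers.
probs = model.predict_proba(X_test)[:, 1]
print("Held-out AUC:", roc_auc_score(y_test, probs))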
This study gathers data for a 12-week module run in Semester II of the academic year 2017-18, on which 460 students were registered. The analytics tools available within Blackboard were used to capture student activity data, and these were combined with data from a mobile-based audience response system (PollEverywhere) used in lectures. The combination of these two data sets provides a unique insight into student activity by capturing students' face-to-face lecture attendance and engagement alongside their online engagement with course materials. Initial analysis of the data gathered shows that 71% of students believe that the ARS improved their engagement with lecture material and 57% believe it improved their learning experience at lectures. The extent to which LMS and ARS usage are related to academic performance will form part of this research, and the results will be available for presentation at the conference in June, once the students' exams have been sat and marked.
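As an indication of how such an analysis might be carried out once exam marks are available, the sketch below merges per-student Blackboard activity and PollEverywhere participation with exam marks and fits an ordinary least squares regression. It is a minimal sketch under assumed data formats; the file names and column names (student_id, bb_minutes, bb_accesses, polls_answered, exam_mark) are hypothetical and do not reflect the study's actual exports.

# Minimal sketch, assuming per-student CSV exports from Blackboard,
# PollEverywhere and the exam results system. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

lms = pd.read_csv("blackboard_activity.csv")    # per-student LMS usage
ars = pd.read_csv("pollev_participation.csv")   # per-student poll responses
marks = pd.read_csv("exam_marks.csv")           # per-student exam mark

# Join the three sources on a common student identifier.
data = lms.merge(ars, on="student_id").merge(marks, on="student_id")

# Ordinary least squares: exam mark as a function of LMS and ARS activity.
model = smf.ols("exam_mark ~ bb_minutes + bb_accesses + polls_answered",
                data=data).fit()
print(model.summary())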
Selected References
CLOW, D. 2013. An overview of learning analytics. Teaching in Higher Education, 18, 683-695.
CONDE, M. A. & HERNANDEZ-GARCIA, A. 2015. Learning analytics for educational decision making. Computers in Human Behavior, 47, 1-3.
DE FREITAS, S., et al. 2015. Foundations of dynamic learning analytics: Using university student data to increase retention. British Journal of Educational Technology, 46, 1175-1188.
SIEMENS, G. 2013. Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57, 1380-1400.
STOWELL, J. R. 2015. Use of clickers vs. mobile devices for classroom polling. Computers & Education, 82, 329-334.