Automated Marking for the Masses: Providing Students Early Access to Automated Marking Tools
Abstract
Providing timely and consistent marking with clear feedback to students is a common struggle, especially when dealing with large class sizes and tight deadlines for progression. One approach that can be employed in some domains is automated assessment, whereby marking is performed by software, usually by checking submissions against a sample solution. Although such tools are not uncommon in computer science, they are generally applied exclusively by marking staff after submission. Another option is to give students some form of access to the automated tools before submission, providing instantaneous formative feedback using some or all of the same criteria by which their summative marks will be assigned.
To this end, in 2018 a first-year module project was set for c. 400 learners to implement a database. The intention had always been to mark this work automatically using a tool (a bot) that compared submissions against a specimen solution meeting all the requirements, but in this case students were also given early access to the bot. This allowed them to upload their proposed solutions and run them against a subset of the real marking tests (not all tests were included, but there was good coverage of the types and areas to be tested), receiving detailed output and a score of marks awarded against the theoretical maximum for their submission. During the 33 days of the project, 6,231 submissions were made to the bot and given feedback. A survey of the students found extremely positive responses: those who used the tool found it invaluable in helping them understand what was required and identify errors in their submissions. Providing this facility also lowered the number of queries coming to academic staff, seemed to promote student problem-solving abilities, and helped ensure adherence to the required format.
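The abstract does not describe the bot's internals, but the workflow it outlines (run a submission against a set of tests derived from a specimen solution, then report marks awarded against the theoretical maximum with per-test feedback) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the test names, specimen values, and `mark_submission` helper are all hypothetical.

```python
# Hypothetical sketch of an automated marking bot of the kind described:
# each test carries a mark weighting and a check comparing the submission
# against a specimen (reference) solution.
from dataclasses import dataclass
from typing import Callable

@dataclass
class MarkingTest:
    name: str
    marks: int
    check: Callable[[dict], bool]  # returns True if the submission passes

def mark_submission(submission: dict, tests: list[MarkingTest]):
    """Run every test; return (marks awarded, theoretical maximum, feedback)."""
    awarded, feedback = 0, []
    maximum = sum(t.marks for t in tests)
    for t in tests:
        try:
            passed = t.check(submission)
        except Exception:
            passed = False  # a crashing check counts as a failure
        if passed:
            awarded += t.marks
        feedback.append(f"{'PASS' if passed else 'FAIL'}: {t.name} ({t.marks} marks)")
    return awarded, maximum, feedback

# Invented specimen values for a database project, purely for illustration.
SPECIMEN = {"table_count": 4, "row_count": 120}
TESTS = [
    MarkingTest("correct number of tables", 5,
                lambda s: s.get("table_count") == SPECIMEN["table_count"]),
    MarkingTest("correct total row count", 5,
                lambda s: s.get("row_count") == SPECIMEN["row_count"]),
]

awarded, maximum, feedback = mark_submission(
    {"table_count": 4, "row_count": 99}, TESTS)
print(f"Score: {awarded}/{maximum}")
for line in feedback:
    print(line)
```

A student-facing deployment, as the abstract describes, would expose only a subset of such tests before the deadline while the full set is reserved for the summative run.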
Authors
-
David Cutting
(Queen's University Belfast)
-
Andrew McDowell
(Queen's University Belfast)
-
Angela Allen
(Queen's University Belfast)
-
Neil Anderson
(Queen's University Belfast)
Topic Areas
Assessment and Feedback in a Digital Age; Digital Technologies in Disciplinary Contexts
Session
Px - 02 » Assessment and Feedback in a Digital Age (09:30 - Friday, 1st June, L115 (Parallel 2))