Programme approaches to assessment are a challenging yet essential part of higher education (Jessop et al., 2014; O'Neill, 2015). Continuous assessment has fully or partially replaced end-of-semester examinations in almost all programmes, and the literature supports its effectiveness in delivering longer-term student learning (Falchikov, 2013; Brown & Knight, 1994). Research shows that students prefer continuous assessment, with most achieving higher grades through what they perceive to be a fairer mode of assessment than traditional examinations (Holmes, 2015; Trotter, 2006; Isaksson, 2008). However, the increased use of continuous assessment can result in over-assessment, enslaving rather than saving students and lecturers alike. One response to this challenge is the development of programme assessment schedules. Primarily intended to assist students in managing both their time and their anxiety levels by preventing the clustering of assessments, this macro view of assessment across the stages of a programme also raises lecturer awareness of the frequency, volume and variety of student assessments.
This paper outlines the use of the Programme Assessment Schedule for Students (PASS), an online calendar-based mapping tool designed to capture the types and timings of assessments across modules in each year of a programme. A single version of the PASS schedule, providing an overview of assessment dates and formats, is completed by all teaching staff at the start of the semester and then shared with students. PASS can be viewed through two lenses: a zoom lens for the modules and year, and a wide-angle lens for the programme overview.
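To make the underlying data concrete, the following minimal Python sketch models a schedule as a list of assessment entries and implements both lenses, including a check for the clustering of deadlines described above. The field names, module codes and clustering threshold are assumptions for illustration only, not the actual PASS implementation.

from collections import Counter
from dataclasses import dataclass
from datetime import date

# Illustrative sketch only: the field names below are assumptions,
# not the actual PASS schema.
@dataclass
class Assessment:
    module: str      # e.g. "NUR101" (hypothetical module code)
    year: int        # programme stage/year
    format: str      # e.g. "essay", "MCQ", "presentation"
    weighting: int   # percentage of the module mark
    due: date

schedule = [
    Assessment("NUR101", 1, "essay", 40, date(2023, 10, 20)),
    Assessment("NUR102", 1, "MCQ", 30, date(2023, 10, 21)),
    Assessment("NUR103", 1, "presentation", 50, date(2023, 10, 22)),
]

# Zoom lens: the assessments for a single module.
def module_view(entries, module):
    return [a for a in entries if a.module == module]

# Wide-angle lens: deadlines per ISO week across the programme,
# flagging weeks in which assessments cluster.
def clustered_weeks(entries, threshold=2):
    per_week = Counter(a.due.isocalendar()[1] for a in entries)
    return {week: n for week, n in per_week.items() if n >= threshold}

print(clustered_weeks(schedule))  # {42: 3} -> three deadlines in one week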
More recently, as technology has become increasingly embedded in practice, the diversity of tools being used for assessment has emerged as an issue. The technology that assists in the collection, grading and originality checking of submitted work can add to the workload and be a source of anxiety for students unfamiliar with the features used. This has necessitated adding columns to the PASS schedule detailing the assessment submission methods and the technologies employed. In this way, the schedule makes explicit the alignment between the technology tools employed and the learning and assessment process.
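Continuing the earlier sketch, the two new columns amount to two extra fields per entry; again, the field names and example values are hypothetical, not drawn from the actual PASS schedule.

from dataclasses import dataclass
from datetime import date

# Hypothetical extension of the entry above: two extra columns make the
# submission route and the tools involved explicit to students.
@dataclass
class AssessmentV2:
    module: str
    year: int
    format: str
    weighting: int
    due: date
    submission_method: str   # e.g. "VLE upload", "in-class", "e-portfolio"
    technology: str          # e.g. "Turnitin", "Moodle quiz", "Mahara"

entry = AssessmentV2("NUR101", 1, "essay", 40, date(2023, 10, 20),
                     "VLE upload", "Turnitin")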
Bibliography
Brown, S. and Knight, P. (1994). Assessing Learners in Higher Education. Psychology Press.
Falchikov, N. (2013). Improving Assessment through Student Involvement: Practical Solutions for Aiding Learning in Higher and Further Education. Routledge.
Holmes, N. (2015). Student perceptions of their learning and engagement in response to the use of a continuous e-assessment in an undergraduate module. Assessment & Evaluation in Higher Education, 40(1), pp. 1-14.
Isaksson, S. (2008). Assess as you go: The effect of continuous assessment on student learning during a short course in archaeology. Assessment & Evaluation in Higher Education, 33(1), pp. 1-7. doi:10.1080/02602930601122498
Jessop, T., El Hakim, Y. and Gibbs, G. (2014). The whole is greater than the sum of its parts: A large-scale study of students' learning in response to different programme assessment patterns. Assessment & Evaluation in Higher Education, 39(1), pp. 73-88.
Myers, C.B. and Myers, S.M. (2007). Assessing assessment: The effects of two exam formats on course achievement and evaluation. Innovative Higher Education, 31(4), pp. 227-236.
O'Neill, G. (2015). Curriculum Design in Higher Education: Theory to Practice. Dublin: UCD Teaching and Learning. Available at: http://www.ucd.ie/t4cms/UCDTLP0068.pdf
Trotter, E. (2006). Student perceptions of continuous summative assessment. Assessment & Evaluation in Higher Education, 31(5), pp. 505-521. doi:10.1080/02602930600679506
Topics: Assessment and feedback in a digital age; Learning analytics: research and practice