Document Type

Conference Proceeding

Publisher

AUC

Editor(s)

Michael Docherty and Matt Hitchcock

Faculty

Faculty of Education and Arts

School

School of Education

RAS ID

13328

Comments

This is an Author's Accepted Manuscript of: Wren, J. J., Campbell, A. B., Heyworth, J. N., & Lovering, C. A. (2011). Using iPad2 to assess students' live performances and actively engage students with tutor and peer feedback. Paper presented at the Create World Conference, Brisbane, Australia.

Abstract

Assessing students' live performances can be challenging because markers need to make quick and often complex judgements about the learning while simultaneously recording information and watching the performance. The task becomes more difficult when multiple markers are involved and moderation between markers is required, and maintaining fairness and validity throughout the assessment process can consequently become a significant issue. Moderation of assessment can also delay the turnaround time for student feedback because markers need to meet and review, and the 'busy type of work' associated with compiling and sorting individual marks and distributing them to students often delays this process further. This paper describes a two-phase, qualitative, action research project which trialled an innovative digital tool to streamline the assessment of live performances. Phase one involved the assessment of arts performances by 170 Bachelor of Education students; phase two involved 200 students. In each phase the students were enrolled in a 12-week Arts Education unit in the third year of their course and were assessed in groups of five or six. The digital assessment tool gave each marker wireless access to a customised database during marking and moderation. Markers used laptops in phase one of the study and iPad2 devices in phase two, as the iPad2 allowed mobility during assessment. Each group's performance video was embedded into its marking key, making it quick and easy to locate and view, and the tool automatically saved and collated the marking data. At the completion of the marking and moderation period, the marking key containing the markers' feedback, together with the embedded performance video, was automatically emailed to each student as a PDF attachment; students received only the feedback pertaining to their own group's performance. The markers reported that the digital tool significantly enhanced the way in which they were able to capture and record their observations of complex learning, and they felt that the assessment was more accurate and that the paperless process was far more efficient. The students reported that they became more engaged with the assessment process and that they engaged with their feedback on multiple occasions.
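The abstract does not describe how the customised database or its emailing step was implemented. Purely as an illustrative sketch of the final step described above (collated feedback emailed only to each group's members as a PDF attachment), the following Python code shows one way such a step could be automated. All names, email addresses, file paths and the local SMTP server are hypothetical assumptions, not details taken from the paper.

import smtplib
from email.message import EmailMessage
from pathlib import Path

# Hypothetical per-group records: each marking key is assumed to have been
# exported already as a PDF containing the markers' collated feedback and
# the embedded (or linked) performance video.
groups = {
    "group_01": {
        "members": ["student.a@example.edu", "student.b@example.edu"],
        "marking_key_pdf": Path("exports/group_01_marking_key.pdf"),
    },
    # ... one entry per assessed group ...
}

def email_group_feedback(smtp_host: str = "localhost") -> None:
    """Send each group's marking key only to that group's members."""
    with smtplib.SMTP(smtp_host) as smtp:
        for name, group in groups.items():
            msg = EmailMessage()
            msg["Subject"] = f"Arts Education performance feedback ({name})"
            msg["From"] = "assessment@example.edu"
            msg["To"] = ", ".join(group["members"])
            msg.set_content(
                "Please find attached the moderated marking key and "
                "feedback for your group's performance."
            )
            msg.add_attachment(
                group["marking_key_pdf"].read_bytes(),
                maintype="application",
                subtype="pdf",
                filename=group["marking_key_pdf"].name,
            )
            smtp.send_message(msg)

if __name__ == "__main__":
    email_group_feedback()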

Access Rights

free_to_read
