2021: Honing self-assessment via serious games and team learning

Session has concluded - see the recording

This session explores two approaches to enhancing the self-assessment capabilities of students using gamified, team-based, formative assessment activities.

This panel session is jointly presented by Transforming Assessment and the Assessment in Higher Education Network (UK).

Session chair: Fabio Arico (University of East Anglia, UK)

Presenter 1: Jayne Coleman (Glasgow Caledonian University, UK)
"An Evaluation of Serious Gaming as a Formative Assessment Tool"

Formative assessment, or assessment for learning, is ill-defined within the literature; however, common best-practice characteristics include opportunity for student self-regulation and learning via peer feedback (Andersson & Palm, 2017). Recognised as supporting meaningful learning, and aligning well with educational theories such as social constructivism, Serious Gaming (SG) has developed into an accepted educational approach (Yildirim & Sen, 2019; Gentry et al., 2018; Qian & Clark, 2016). Loosely defined as a learning mechanism by which fun and educational concepts are combined to promote student engagement, SG includes the introduction of a competitive element into an educational task, along with set rules and a clear educational purpose (Gentry et al., 2018; Al-Azawi, Al-Faliti & Al-Blushi, 2016).

To evaluate the possible role SG could play as a formative assessment tool, feedback was sought from undergraduate physiotherapy students after they participated in a competitive team-based quiz. Students were randomly allocated to two teams, and each team member was required to answer a self-selected, unseen question: they could answer the question independently for 2 points (self-regulation) or work with their team for 1 point (peer support). All questions were based upon the module content to date and covered areas of troublesome knowledge often encountered within the teaching.

In this session, focus will be placed upon analysis of the students' feedback themes, which were consistent with those often present within assessment design research: assessment anxiety and developing awareness of the structure of the summative assessment (Bloxham & Boyd, 2007). Also of particular interest, the feedback linked with the characteristics of SG: "really fun...game a framework for what we should know or knowledge we need to touch on...was able to see strengths and weaknesses of my current learning".

Presenter 2: Paul McDermott (University of East Anglia, UK)
"Finding Your Confidence in the Numbers: Developing Self-Assessment Accuracy through an Adapted TBL Process"

For a number of years we have given pre-registration pharmacists clinical decision-making training using an adapted Team-Based Learning (TBL) approach. Amongst other activities, our approach involves the completion of a clinical decision-making test both individually (using a confidence-marking answer format) and as a group (using IF-AT scratch cards for scoring and instant item-level feedback). The results from this exercise feed into a league table and badge system used to incentivise learner engagement.

In previous years, we have observed a shift in the self-assessment profile of our learners away from a conventional Dunning-Kruger trend towards much less overconfidence in grade predictions relative to test performance. We attributed this shift in self-assessment to the provision of repeated tests with immediate item-level feedback. However, in response to the COVID crisis, this exercise was moved online using the InteDashboard software. In our analysis of this year's data, we did not observe the same shift in self-assessment patterns with online delivery. In fact, we retained a conventional Dunning-Kruger profile, which we have provisionally attributed to the loss of "in person peer instruction" and the resulting reduction in the internal feedback learners would usually generate through interactions with their fellow learners. In this session, we will reflect on our change in focus away from the benefits of "immediate item level feedback" towards the "effective generation of internal feedback" as a mechanism by which self-assessment can be improved.
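As a rough illustration of the kind of calibration analysis this abstract alludes to, the short sketch below compares hypothetical self-predicted grades with actual test scores and groups learners by performance quartile. The data, the "predicted minus actual" overconfidence measure, and the quartile grouping are assumptions made for illustration only; they are not the presenters' dataset or method.

```python
import numpy as np

# Hypothetical data: these values are invented for illustration only.
rng = np.random.default_rng(0)
actual = rng.uniform(30, 95, size=120)                              # test scores (%)
predicted = np.clip(actual + rng.normal(10, 8, size=120), 0, 100)   # self-predicted grades (%)

# Miscalibration: positive values indicate overconfidence.
overconfidence = predicted - actual

# Group learners by quartile of actual performance and report mean miscalibration.
quartile = np.digitize(actual, np.quantile(actual, [0.25, 0.5, 0.75]))
for q in range(4):
    gap = overconfidence[quartile == q].mean()
    print(f"Performance quartile {q + 1}: mean overconfidence = {gap:+.1f} points")

# A conventional Dunning-Kruger profile shows the largest positive gap in the
# lowest quartile, shrinking (and sometimes reversing) for the top performers.
```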


Sessions are hosted by Professor Geoffrey Crisp, DVC Academic, University of Canberra and Dr Mathew Hillier, Macquarie University, Australia.

Please note all sessions are recorded and made public after the event.

The time below is shown in Coordinated Universal Time (UTC); convert to your equivalent local time as needed.

5 May 2021, 07:00 AM through 08:00 AM (UTC)

Help spread the word



You can share the link below in an email or on your website.
http://transformingassessment.com/civicrm/event/info?id=151&reset=1