Journal of Interactive Technology and Pedagogy, no. 22

Transferring Gamification Across Disciplines: Determining the Efficacy of Migrating a Gamification Platform from a Language to a Law Module

Lynette Yuen Ling Tan, National University of Singapore

Helena Whalen-Bridge, National University of Singapore

Wee Ying Qin, National University of Singapore

Abstract

The use of technology in higher education is an established phenomenon today, and gamification as a teaching strategy has received particular attention in the last decade. There is, however, conflicting evidence for its effectiveness in supporting student learning, with research indicating both positive outcomes and problems posed by the strategy. Where gamification is desired, a key challenge is the investment it requires in money and time. To address these challenges of cost and time, we investigated the efficacy of transferring a gamification platform, originally created for and successfully employed in a language module, to a law module. A control platform was incorporated into the study in the form of a traditional and readily available online quiz. Participating students were asked to evaluate and compare both platforms. Findings suggest that while students perceived that both the gamification and online quiz platforms supported their comprehension of pre-reading material, each platform had distinct affordances that appealed to different students. Thus, while the process of transferring the gamification platform met the challenges of time and cost, it brought to light the need for multiple platforms to support different kinds of online learning. The study also provides reasons for the further development of coherent methods of gamification transfer among different course subject matters.

Keywords: gamification; financial costs; time costs; course migration.

Introduction

The number of online gamers in the world was estimated at 1 billion as of 2022, with a projected increase to 1.03 billion by 2025 (Clement 2022). The world population was recorded at 7.84 billion in 2021 (The World Bank Group 2022), and the percentage of global gamers at that time stood at 12.7 percent. These statistics are not unexpected, as the global gaming market was forecast to generate US$159.3 billion in 2020 (Reuters 2020). Riding the wave of this phenomenon is gamification, defined as “the use of game design elements in non-game contexts” (Deterding et al. 2011, 10). Education is one such non-game context. Bolstered by findings such as the US study of 27 colleges and universities from Pew Internet Research, where Jones (2003) reported that 70 percent of college students played video, computer or online games “at least once in a while” and about 50 percent of college-student gamers reported that gaming keeps them from studying “some” or “a lot,” gamification has been used as a platform for student learning and academic engagement.

Research has been conducted on the impact of gamification on learning, but with inconclusive results. Of the two major systematic reviews of published evidence on computer games with respect to learning, one revealed that while the majority of authors share the opinion that gamification has “the potential to improve learning if it is well designed and used correctly,” “most of the empirical studies do not provide a proper evaluation” (Dicheva et al. 2015, 83). The other study confirmed that while there has been a “surge of interest” in games for education, there was scant research providing evidence on the impact of these games with respect to learning (Connolly et al. 2012, 671). In particular, Dicheva et al. established that while the bulk of papers show encouraging results from gamification, key considerations such as “a requirement for an ongoing monetary and time investment” made for a less than persuasive case (83).

To address concerns about the impact of financial and time demands on both teachers and students, this study explored the efficacy of transferring a gamification platform from a language module (UTW1001S) to a law module (LL4209V/LL5209V/LL6209V, Legal Argument & Narrative) at the National University of Singapore (NUS). The gamification platform was originally created for and successfully employed in the IEM (Ideas and Exposition) series of modules at the NUS Centre for English Language Communication, where evidence revealed a positive impact on students’ comprehension as well as intrinsic motivation (Tan 2018). To evaluate the possibility of migration to a different discipline, this study migrated that platform to a law module and collected quantitative and qualitative data regarding law students’ preferences for gamification as a learning strategy as opposed to a traditional online quiz. The traditional online quiz was chosen as a control exercise due to the similarity it had with the gamification example in testing students’ comprehension of the reading material. These results were further analyzed in relation to the financial and time concerns necessitated by the transfer.

Literature Review

While most studies on the impact of gamification in educational spaces focus on the outcomes of the intervention, such as knowledge acquisition and other perceptive, cognitive or behavioral (for example, motivational) effects (Manzano-León et al. 2021), very few center on the more practical aspects of financial and time demands (Huang and Soman 2013; Sánchez-Mena and Martí-Parreño 2017). The latter concerns should not be the overriding determinants of using gamification as a strategy. However, they often guide institutional decisions, particularly whether this pedagogical strategy is employed at all, and they elicit a weighing of the cost of the platform against the benefits of the outcome. O’Donavan, Gain and Marais (2013, 250) reported that both financial and time costs were “substantial” for lecturers and teaching assistants employing the strategy, requiring the hiring of an artist and programmer, as well as an estimated time investment of six additional hours of work per week, on top of ongoing preparations. O’Donavan et al. do conclude that “the benefits outweighed the cost of the game,” but this view was qualified by their admission that gamification “requires ongoing monetary and time investment to be successful” (250–251). The level of gamification that led to these conclusions was comparable to that reported in most studies, in which there is a mix of goals/challenges, rapid feedback, visible status/recognition, freedom of choice, freedom to fail, a storyline, and social engagement, rather than low-cost, non-digital measures of gamification that employ more basic elements (Connolly et al. 2012, 666; Dicheva et al. 2015, 79, 81; Ortiz 2016, 76). One study used zero-cost tools; however, that platform was primarily a “dynamic website” that allowed users to create exercises and add videos with explanations, and was thus not as sophisticated (Maican et al. 2016). The authors of this article determined that additional research was needed to provide evidence for weighing the benefits of gamification in education against the financial and time costs required to successfully implement the strategy.

The research that formed the basis for this article was conducted pre-COVID, and it appears that the COVID-19 pandemic led not to further research on gamification but rather to the development of strategies that allow for more face-to-face contact, such as platforms like Zoom that facilitate synchronous engagement between students and educators (de Oliveira Dias et al. 2020; Stefanile 2020; Sutton-Brady 2021). This article is intended to address this gap in the literature, as well as the gap in comparing online quizzes with gamification.

Background and Methodology

The students who participated in this project were enrolled in one of two courses: UTW2001S, Masculinities on Film (“language module”) at the Centre for English Language Communication (CELC), and LL4209V/LL5209V/LL6209V, Legal Argument & Narrative (“law module”) at the Faculty of Law. The platform was initially created for UTW1001S, Women in Film (“original module”), a first-level Ideas and Exposition Module (IEM) related to the language module, and was migrated to the second-level language module and the law module simultaneously. Transferring the original module to the language module did not pose significant challenges, as they were both language modules, had similar disciplinary content, used similar pedagogical methods via the same lecturer, and were offered in a similar setting in a Residential College classroom in University Town. What distinguishes the first and second level of IEM is the difficulty of the rhetoric and writing skills that students learn, with the second level requiring a higher degree of expertise with content and writing skills.

The platform’s transfer to the law module, a course with different subject matter and a different lecturer, was potentially more complex. The gamification platform was originally developed by the IEM lecturer with assistance from Playware and then refined by the IEM lecturer with the NUS Centre for Instructional Technology (CIT) for the language module, so it was implemented by an educator well versed not only in the affordances but also in the delivery of the platform (Tan 2018). For the educator in law, however, this was their first experience with a gamification platform. The law module, an entirely distinct discipline with its own learning outcomes, was also taught using different pedagogical methods and offered in a different setting, the Law Faculty. The law module addressed the use of narrative in the law, while the language module addressed the role and portrayal of women in film. Another difference between the modules may have been the kinds of students taking them. The IEM classroom within the University Town College Programme is multidisciplinary, with students coming from different faculties within NUS such as Business, the Sciences, the Arts, Engineering, Architecture and Computing. The students enrolled in the law module were more homogenous in terms of disciplinary knowledge, with all students in the law stream.

While these variants might pose challenges in the migration of the gamification platform, what the language and law modules have in common is the need for students to engage in pre-reading before coming to class. Retrieval or repeated testing in general has been found to produce a large positive effect on learning (Chan and McDermott 2007; Karpicke and Roediger 2008; Roediger and Butler 2012; Weinstein et al. 2010), and gamification has been established as a strategy that can fulfill that purpose in testing and aiding students’ comprehension of pre-reading material. Connolly et al. identified twenty-six published reports where learning games led to knowledge acquisition or content understanding (Tan 2018; Connolly et al. 2012, 669). Students’ comprehension of pre-reading materials is pertinent to both modules, as students’ abilities to think critically and contribute to seminars depend largely on their engagement with core reading materials prior to class. Each IEM module typically has ten or more core readings, and a much larger supplementary list, while the law module normally has fifty pages of assigned readings for every class meeting. Having students prepare for seminars beforehand via pre-reading and retrieval would enhance their learning experience within the classroom, affording them a level of competence that an educator can build upon and develop further through a flipped classroom or other strategies that promote critical thinking. Students who have prepared by reading will more easily achieve the higher-level learning outcomes resulting from critical thinking, i.e. to “analyze, evaluate and create” per Bloom’s taxonomy (Anderson and Krathwohl 2001, 61).

The online, digital game migrated in this study, ‘The Protégé,’ was initially built for the original module, where it tested students’ understanding of concepts in pre-reading materials that students would then apply in class (Tan 2018). Further iterations developed into a version in which more thorough instant feedback, for correct as well as wrong answers, was delivered together with additional content, such as relevant articles and videos, for an improved learning experience. It is this updated version that was employed in the language module and migrated to the law module in 2018. The class of forty students in the law module was given the option of participating in the study and awarded additional class participation points for doing so. Thirty-seven of the forty students participated; they were undergraduates mostly in their early twenties and mainly from Asia. The study was approved by the NUS Internal Review Board, and informed consent was provided by all participants.

Given this article’s focus on the effectiveness of migrating a gamification platform to a different discipline, only the data gathered from the law students are analyzed here. The research questions addressed by this study in gamification platform migration were:

  1. What were students’ preferences for gamification as a learning strategy compared to traditional online quizzes?
  2. What distinct affordances of the gamification platform and of traditional online quizzes did students identify as helpful to their learning?
  3. How do the results of the first two questions correlate with the financial and time demands required to use gamification as a learning strategy?

Development Stages of the Gamification Platform

The development of the gamification platform for the study involved three phases. In the first phase, the initial version of the platform was made based on the requirements discussed between the lecturer of the original language module and the design team. In Phase Two, refinements were made to the platform based on feedback from both educators and student users of the gamification platform in the original course. Phase Three is what this paper is concerned with: the migration of the gamification platform from the language module to the law module.

Phase One

Figure 1 shows the process of the project’s development in Phase One, which took about three months to develop.

Stages of the gamification project. Analysis took place in June, design in July, development over July and August, implementation from August to December, and the gathering of feedback the following January. Phase One, the creation of the gamification platform, took three months.
Figure 1. Project timeline for Phase One.

To provide students with a better user experience, the following general design elements were considered and incorporated in Phase One:

1) Extrinsic and intrinsic motivation elements and self-directed learning

Acquiring knowledge independently can be quite a challenge for some learners, who may lack the motivation to complete tasks like reading articles, viewing videos, and doing quizzes. To increase extrinsic motivation to complete the task, learners are introduced to a stress-free environment with a reward each time they complete one stage of the quiz. Rather than the points, badges, or leaderboards that are the norm in reward-based gamification (Deterding 2012), the learner is motivated by congratulatory messages for completing each level, as well as a gradual poster reveal as they move toward the finish. The elements that encourage intrinsic motivation align with Ryan and Deci’s paradigm of competence, autonomy, and relatability (Ryan and Deci 2000). Competence is facilitated as students gain mastery of the reading while progressing through the gamification platform. They are able to meet the challenge because when every wrong move is explained, the right answer becomes evident: high-quality instant feedback reframes a wrong answer as a path to success, and students’ effort to correct their mistakes is rewarded via progression in the platform. Autonomy occurs in the selection of the avatar protagonist, as well as in other choices open to the user, e.g. whether to begin the quiz or access the additional material. Moreover, the student relates to the platform because mastery in the platform parallels mastery of the reading material: in order to progress and succeed, the student must in effect understand the key concepts of the reading. Relatability is also injected into the platform via the presence of the lecturer at the start, encouraging students to prepare for class.

2) Use of consistent icon design and color for object recognition

Some learners may not realize that certain objects are clickable items and may therefore miss relevant content. Color changes among clickable objects provide such prompts, and these colors are kept constant so as not to confuse learners about different functions. Icons are similarly kept consistent to help users learn what each icon represents.

3) Chunk content into small segments

It is important that the learner not be affected by information overload. Chunking, the practice of dividing content into smaller, manageable pieces, helps the learner identify with the ideas within the content and makes it easier to organize and synthesize the information provided.

4) Design as a template for easy re-use for different modules

Designing e-learning content can be time consuming and costly, and the development process regarding the nature of content and type of interface is potentially endless. To speed up the process and ensure easy transfer of content, the gamification platform interface is designed as a template into which any creator can insert and replace content easily.

At the end of Phase One, feedback was gathered from students and colleagues in the original course to find out how they felt about using the gamification platform. Most of the feedback pointed to how the game platform encouraged reflection and was effective as a learning tool:

I liked the part where we were not allowed to move on until we got the correct answer cause it made me think and reflect on the things I learnt to make sure that it really stuck. (Student A)

Loved the challenge of the readings, students would know after first try that their chances of trial and error would be quite futile. (Student B)

However, there were also complaints:

Fun to play but unable to link how the game storyline goes with content. (Student C)

Overall no technical issues aside from the audio sync on the trailer and the confusing navigation terms for the map. (Student D)

Phase Two

Based on the feedback gathered, enhancements and bug fixes were made in Phase Two. In particular, the game storyline was simplified and improvements were made to game navigation. Figure 2 shows the development process of Phase Two. As most of the assets needed had been created during Phase One, development in Phase Two was significantly shorter, taking about one month compared to the three months needed for Phase One.

Timeline for Phase two of the gamification project. After gathering feedback from users, creation of new context and testing, the new gamification platform takes one month to launch.
Figure 2. Project timeline for Phase Two.

Phase Three

At the end of Phase Two, all the gamification assets had been created and gameplay was set. Phase Three development was therefore significantly shorter in duration: transferring the content from one module to another took less than one week, greatly reducing the time and resources needed. Figure 3 shows the development process of Phase Three.

Timeline for Phase three, the transferring of the gamification platform. This took 4 days.
Figure 3. Project timeline for Phase Three.

Migrated Gamification Platform

The gamification platform used for the law module in Phase Three was digital, utilizing two-dimensional graphics. The narrative was about an undergraduate settling down to study. Students were able to choose from two avatars, male or female. The voices of the avatars were also gendered.

Upon entering the game interface, students made a choice to either interact with the gameplay environment or access some pre-reading materials first before attempting the test questions. The test questions, crafted by the law lecturer, targeted key concepts of the reading that were scheduled for discussion later in class, as well as common misconceptions that students might have of the reading based on the lecturer’s prior experience.

Game interface of a female student avatar making her mind up about what she is going to eat. The text reads “decisions, decisions. Shall I have a snack or get my reading done for class?”. She is in a typical room in a college residence. There is a laptop on the desk with a drink and chocolate next to it. There is a book shelf filled with books on the wall.
Figure 4. The game interface.

The main section of the gamification platform consisted of three levels, each offering a series of questions with three possible answers. To assist with comprehension, students had access to materials relevant to the questions being asked. Engagement with these additional elements was voluntary; students could either access the material or go directly to the question.

Game interface of a male student avatar making a choice between a reading and a quiz. There are two green buttons that offer this selection.
Figure 5. Choice between reading material and quiz.
A page from the reading. It is entitled “Lawyer as Artist: Using Significant Moments and Obtuse Objects to Enhance Advocacy”.
Figure 6. Reading material.

Students had an infinite number of attempts to answer the questions correctly, but they could only move to the next level in the platform when all their answers were correct. A sample question with answers is provided in Figure 7.

Quiz question showing 3 options for the user to select in answer to the question: “Why might you want to try and use visual images, as opposed to legal argument and storytelling?”
Figure 7. Sample question with answer options.

All answers triggered instant feedback with detailed explanations, whether the answer was correct or incorrect.

Quiz instant feedback showing the explanation that is given when a wrong answer has been selected.
Figure 8. Instant feedback.

A wrong answer within the set would bring students back to the start of the level to attempt the same questions again, this time with the answers in a different sequence. A correct answer to every question in the level cleared that level for students and produced an encouraging message.
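The level mechanic described above (unlimited attempts, instant feedback on every selection, and a restart with reshuffled answers after any mistake) can be sketched in Python. The question data and the `answer_fn` callback below are illustrative assumptions for the sketch, not the platform's actual content or implementation:

```python
import random

# Hypothetical question data: each level is a list of
# (prompt, options, correct_index, explanation) tuples.
LEVEL_ONE = [
    ("Why might you want to use visual images, as opposed to legal argument alone?",
     ["They are purely decorative",
      "They can carry informational as well as symbolic meaning",
      "They fill space on a slide"],
     1,
     "Images can convey informational and symbolic meaning at once."),
]

def play_level(questions, answer_fn):
    """Run one level: every answer triggers instant feedback, and a wrong
    answer restarts the level with the options reshuffled. Returns the
    number of attempts needed to clear the level."""
    attempts = 0
    while True:
        attempts += 1
        cleared = True
        for prompt, options, correct, explanation in questions:
            order = list(range(len(options)))
            random.shuffle(order)                  # answers appear in a new sequence
            shuffled = [options[i] for i in order]
            choice = answer_fn(prompt, shuffled)   # learner picks a shuffled option
            print(explanation)                     # instant feedback, right or wrong
            if order[choice] != correct:
                cleared = False
                break                              # back to the start of the level
        if cleared:
            print("Level cleared - part of the poster is revealed!")
            return attempts
```

A learner who answers every question correctly clears the level in one attempt; any mistake sends them back through the whole level with the answer order changed, so rote memorization of positions cannot substitute for understanding.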

Game interface of a male student avatar looking delighted at having passed a stage in the quiz.
Figure 9. Stage complete.

Each time students completed one of these levels, part of an image was revealed. Clearing all three levels revealed the entire image, a surprise reward whose relevance was explained fully in the following class.

Game interface of the picture reveal after all quiz questions have been answered correctly. It is a picture of a popular local dish in Singapore, chicken rice. The words read: “Chicken rice – in Singapore an image with informational as well as symbolic meaning”.
Figure 10. Poster reveal.

The gamification platform devised in this process of development and migration aligns with the affordances demonstrated in relevant studies in the education field (Dicheva et al. 2015, 78–79; Tan 2018, 147). The three levels offer a mix of goals and challenges via the progressive reveal, rapid feedback, visible answer status, and recognition of correct answers via a congratulatory message once each level is cleared. Freedom of choice existed in that no action was forced at any stage. Students also had freedom to fail, a storyline, and social engagement. Social engagement, however, is present in the platform only in its barest form and functions primarily as a frame for the learning activity: when the student clicks on a button before entering the testing stage, a voice message from the educator is triggered. This aspect of the platform will be developed more fully in later versions.

Gamification Platform and Quiz Participation in the Law Module

After the platform was revised with questions and answers appropriate to the law module, the game was tested in that module together with an online quiz also produced by the law lecturer. Law students were prepared for the gamification platform and the quiz in the following ways: students were tasked to read an article in preparation for class and encouraged to play the game before coming to class. In the week prior to this activity, they were given another reading of comparable difficulty to prepare for class and directed to a simple online quiz to test their comprehension of it. The online quiz had questions of the same nature and number, testing key concepts and common misconceptions. Unlike the gamification platform, the quiz did not allow multiple tries on a question unless the entire list of questions was attempted again, and students were told only at the end whether their answers were correct or incorrect, without detailed explanation.

After students had experienced both the gamification platform and the traditional online quiz, they were given an online feedback form to submit their responses. In order to compare the effectiveness of the gamification platform and the traditional quiz on students’ perceptions of their own learning, quantitative and qualitative data were collected via a survey. The quantitative part of the survey consisted of four questions that asked students to rank the scaffolding strategies they found most and least helpful to their learning. The qualitative part of the survey consisted of three questions, which asked students to explain their rankings in the form of written responses; the last of these asked which aspects of the gamification platform needed improvement. The complete survey can be found in the appendix at the end of this article.

Results and Discussion

The quantitative results regarding the first research question, students’ preference for gamification as a learning strategy compared to the traditional online quiz, were surprising (Figure 11). There was no majority preference for the game platform; students were roughly split between the two platforms, with slightly more preferring the traditional quiz.

A pie chart showing student preference for the game (46 percent) and quiz (54 percent).
Figure 11. Quantitative results regarding platform preference.

Qualitative feedback addressed the second research question, students’ identification of the distinct affordances of the gamification platform compared to the traditional online quiz in terms of how helpful each platform was to their learning. The feedback raised two points. First, students identified certain features as helpful that were not necessarily tied to either the gamification platform or the quiz alone. For example, nine of the thirty-seven students identified instant feedback on right and wrong answers as extremely helpful.

I appreciate how after selecting the right answer, an explanation (plus reference to the readings) pops up on the screen to confirm my thoughts and offer some additional insight where I might not have seen or registered it entirely before. (Student A)

The Gamification Quiz was most helpful because the questions were tougher and there was an instant response given with justification. (Student B)

Close to 40 percent of the students, 14 out of 37, expressed spontaneous appreciation for both platforms (per Student C, “To be honest, both exercises were helpful in their own way”). Second, even when required to identify a preference, most students ascribed similar characteristics to the quiz and similar characteristics to the gamification platform, regardless of which platform they preferred. These characteristics were expressed in a positive or negative fashion, depending on whether the participant liked or disliked them. For example, two typical students described their preference for the gamification platform or the quiz in the following ways:

Student preference for: Gamification platform
Description of the platform: “I felt that the interactive nature and interface of the quiz made the exercise more engaging and thus allowed me to retain the information more effectively!”
Description of the quiz: “As opposed to the Gamification…, the…quiz felt relatively more dry and less engaging to the audience.”

Student preference for: Quiz
Description of the platform: “The multimedia was a tad distracting for me”
Description of the quiz: “I liked the clean layout of the quiz”
Table 1. Sample student preference comparisons.

Students who preferred the gamification platform described it using words such as “interesting,” “engaging,” “interactive,” a “breath of fresh air as it made the quiz a lot more interactive,” and “fun” (“With the avatar and rather colorful yet user-friendly interface, it made the quiz quite fun to attempt! :)”). These students described the quiz as boring, “less interactive and … not as fun.” Students who preferred the quiz described it using words such as “straightforward,” “quick” and “to the point,” “practical,” “simple,” and “clean.” These students described the gamification platform as distracting, more tedious to work through, unnecessary (and a little bit juvenile), and corny. Students therefore ascribed similar qualities to the gamification platform and the quiz, but they expressed a distinct preference for one or the other of these sets of qualities. This aspect of the feedback suggests that the distinct affordances of the gamification platform and quiz platforms appealed to different students, depending on how these students preferred to learn.

One way of understanding student preferences for either the gamification platform or the quiz is through the lens of learning styles. Fleming and Mills (1992) proposed the VARK model, which suggests four learning styles: visual, aural, reading/writing, and kinesthetic. Kinesthetic refers to the “perceptual preference related to the use of experience and practice (simulated or real),” and it “encodes a preference to learn through experience, demonstrations, practice and simulations” (Barata et al. 2015). As a simulation, the gamification platform used in this study may constitute a kinesthetic learning experience, which is not the primary learning style for some students. The VARK model has been criticized (Newton et al. 2021), but to explore this connection further, future research could establish students’ self-reported preferred learning styles and then examine whether they correlate with a preference for or aversion to a game learning platform.

The results reported here also help answer this article’s third research question: how the survey results correlate with the financial and time demands of using gamification as a learning strategy. The quantitative feedback, in which only half of the students preferred the gamification platform, could be interpreted as lessening the attractiveness of gamification. However, these results suggest that as a group, students held distinct preferences for different platforms. If universities are to offer multiple platforms that respond to different student learning styles, the costs of those platforms must therefore be reasonable. Phase Three above described the savings of time, energy, and money that gamification migration offers, which suggests that the strategy of gamification migration addresses learning needs in an optimal manner. While the study does not measure students’ learning gains in the form of grades or performance in assignments, the two courses were overall very well received, as reflected in high scores on official university student feedback. Moreover, when the gamification platform was used as an online scaffolding strategy prior to face-to-face classroom time, the lecturer perceived that students had better comprehension of the selected readings, which should in turn translate into better student performance in assessments (Vygotsky 1978; Jones 2019; Kruiper et al. 2022; Syakur and Aziz 2020).

Conclusion

This article has observed the potential benefits of gamification in educational platforms as well as its drawbacks, such as the time and financial resources required. These resources are substantial, and the sophisticated games students are accustomed to playing create additional pressure to invest even more; for instance, even though the game used here was produced by design professionals, one student characterized the gameplay as “quite rudimentary.” These financial and resource drawbacks prompted the authors to investigate the feasibility of migrating a gamification platform from a language module to the very different subject matter of law. As described in this article, the migration process yielded considerable savings of time and resources for the lecturer and the university. The quantitative feedback could be interpreted as detracting from the value of migration, as only half of the students preferred the gamification platform. On the other hand, these results also indicate that student preferences were split between multiple, different platforms. If universities choose to implement multiple platforms, this supports even more strongly the use and development of migration strategies to control costs. Future studies could investigate this claim by exploring whether the benefits of dedicating resources to gamification platforms outweigh resource constraints, through a comparison of student learning gains. Similarly, data on student performance and learning styles could help measure the practicality of offering multiple forms of assessment to accommodate students’ different learning styles in this context.

References

Anderson, Lorin W., and David R. Krathwohl. 2001. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. London: Longman.

Barata, Gabriel, Sandra Gama, Joaquim Jorge, and Daniel Gonçalves. 2015. “Gamification for Smarter Learning: Tales from the Trenches.” Smart Learning Environments 2, no. 10.

Beale, Ivan L., Pamela M. Kato, Veronica M. Marin-Bowling, Nicole Guthrie, and Steve W. Cole. 2007. “Improvement in Cancer-related Knowledge Following Use of a Psychoeducational Video Game for Adolescents and Young Adults with Cancer.” Journal of Adolescent Health 41, no. 3: 263–270.

Chan, Jason C. K., and Kathleen B. McDermott. 2007. “The Testing Effect in Recognition Memory: A Dual Process Account.” Journal of Experimental Psychology: Learning, Memory, and Cognition 33, no. 2: 431–437. http://dx.doi.org/10.1037/0278-7393.33.2.431.

Clement, J. 2022. “Online Gaming - Statistics & Facts.” Statista (May 9). https://www.statista.com/topics/1551/online-gaming/.

Connolly, Thomas M., Elizabeth A. Boyle, Ewan MacArthur, Thomas Hainey, and James M. Boyle. 2012. “A Systematic Literature Review of Empirical Evidence on Computer Games and Serious Games.” Computers & Education 59, no. 2: 661–686.

de Oliveira Dias, Murillo, Raphael de Oliveira Albergarias Lopes, and André Correia Teles. 2020. “Will Virtual Replace Classroom Teaching? Lessons from Virtual Classes via Zoom in the Times of COVID-19.” Journal of Advances in Education and Philosophy 4, no. 5: 208–213.

Deterding, Sebastian. 2012. “The Gameful Classroom.” Workshop at Games, Learning & Society 8.

Dicheva, Darina, Christo Dichev, Gennady Agre, and Galia Angelova. 2015. “Gamification in Education: A Systematic Mapping Study.” Journal of Educational Technology & Society 18, no. 3: 75–88.

Field Level Media Reuters. 2020. “Report: Gaming Revenue to Top $159B in 2020.” Reuters (May 12). https://www.reuters.com/article/esports-business-gaming-revenues-idUSFLM8jkJMl.

Fleming, Neil D., and Colleen Mills. 1992. “Not Another Inventory, Rather a Catalyst for Reflection.” To Improve the Academy 11: 137–155.

Huang, Wendy Hsin-Yuan, and Dilip Soman. 2013. “Gamification of Education.” Research Report Series: Behavioural Economics in Action 29, no. 4: 37.

Jones, Steve. 2003. “Let the Games Begin: Gaming Technology and College Students.” Pew Research Center (July 6). http://www.pewinternet.org/2003/07/06/let-the-games-begin-gaming-technology-and-college-students/.

Jones, Jennifer A. 2019. “Scaffolding Self-regulated Learning through Student-generated Quizzes.” Active Learning in Higher Education 20, no. 2: 115–126.

Karpicke, Jeffrey D., and Henry L. Roediger III. 2008. “The Critical Importance of Retrieval for Learning.” Science 319: 966–968.

Kruiper, Stephanie M. A., Martijn J. M. Leenknecht, and Bert Slof. 2022. “Using Scaffolding Strategies to Improve Formative Assessment Practice in Higher Education.” Assessment & Evaluation in Higher Education 47, no. 3: 458–476.

Maican, Catalin, Radu Lixandroiu, and Cristinel Constantin. 2016. “Interactivia.ro: A Study of a Gamification Framework Using Zero-cost Tools.” Computers in Human Behavior 61: 186–197.

Manzano-León, Ana, Pablo Camacho-Lazarraga, Miguel A. Guerrero, Laura Guerrero-Puerta, José M. Aguilar-Parra, Rubén Trigueros, and Antonio Alias. 2021. “Between Level Up and Game Over: A Systematic Literature Review of Gamification in Education.” Sustainability 13, no. 4: 2247. https://doi.org/10.3390/su13042247.

Newton, Philip M., Hannah Farukh Najabat-Lattif, Gabriella Santiago, and Atharva Salvi. 2021. “The Learning Styles Neuromyth Is Still Thriving in Medical Education.” Frontiers in Human Neuroscience 15.

Nte, Sol, and Richard Stephens. 2008. “Videogame Aesthetics and e-Learning: A Retro-looking Computer Game to Explain the Normal Distribution in Statistics Teaching.” In 2nd European Conference on Games-Based Learning: 341–348.

O’Donovan, Siobhan, James Gain, and Patrick Marais. 2013. “A Case Study in the Gamification of a University-Level Games Development Course.” In Proceedings of the South African Institute for Computer Scientists and Information Technologists Conference, 242–251. New York: Association for Computing Machinery. https://doi.org/10.1145/2513456.2513469.

Ortiz, Chris C. 2016. “Game on a Dime.” Talent Development 70, no. 12: 76. https://www.td.org/magazines/td-magazine/game-on-a-dime.

Papastergiou, Marina. 2009. “Digital Game-Based Learning in High School Computer Science Education: Impact on Educational Effectiveness and Student Motivation.” Computers & Education 52, no. 1: 1–12.

Roediger III, Henry L., and Andrew C. Butler. 2011. “The Critical Role of Retrieval Practice in Long-Term Retention.” Trends in Cognitive Sciences 15, no. 1: 20–27.

Ryan, Richard M., and Edward L. Deci. 2000. “Self-determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-being.” American Psychologist 55, no. 1: 68–78.

Sánchez-Mena, Antonio, and José Martí-Parreño. 2017. “Drivers and Barriers to Adopting Gamification: Teachers’ Perspectives.” Electronic Journal of e-Learning 15, no. 5: 434–443.

Stefanile, Adam. 2020. “The Transition From Classroom to Zoom and How it Has Changed Education.” Journal of Social Science Research 16, no. 1: 33–40.

Sutton-Brady, Catherine. 2021. “Zooming through a Pandemic: A COVID-19 Approach to Teaching.” Marketing Education Review 31, no. 3: 256–261.

Sward, Katherine A., Stephanie Richardson, Jeremy Kendrick, and Chris Maloney. 2008. “Use of a Web-based Game to Teach Pediatric Content to Medical Students.” Ambulatory Pediatrics 8, no. 6: 354–359.

Syakur, Abd, and Rosidi Azis. 2020. “Developing Reading Learning Model to Increase Reading Skill for Animal Husbandry Students in Higher Education.” Britain International of Linguistics Arts and Education (BIoLAE) Journal 2, no. 1: 484–493.

Tan, Lynette Yuen Ling. 2018. “Meaningful Gamification and Students' Motivation: A Strategy for Scaffolding Reading Material.” Online Learning 22, no. 2: 141–155. https://doi.org/10.24059/olj.v22i2.1167.

The World Bank Group. “Population, Total.” Accessed July 12, 2022. https://data.worldbank.org/indicator/SP.POP.TOTL.

Vygotsky, Lev Semenovich. 1978. “Interaction Between Learning and Development.” In Mind in Society: The Development of Higher Psychological Processes, edited by Michael Cole, Vera John-Steiner, Sylvia Scribner, and Ellen Souberman, 79–91. Cambridge: Harvard University Press.

Weinstein, Yana, Kathleen B. McDermott, and Henry L. Roediger III. 2010. “A Comparison of Study Strategies for Passages: Rereading, Answering Questions, and Generating Questions.” Journal of Experimental Psychology: Applied 16, no. 3: 308.

Appendix

Mixed-methods survey used in this study

| Question | Quantitative | Qualitative |
| --- | --- | --- |
| 1: Which of the 2 scaffolding strategies (IVLE quiz, Gamification quiz) did you experience? | X | |
| 2: Which of the 2 strategies did you find helpful (more than one response is possible)? | X | |
| 3: Which of the 2 strategies did you find the most helpful (only one response is possible)? | X | |
| 4: Please explain why you found this strategy the most helpful. | | X |
| 5: Which of the 2 strategies did you find the least helpful (only one response is possible)? | X | |
| 6: Please explain why you found this strategy the least helpful. | | X |
| 7: Please let us know what you liked and what you thought could be improved on the gamified quiz of the Eyster reading. If you have other comments regarding the IVLE quiz, please feel free to add them at the end. Thank you very much for all your contributions, they are greatly valued. | | X |

About the Authors

Lynette Tan is the Director of Studies and Associate Director of Student Life at Residential College 4 (RC4), National University of Singapore (NUS), where she is at present Senior Lecturer. She teaches systems thinking philosophies and tools that empower students to be humane change agents as they navigate global issues that are critical in the twenty-first century. Lynette has also published on gamification in higher education, student motivation, and higher order learning skills. In recognition of excellence in teaching, she has been placed on both the Annual Teaching Excellence Award and Residential Colleges Teaching Excellence Award honor rolls at NUS.

Helena Whalen-Bridge is Associate Professor, National University of Singapore, Faculty of Law. Helena’s areas of teaching and research are access to justice, professional regulation and education, and legal narrative. She has received NUS Teaching Excellence Awards and is an Expert in UNODC’s Education for Justice project. Helena’s publications in legal pedagogy include “The Rhetoric of Corruption and The Law School Curriculum: Why Aren’t Law Schools Teaching About Corruption?” (2018), “A Common Law Fly on the Transsystemic Wall: Observing the Integrated Method at McGill Faculty of Law” (2016), and “We Don’t Need Another IRAC: Identifying Global Legal Skills” (2014).

Wee Ying Qin is an Instructional Designer specializing in adult education, learning management systems (LMS), curriculum development, and e-learning. He has a master’s degree in instructional design and technology. Throughout his career, he has developed a passion for instructional design, edtech, and design. He is skilled in analyzing learning needs, designing and developing content, and evaluating the effectiveness of training programs. He is passionate about creating innovative and interactive learning experiences that incorporate gamification elements to help learners stay engaged while achieving their desired learning outcomes.

Attribution-NonCommercial-ShareAlike 4.0 International

This entry is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.
