Enabling Interactivity through Design: Outcomes from a Gamified Health Insurance Onboarding Course

Nicole Buras, Lauren Merrild, & WooRi Kim
DOI:10.51869/102/nb
Gamification, Understanding by Design Framework, Interactivity, Workplace Learning
The purpose of this study was to examine new hires’ learning outcomes and perceptions of an interactive gamified e-learning course in a health insurance organization. The conceptual framework drew from literature surrounding the Understanding by Design process and the relationship between engagement and interactivity in e-learning. The researchers also explored the role of course design in supporting interactivity. The study used a pre-test, a post-test, and a survey administered to 121 new hires to examine learning outcomes and perceptions of a gamified e-learning course. To provide an in-depth understanding of the quantitative data, the authors followed up with 10 semi-structured interviews. Results from this study highlighted that participants experienced high levels of engagement and understanding of foundational insurance information.

Introduction

Asynchronous web-based learning is a prevalent workplace learning strategy used by learning and development professionals (Lester et al., 2013) who turn to the modality to adapt to changing technology and a flexible workforce (Akdere & Conceicao, 2006; Long & Smith, 2003). Within this context, gamification has emerged as a popular tool for enhancing learner engagement in both instructor-led and web-based courses (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015). There is limited research, however, on the effect of gamification on engagement and knowledge gains. Additionally, there is limited research on design strategies to enable engagement in asynchronous gamified courses. To add to the body of literature, the authors explored the impact of a gamified course design on engagement and knowledge gained in an asynchronous gamified course delivered in a health insurance organization.

Gamification is the practice of incorporating elements of game design into a course to increase engagement and motivation (Dichev & Dicheva, 2017). Game elements include, but are not limited to, challenges, points, leaderboards, leveling up, and badges. A course is “gamified” when it incorporates elements of games; for this reason, gamification and gamified courses are used as companion terms hereafter. The authors employed gamification techniques in a course titled Member Journey that was delivered asynchronously to new hires on their first day of functional job-based training. The gamification techniques used included challenges, achievements, leveling up, rules, and goals.

The course covered the customer life cycle from a customer’s perspective. By the end of the course, learners were expected to be able to define terminology, categorize business units, and order steps in the customer experience. In the course, two guides helped the learners through four levels of challenges with increasing complexity. These challenges were presented as destinations in the Member Journey, and learners received stamps in their passports when a new destination was reached. At each destination, the learners completed a challenge; when a challenge was passed, the learner “leveled up” to the following destination. Each new destination contained a challenge drawing on prerequisite knowledge from the previous destinations. The learner’s goal was to accumulate all stamps in their passport. To explore the impact of course design on engagement and performance, the authors incorporated features to enable the gamification in the course, such as narrative, plot, rules, increasing complexity, and characters.
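
To make the mechanic concrete, the sketch below models this passing-gated progression (destinations, stamps, leveling up) as a small data structure. It is illustrative only: the actual course was authored in an e-learning tool, and the class names, passing scores, and destination labels here are hypothetical stand-ins rather than artifacts of the course.

```python
# Illustrative sketch only: the course itself was built in an e-learning
# authoring tool, not coded by hand. All names (Destination, Passport) and
# passing scores are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Destination:
    name: str               # e.g., "Enrollment"
    passing_score: float    # minimum challenge score needed to earn the stamp


@dataclass
class Passport:
    destinations: list          # ordered destinations of increasing complexity
    stamps: set = field(default_factory=set)
    level: int = 0              # index of the destination the learner is on

    def attempt_challenge(self, score: float) -> bool:
        """Record a challenge attempt; level up only when the score passes."""
        current = self.destinations[self.level]
        if score < current.passing_score:
            return False                      # learner retries the same destination
        self.stamps.add(current.name)         # stamp the passport
        if self.level < len(self.destinations) - 1:
            self.level += 1                   # unlock the next, harder destination
        return True

    @property
    def complete(self) -> bool:
        return len(self.stamps) == len(self.destinations)


journey = Passport([
    Destination("Insurance Foundations", 0.7),
    Destination("Enrollment", 0.7),
    Destination("Benefits", 0.7),
    Destination("Claims and Appeals", 0.7),
])
journey.attempt_challenge(0.9)   # earns the first stamp and unlocks Enrollment
```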

The conceptual framework drew from literature addressing the relationship between engagement and game elements in e-learning and how course design promotes greater engagement and understanding of course content in gamified courses. This study is beneficial for field practitioners and scholars who are interested in strategies to enhance gamification approaches.

Literature Review

Importance of the Design Process

Technology-based learning represents a continuing growth trend in businesses (Ho, 2017); however, critical discussions around the effectiveness of different delivery methods have created several schools of thought (Bell & Federman, 2013). One school of thought identified by Bell and Federman (2013) maintains that the effectiveness of e-learning instruction depends upon the rigor of the instructional design process. The perception of this group is that the most appropriate delivery method varies based on the learning context, content, and audience (Bell & Federman, 2013). The authors also believe that pedagogical approaches are an avenue to learning and not learning itself, and that the success of a learning approach depends upon a thorough instructional design process that considers organizational objectives, audience, technology, and learning environment.

The Relationship Between Engagement and Interactivity in E-Learning

If e-learning is determined to be an effective approach, it is important to consider participant engagement. Engagement in learning refers to the level of positive emotional response and commitment a learner feels while completing learning tasks (Young, 2010). This is critical because research has indicated that engagement in learning influences critical thinking, determination, and performance (Young, 2010). Performance, in this context, refers to competence in a role or task and determination to complete role responsibilities or tasks.

Research supports the idea that increased learner engagement can lead to increased knowledge gains (Bloom, 1956; Nkhoma et al., 2014). For some researchers, learning engagement has such a significant positive effect on knowledge and performance that the level of engagement influences the breadth and depth of the knowledge gained (Nkhoma et al., 2014). For others, the effects of learning engagement can be captured in a hierarchy of engagement levels and outcomes (Bloom, 1956).

An emergent area of research centers on the need for interactivity to promote engagement in e-learning courses (Zhang & Zhou, 2003). In a comparison of learners in an interactive e-learning environment to those in the traditional classroom environment, Zhang and Zhou (2003) found that interactive courses increased flexibility and engagement and achieved higher levels of knowledge gains and participant satisfaction. 

Interactivity in e-learning may refer to any action in which the learner influences the learning program through inputs or actions, such as text entry, clicks, and rollovers. Additionally, interactivity goes beyond physical actions (Hong et al., 2014; Woo et al., 2010). When the definition of interactivity is expanded to include any components of two-way communication, then complex cognitive interactions or online social exchanges can be considered e-learning interactivity (Hong et al., 2014). In fact, intangible interactions such as games, stories, and other strategies were found by Chatterjee (2010) to increase engagement at a greater rate than physical components. 

Interactivity in Gamified Courses 

Gamification represents an interactive approach that is used by organizations to engage employees and reduce knowledge gaps (Calvo & Reio, 2018). Underlying game components activate learner engagement through cognitive interactivity, but gamification is distinct in also requiring physical action. For example, physical interactions in an e-learning game would include clicks, drag-and-drop, text entry, and other one-way approaches to influence an e-learning module. Cognitive interactions are reciprocal between the e-learning solution and the learner; examples include problem-solving scenarios and increasingly difficult challenges. By incorporating both mental and physical components of interactivity, gamified e-learning achieves advanced levels of interactivity (Chang et al., 2018).

Course Design to Support Interactivity Through Gamification

In the Member Journey course, storytelling techniques were used to support the gamification. Storytelling has been shown to increase interactivity and engagement in a course (Baldwin & Ching, 2016) and to promote information storing and comprehension (Novak, 2015). These techniques included a plot and support for the overarching plot through characters, a journey, conflict, and resolution (Baldwin & Ching, 2016). Characters in particular assisted in the development of the story and guided learners through plot points, a technique that has been shown to increase engagement (Smeda et al., 2010). The authors examined the impact of using storytelling to enable the interactivity and engagement of gamification in the Member Journey course.

Engagement and Learning Achievement with Gamified E-Learning

Game elements such as challenges and awards are powerful tools for promoting motivation and engagement through interactivity (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015) and interactive storytelling can increase engagement and comprehension (Baldwin & Ching, 2016; Novak, 2015). Gamification, however, also directly contributes to knowledge gains by increasing engagement (Calvo & Reio, 2018). Calvo and Reio (2018) found higher levels of engagement linked to higher knowledge gains in a gamified course administered to professionals in the tourism industry. The more engaged learners were in a gamified course, the greater the level of knowledge gains.

Jabbar and Felicia (2015) found similar results in a systematic review of engagement and knowledge gains in gamification. They found that gamified learning increased knowledge attainment and is most effective when components contributing to emotional and cognitive engagement are present in the course design. Combining multiple types of interactivity with an effective design has the greatest impact on engagement and knowledge gains (Jabbar & Felicia, 2015). The present article also examines the effects of an interactive gamified course design on understanding of content.

Methods

Purpose of the Study

Limited research is available on design approaches to facilitate engagement in asynchronous gamified courses. This study adds to the body of knowledge for scholars and practitioners by exploring the impact of a gamified course design on engagement and understanding of course content in an asynchronous gamified new hire course titled Member Journey. The course was delivered on the first day of functional job-based training at a health insurance organization.

Through quantitative approaches, this study answers the following research questions:

1. Does the gamified course enable learners to reach training goals?
2. What are the significant factors (healthcare experience, narration, course activities, and engagement) that predict employee post-test scores?
3. What are the employees’ perceptions of the course, gamification, interactive elements, and interaction enablers?
4. How is the impact of gamification on employee knowledge gains different between lower performers and higher performers?
5. How is the time spent on the pre- and post-test different between lower performers and higher performers?
6. How are the employees’ perceptions of the course, gamification, interactivity, and interaction enablers different between lower performers and higher performers?

Research Design

The study primarily employed a quantitatively driven approach. The researchers administered a pre- and post-test to consider participants’ understanding of course content from the intervention of the gamified course. A survey was used to gain insight into the participants’ views and attitudes on the instructional approaches. After collecting and analyzing the quantitative data, the authors conducted semi-structured interviews to delve deeper into the topics and provide rich descriptions of participants’ attitudes (Creswell, 2014). Participants were assured of anonymity; any names used in the article to enhance the narrative are pseudonyms.

Setting

The United States-based health insurance company employs over 23,000 people; more narrowly, the specific division this project focused on includes 9,000 concierge customer service employees. In 2017-2018, leadership of the concierge customer service division determined that many new hires did not have sufficient knowledge of health insurance basics, such as terminology and insurance claim and inquiry processes. To address these knowledge gaps, instructional designers employed the Understanding by Design Framework (UbD) to inform the course design (Wiggins & McTighe, 2011). UbD is an instructional design framework composed of three stages: Stage 1 involves identifying desired results; Stage 2 involves determining assessment evidence; and Stage 3 encompasses planning the learning experience and instruction (McTighe & Wiggins, 2012).

Identify Desired Results

The desired result of the fundamental onboarding course was for employees to understand foundational elements of the business and the customer experience. Several instructor-led and e-learning modalities were considered to address this knowledge gap. To determine the most appropriate learning solution, leadership and instructional designers considered several factors. In 2017 and 2018, the health insurance instructional designers deployed physically interactive techniques, such as clickables, dragging features, text entry, and system simulations, in the organization’s e-learning courses. The courses were positively received by learners and supported the evaluation of core competencies. Expanding upon this innovation, the company sought more immersive and cognitively interactive approaches. Additionally, the business wanted to implement a course covering foundational content across multiple lines of business. Stakeholders determined that an instructor-led approach would not be an appropriate fit with available resources, so e-learning solutions were explored.

Determine Assessment Evidence

It was determined that assessment evidence would be gathered through mixed methods strategies, including a knowledge check on core competencies and a survey centered on the learner experience. The knowledge check allowed a timely measurement of knowledge gained in the course, and the survey provided insight into the Member Journey course relative to other courses and modules in the program. Success in the evaluation of learner performance would be achieved when a learner could demonstrate the effective application of the terminology, concepts, and processes learned in a different scenario and context. The scenarios outlined in the pre- and post-assessment focused on unique situations to gauge learner adaptability in new contexts. The UbD framework considers this level of adaptability to be evidence for knowledge transfer (McTighe & Wiggins, 2012). The questions in the pre- and post-assessment required learners to work through prerequisite knowledge to solve the problem. The result was a holistic assessment approach that considered the comprehension and application of concrete and abstract information and concepts.

Plan Learning Experiences and Instruction

Within the organization, Member Journey represents a complex e-learning course. The organizational standard of complex e-learning includes: (a) audio (including scripting); (b) video; (c) interactive simulations; (d) high interactivity; (e) multiple animations. The learning solution consisted of a 30-minute Storyline course informed by gamification e-learning principles. Articulate Storyline, an authoring software that promotes interactive and personalized online and mobile courses, was used to develop the course (Articulate Global, 2019). The gamified e-learning course reviewed foundational insurance topics from the member’s perspective across major topics: the purpose of insurance, retail and group plan enrollment, benefits terminology and claim submission, and appeals. The learning solution presented content sequentially in the context of the overall customer journey enabling employees to interact and work through a customer’s experience with the company.

Learners completed the course in an instructor-led computer lab setting as part of an instructor-led new-hire training program that incorporated a variety of learning approaches such as experiential activities, scenario-based exercises, e-learning courses, and others. The gamified e-learning course on health insurance basics was self-directed and completed on the first day of the program.

At the beginning of the gamified e-learning course, storytelling elements were introduced when the learners were presented with two narrators, Gil and Gidget. The two narrators explained that during the course they would be going on the customer experience journey by visiting four islands. Each island included a game-based challenge focusing on one of the major topics. Once the challenge was completed successfully, the participant received a badge in the form of a stamp on a passport. At the conclusion of the course, the facilitators reviewed and debriefed the content.

In sum, working through the UbD process revealed that a gamified e-learning approach aligned with the audience and the desired results. The approach allowed employees to work through a customer’s experience with the company and allowed the business to assess knowledge gains through an interactive process.

Participants

The researchers collected data between October and December 2018. The researchers administered a pre- and post-test and a survey to 121 employees in eight new-hire classes across the United States on the first day of onboarding. The participants were primarily under the age of 39: 57% were under 29 and 22% were between the ages of 30 and 39. Eighty-two percent of participants were female, 43% had some college, and 27% had a high school degree or equivalent. Only 11% of the participants had more than five years of experience in healthcare; in fact, 60% of participants had less than one year of previous experience. Follow-up interviews were performed to add depth to the discussion. Ten of the 121 participants were randomly selected to engage in semi-structured, one-on-one, 60-minute interviews to gain a deeper understanding of their perceptions of the gamified course. The interview protocol consisted of six questions (see Appendix B).

Procedure

The tests and survey were administered in an online format. Participants completed the pre-test prior to the Member Journey course and responded to the post-test and survey after completing the course. The pre- and post-tests were composed of four multifaceted multiple-choice questions. The tests aligned to the topics at the “destinations,” including insurance foundations, enrollment, benefits, and claims. The questions were scenario-based and challenged the learners to problem-solve as well as recall knowledge. The pre- and post-tests were equivalent in number of questions, difficulty, and subject matter. Parallel forms were used to avoid issues inherent in test-retest designs.

The survey consisted of nine multiple-choice questions and one open-ended question (see Appendix A). The survey questions centered on demographics, course perceptions, and perceptions of course quality and usability. Previous experience in health care was categorized as less than 1 year, 1-3 years, 3-5 years, and more than 5 years. Perceptions of the narration, course activities, and engagement were measured on a five-point Likert scale. Narration questions centered on whether the narrators assisted participants in understanding their role. The researchers examined whether the activities (i.e., a drag-and-drop exercise, a tic-tac-toe game, etc.) helped build understanding of concepts.

Participant responses were housed in the organization’s learning management system. The day after the pre-test, post-test, and survey were administered, the researchers began the process of accessing, cleaning, and compiling the data.

Data Analysis

Quantitative data were analyzed using SPSS statistical software. For the first research question, employee pre-test scores and post-test scores were compared using a t-test. To answer the second research question, a multiple linear regression analysis was used. Multiple regression analysis is a statistical technique for exploring the relationship between a dependent variable and any one of the independent variables while the other independent variables are held fixed (Hoffmann, 2010). Employee post-test scores were used as the dependent variable, and employee healthcare experience, perceptions of narration, course activities, and engagement were used as independent variables. Residual analysis, multicollinearity (tolerance and VIF, or variance inflation factor, values), and error terms were reviewed to check the required assumptions for the regression model.
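
The analyses were run in SPSS; for readers who want to trace the same steps, the sketch below shows a comparable paired t-test, regression, and VIF check in Python. The file name and column names are hypothetical stand-ins for the study’s variables, not the authors’ actual files.

```python
# Hedged sketch of the analysis pipeline described above (SPSS was actually
# used); the CSV file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from scipy import stats
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("member_journey_results.csv")  # one row per participant

# RQ1: compare pre- and post-test scores with a paired-samples t-test.
t_stat, p_value = stats.ttest_rel(df["pretest"], df["posttest"])
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")

# RQ2: multiple linear regression predicting post-test scores from healthcare
# experience and the three perception measures.
predictors = ["healthcare_experience", "narration", "course_activities", "engagement"]
X = sm.add_constant(df[predictors])
model = sm.OLS(df["posttest"], X).fit()
print(model.summary())  # unstandardized B, R-squared, F; z-score inputs first for beta weights

# Multicollinearity check: VIF for each predictor (tolerance = 1 / VIF).
for i, name in enumerate(X.columns):
    if name != "const":
        print(f"{name}: VIF = {variance_inflation_factor(X.values, i):.2f}")
```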

Results

Research Question 1. Does the Gamified Course Enable Learners to Reach Training Goals?

The average scores in the pre- and post-test for all employees are displayed in Figure 1. In addressing the first research question, participants demonstrated on average a 9.5-point increase in scores on foundational insurance topics from the pretest to the post-test. The average score in the pre-test was 65.5 (SD=24.42) and the average score in the post-test was 75 (SD=21.65) out of 100. This was a statistically significant difference (t(120) = -4.11, p < 0.001) and supported the design team’s completion goal.

Figure 1

Knowledge Gains Comparison in Pre-Test and Post-Test

A Bar Graph Showing a Comparison of Knowledge Gains in Pre-Test and Post-Test Scores

Research Question 2. What Are the Significant Factors (Healthcare Experience, Narration, Course Activities, and Engagement) That Predict Employee Post-Test Scores?

To answer the second research question, Pearson’s correlation analysis was conducted prior to the multiple linear regression analysis (see Table 1). Participants’ post-test scores had a positive correlation with health care experience (r = .20, p < .05), narration (r = .25, p < .01), course activities (r = .22, p < .01), and engagement (r = .22, p < .01). Course activities had a strong positive correlation with narration (r = .69, p < .001) and engagement (r = .68, p < .001). Narration had a strong positive correlation with engagement (r = .83, p < .001). Multicollinearity was checked with tolerance and VIF values, and there were no issues with multicollinearity in this study. Table 2 shows that health care experience (β = .21, p < .05) was a significant factor impacting participants’ post-test scores. The regression model was statistically significant (F = 3.46, p < .05), with the four independent variables explaining 10.7% of the variance in post-test scores (R² = .107).

Table 1

Correlations Among Variables

Variable                    1         2         3         4
1. Post-Test Score          -
2. Healthcare Experience    .195*     -
3. Narration                .246**    -.044     -
4. Course Activities        .217**    -.001     .690***   -
5. Engagement               .217**    -.058     .830***   .680***

Note. *p < .05, **p < .01, ***p < .001
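
A correlation matrix like Table 1 can be reproduced from raw scores in a few lines; the sketch below shows one way to do so, reusing the hypothetical file and column names from the earlier example.

```python
# Hedged sketch: compute a Pearson correlation matrix like Table 1.
# The CSV and column names are hypothetical, as in the earlier example.
import pandas as pd
from scipy import stats

df = pd.read_csv("member_journey_results.csv")
cols = ["posttest", "healthcare_experience", "narration",
        "course_activities", "engagement"]

print(df[cols].corr(method="pearson").round(3))  # r for every variable pair

# p-value for a single pair, e.g., post-test score and narration
r, p = stats.pearsonr(df["posttest"], df["narration"])
print(f"r = {r:.3f}, p = {p:.4f}")
```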

Table 2

Predicting Factors for Post-Test Score

Variable                 B        Std. Error    β      t        Sig.
Constant                 33.40    14.16                2.36*    .020
Healthcare Experience    4.29     1.84          .21    2.33*    .022
Narration                4.75     4.36          .18    1.09     .278
Course Activities        1.99     3.42          .07    .58      .561
Engagement               .95      5.31          .03    .18      .858
R²                       .107
Adjusted R²              .076
F                        3.463*

Note. *p < .05

Research Question 3. What Are the Employees’ Perceptions of the Course, Gamification, Interactive Elements, and Interaction Enablers?

Regarding the third research question, participants also expressed positive perceptions of the interactivity and gamification strategies. The employees’ perceptions of the course rating are presented in Figure 2. 92% of participants rated the course as excellent or good and 93% of learners expressed that engagement through the course promoted their understanding of course content.

Participants also expressed positive perceptions surrounding the activities and narrators. 91% reported they strongly agreed or agreed that the activities helped their understanding of concepts. 89% of participants shared they strongly agreed or agreed the narrators enhanced their understanding of their role. A majority (87%) of participants strongly agreed or agreed that they preferred to learn content through the learning game approach rather than other approaches (e.g., lecture, instructional videos, assigned reading, etc.).

Figure 2

Employees’ Perceptions of the Course and Gamified Elements

A Pie Chart Showing Employees' Perceptions of the Gamified Elements in the Course

Research Question 4. How Is the Impact of Gamification on Employee Knowledge Gains Different Between Lower Performers and Higher Performers?

To answer the research question, the differences in knowledge gains from the pre-test to the post-test were compared between the lower performers (post-test score mean ≤ 50 out of 100) and the higher performers (post-test score mean > 75 out of 100).
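
As a rough sketch of how this split and the group comparisons that follow could be computed, the fragment below applies the cut points and runs independent-samples t-tests; the file and column names are again hypothetical.

```python
# Hedged sketch of the lower/higher performer comparison; file and column
# names are hypothetical. Cut points follow the definitions in the text.
import pandas as pd
from scipy import stats

df = pd.read_csv("member_journey_results.csv")
lower = df[df["posttest"] <= 50]   # lower performers
higher = df[df["posttest"] > 75]   # higher performers

for measure in ["pretest", "posttest"]:
    t_stat, p_value = stats.ttest_ind(lower[measure], higher[measure])
    print(f"{measure}: lower M = {lower[measure].mean():.2f}, "
          f"higher M = {higher[measure].mean():.2f}, "
          f"t = {t_stat:.2f}, p = {p_value:.4f}")
```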

The scores in the pre- and post-test for the two groups are presented in Figure 3. Higher performers’ average score in the pre-test was 80.26 (SD = 20.27), while lower performers’ average score in the pre-test was 44.35 (SD = 10.63). This 35.91-point gap was a statistically significant difference (t(67) = -4.81, p < .001). Higher performers’ average score in the post-test was 100.00 (SD = 0.00), while lower performers’ average score in the post-test was 54.84 (SD = 23.65). This 45.16-point gap was again a statistically significant difference (t(67) = -32.34, p < .001). Higher performers were closer to the training goals than lower performers.

Figure 3

Knowledge Gains Comparison in Pre-Test and Post-Test Between Lower Performers and Higher Performers

A Bar Graph Showing Differences in Lower and Higher Performers Pre- and Post-Test Performance

Research Question 5. How Is the Time Spent on Pre- and Post-Test Different Between Lower Performers and Higher Performers?

Time spent on the pre- and post-test by the two groups is displayed in Figure 4. Higher performers’ average time spent on the pre-test was 135.03 seconds (SD = 112.94), while lower performers’ average time spent on the pre-test was 93.53 seconds (SD = 32.38). This 41.50-second gap was a statistically significant difference (t(67) = 2.16, p < .05). Higher performers’ average time spent on the post-test was 74.81 seconds (SD = 34.20), while lower performers’ average time spent on the post-test was 71.45 seconds (SD = 43.67). This 3.36-second gap was not a statistically significant difference.

Figure 4

Time Spent Comparison Between Lower Performers and Higher Performers

A Bar Graph Showing Time Spent by Lower and Higher Performers on Testing Activities

Research Question 6. How Are the Employees’ Perceptions of the Course, Gamification, Interactivity, and Interaction Enablers Different Between Lower Performers and Higher Performers?

The employees’ perceptions of the course rating and its components are compared in Figure 5. Overall, the employees’ perceptions of the course and of the interactive components in the gamified course were statistically significantly different between the lower performers and the higher performers. Higher performers gave higher ratings than lower performers on course rating (higher performers = 4.68, lower performers = 4.06), how much the activities helped their understanding (higher performers = 4.68, lower performers = 4.19), how much the narrators helped their understanding (higher performers = 4.50, lower performers = 4.03), how much engagement helped their understanding (higher performers = 4.71, lower performers = 4.32), and how much the interactive gamified approach helped their understanding (higher performers = 4.57, lower performers = 4.16).

The gaps between the two groups showed statistically significant differences in course rating (t(67) = -3.49, p < .01), activities (t(67) = -2.59, p < .05), narrator (t(67) = -2.11, p < .05), and engagement (t(67) = -2.23, p < .05). The gap between the two groups in how much the interactive gamified approach helped understanding was not a statistically significant difference. In sum, for the high performers, the interactive gamified course design facilitated engagement and promoted knowledge gains.

Figure 5

Employees’ Perceptions of the Course and Gamification Between Lower Performers and Higher Performers

A Bar Graph Comparing Lower and Higher Performers' Perceptions of Gamified Activities 

Discussion

The researchers’ goal in conducting this study was to examine new hires’ learning outcomes and perceptions of an interactive gamified e-learning course in a health insurance organization. The researchers found that a gamified e-learning course was well received by participants and assisted them in understanding foundational information on insurance terminology, processes, enrollment, benefits, and claims.

The learners’ positive perception of the course and understanding of the course content resulted from several factors. The instructional designers employed the UbD process to uncover whether gamification was a good fit for the audience and appropriate for meeting the organization’s goals. Additionally, the gamified elements and engagement enablers, such as characters, increasingly complex challenges, and an overarching narrative, supported high levels of interactivity, which in turn enhanced learner engagement and understanding of content.

McKimm, Jollie, and Cantillon (2003) explained that e-learning courses are often incorporated into blended learning programs. Varying learning approaches can enhance learner engagement by challenging learners in different ways, such as through visual demonstrations, problem-solving scenarios, or interactive exercises. The Member Journey course mirrors this trend. In the new-hire program, the gamified Member Journey course was a component of a larger program including instructor-led lessons, videos, and e-learning courses with simple to average amounts of interactivity and animation. It is critical to align the best learning solution to the content and larger experience; interestingly, several of the participants were also attuned to this. For example, Emily echoed this sentiment with:

I don't think it'd be beneficial to do all games during the learning course. So, to change it up. So, some of them are reading. Some of them were typing. Some of them were playing a game keeps . . . you on your toes and keeps it different.

The results supported the rigor involved in working through an instructional design framework and revealed that the interactive courses resonated with participants. Beyond the Member Journey course alone, nine out of ten learners interviewed found the most interactive e-learning courses the most effective. Lee highlighted “the ones that are interactive, which you have to click on to advance or to try to figure out the scenarios . . . helps me the most.” 92% of participants rated the Member Journey course as excellent or good, and participants demonstrated on average a 9.5-point increase in scores from the pre-test to the post-test. From the follow-up interviews, nine out of ten learners expressed positive impressions of the entertaining course structure compared to the informational and knowledge test-based courses. Aiden shared “I had a little more fun with this, it wasn't as stressful as if I go . . . cramming for an exam.” In this context, the researchers’ findings confirmed that the solution was a good fit based on the thorough consideration of the learning context, audience, and business need.

The learners were attentive to the importance of considering the learning context. The results revealed that the game components in the course were beneficial. Participants discussed in the follow-up interviews finding the course layout engaging to interact with, which involved navigating to new destinations after completing challenges on each island. The challenges involved game-based interactions such as spinning a wheel, flipping over cards, and matching content. Emily expressed her reaction to completing the game-wheel challenge, noting “ . . . it keeps the fun aspect of it like I'm still playing a game. But I mean you're playing the game and you're learning.” This result aligns with the literature demonstrating the relationship between gamification and engagement (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015).

This study echoes the position that rigorously employing an instructional design model (e.g., ADDIE, SAM, Competency-Based, UbD, the Critical Events Model, or others) will align a learning modality to desired learners’ perceptions and outcomes (Bell & Federman, 2013). Once instructional designers determine the most effective approach for the learning context, content, and audience, they can develop strategies for enhancing engagement (Young, 2010). Similar to Chang et al.’s (2018) findings, the instructional designers of the Member Journey course employed gamification and course design features that enabled interactivity, which, in turn, increased engagement. The interplay between the gamification and enabling features is represented by Gil and Gidget navigating participants through the plot and providing participants with the rules of each island challenge.

The underlying stories in the course mirrored recommendations discussed in the literature which provided learners the opportunity to connect to past experiences and apply them to the content (Smeda et al., 2010). During the follow-up interviews, participants recalled specific examples of Gil and Gidget presenting challenging scenarios. Bill excitedly recounted one challenge in the Member Journey course that resonated due to unique storytelling approaches. He compared it to a “Saturday morning cartoon.” He recalled that “the circle fellow there . . . he climbed a tree and got hit by a coconut.” He further explained “ . . . if someone would have told me that was going to be one of the trainings, I would have given them one of the most confused whatever.” He further shared that this plot device “ . . . prompted the next part of the journey, which was he had to go to the doctor. Because he fell off the coconut tree . . . it was entertaining, and it still made sense in the grand scheme which is impressive.” In the Member Journey course, the pairing of digital storytelling with complex interactions including audio, animations, and interactive simulations resonated with the learners and enhanced their positive perception of the course in comparison to other approaches.

As a whole, 89% of participants shared they strongly agreed or agreed the narrators enhanced their understanding of their role. The challenges required participants to click, drag, or enter content while they engaged in problem solving through the scenarios Gil and Gidget presented. The researchers found that 91% of participants reported they strongly agreed or agreed that the activities helped their understanding of concepts. This evidence demonstrates that learner engagement can lead to increased understanding of the course content (Bloom, 1956; Nkhoma et al., 2014).

According to the results, most participants reported that the course was engaging. Although there was a statistically significant group difference, the gamified course was engaging for both lower performers and higher performers. This suggests a positive correlation between engagement and understanding of course content. This echoes the literature in which higher learning outcomes were the result of learners experiencing greater engagement with the games and overall enjoyment of the course (Nkhoma et al., 2014).

After completing the Member Journey course, 93% of the learners expressed that engagement through the course promoted their understanding of course content. The participants found that the interactive gamified components were fun and motivated their learning. Emily explained “Because I'm a hands-on learner, I like to be able to click on things and, you know, type things in, as opposed to just reading and answering questions. It seems to stick better with me that way.” These findings add to the understanding of how interactivity promotes engagement in e-learning (Zhang & Zhou, 2003). The findings also add to the dialogue on the strategies that increase learner engagement which will, in turn, increase understanding of the content (Bloom, 1956; Nkhoma et al., 2014).

Time spent on the pre- and post-test revealed a statistically significant gap in time spent on the pre-test between higher performers and lower performers, while there was no significant gap on the post-test. We noticed that higher performers had prior knowledge from their comparatively greater field experience and needed less time to answer the pre-test questions. After completing the course, both groups showed reduced time spent on the post-test. These results parallel the pattern in which learners with low prior knowledge allocate more time to understanding the learning materials and processing information (Amadieu et al., 2009).

Suggestions for Future Research

The results informed learning solutions at the organization where the study took place, and the findings drove the adoption of highly interactive gamification approaches as an initiative for 2019. The findings supported the ideas that interactivity supports higher levels of engagement (Zhang & Zhou, 2003) and that higher levels of engagement can lead to better learning outcomes (Bloom, 1956; Nkhoma et al., 2014). Regarding enabling features, participants rated the storytelling elements highly and said they increased their interest in the content, motivation, and connection-making (Baldwin & Ching, 2016; Smeda et al., 2010). Overall, the study supported the hypothesis that games increase engagement (Alsawaier, 2017; Calvo & Reio, 2018; Jabbar & Felicia, 2015). A limitation of the study is that a single course was administered instead of multiple gamified courses. Further studies examining the impact of highly interactive and engaging gamified e-learning could be beneficial in understanding this type of course design in the workplace. Furthermore, additional exploration of how varied gamification approaches can be supported is needed to identify consistent results regarding the positive effects of gamified courses in corporate settings.

This study examined 121 employees in eight classes. This represents a small portion of the approximately 1,148 concierge customer service new hires brought into the organization each year. This larger population provides the opportunity for a longitudinal study to holistically examine new hires’ learning outcomes and perceptions of the game-based e-learning course and its course design. The participants in the course have the opportunity to reinforce their knowledge during their daily work. The researchers recommend surveying the participants after six months to assess their perceptions of how the course prepared them for work and of the gamification strategies in comparison to other approaches experienced in their first six months of onboarding.

This study took place at a health insurance organization. Even though there is limited research on the impact of gamified learning in this setting, enabling interactivity through course design is not a new phenomenon in workplace e-learning. There is opportunity for further examination of gamified e-learning across the healthcare industry.

References

Akdere, M., & Conceicao, S. (2006). Integration of human resource development and adult education theories and practices: Implications for organizational learning. Online Submission. http://www.ulib.niu.edu:2274/contentdelivery/servlet/ERICServlet?accno=ED492681

Alsawaier, R. S. (2018). The effect of gamification on motivation and engagement. The International Journal of Information and Learning Technology, 35(1), 56-79. https://doi.org/10.1108/IJILT-02-2017-0009

Amadieu, F., van Gog, T., Paas, F., Tricot, A., & Marine, C. (2009). Effects of prior knowledge and concept-map structure on disorientation, cognitive load, and learning. Learning and Instruction, 19, 376–386. https://doi.org/10.1016/j.learninstruc.2009.02.005

Articulate Global. (2019). Storyline 3. https://articulate.com/

Baldwin, S., & Ching, Y. H. (2017). Interactive storytelling: Opportunities for online course design. Tech Trends, 61(2), 179-186. https://doi.org/10.1007/s11528-016-0136-2

Bell, B., & Federman, J. (2013). E-Learning in postsecondary education. The Future of Children, 23(1), 165-185. https://doi.org/10.1353/foc.2013.0007

Bloom, B. S. (1956). Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. David McKay Co Inc.

Calvo, L. C., & Reio, T. (2018). The relationship between engagement and knowledge attainment in a computer-based training game and job performance of travel agents. The Journal of Management Development, 37(5), 374-384. https://doi.org/10.1108/JMD-03-2017-0063

Chang, C., Warden, C., Liang, C. & Lin, G. (2018). Effects of digital game-based learning on achievement, flow, and overall cognitive load. Australasian Journal of Educational Technology, 34(4), 155-167. https://doi.org/10.14742/ajet.2961

Chatterjee, P. (2010). Entertainment, engagement and education in e-learning. Training & Management Development Methods, 24(2), 601-621. https://search-proquest-com.gatekeeper.chipublib.org/abicomplete/docview/202612033/2FB1A87A1B5D49FCPQ/30?accountid=303

Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches. (4th ed.). Sage. 

Dichev, C., & Dicheva, D. (2017). Gamifying education: What is known, what is believed and what remains uncertain: A critical review. International Journal of Educational Technology in Higher Education, 14(9). https://doi.org/10.1186/s41239-017-0042-5

Hoffmann, J. P., & Shafer, K. (2015). Linear regression analysis: Assumptions and applications. NASW Press.

Ho, M. (2017). Learning investment and hours are on the rise. TD Magazine. https://www.td.org/magazines/td-magazine/learning-investment-and-hours-are-on-the-rise 

Hong, Y., Clinton, G., & Rieber, L. (2014). Designing creative user interactions for learning. Educational Technology, 54(2), 20-25. http://www.jstor.org.gatekeeper.chipublib.org/stable/44430249

Jabbar, A., & Felicia, P. (2015). Gameplay engagement and learning in game-based learning: A systematic review. Review of Educational Research, 85(4), 740-779. https://doi.org/10.3102/0034654315577210

Lester, J. C., Ha, E. Y., Lee, S. Y., Mott, B. W., Rowe, J. P., & Sabourin, J. L. (2013). Serious games get smart: Intelligent game-based learning environments. AI Magazine, 34(4), 31-45. https://doi.org/10.1609/aimag.v34i4.2488

Long, L. K., & Smith, R. D. (2003). The role of web-based distance learning in HR development. Journal of Management Development, 23(3), 270-284. https://doi.org/10.1108/02621710410524122

McTighe, J. & Wiggins, G. (2012). Understanding by design framework. https://www.ascd.org/ASCD/pdf/siteASCD/publications/UbD_WhitePaper0312.pdf

Nkhoma, M., Sriratanaviriyakul, N., Cong, H. P., & Lam, T. K. (2014). Examining the mediating role of learning engagement, learning process and learning experience on the learning outcomes through localized real case studies. Education and Training, 56(4), 287-302. https://doi.org/10.1108/ET-01-2013-0005

Novak, E. (2015). A critical review of digital storyline-enhanced learning. Educational Technology Research and Development, 63(3), 431-453. https://doi.org/10.1007/s11423-015-9372-y

Smeda, N., Dakich, E., & Sharda, N. (2010). Developing a framework for advancing e-learning through digital storytelling. Proceedings from the IADIS International Conference e-learning. Freiburg, Germany.

Woo, T., Wohn, K., & Johnson, N. (2011). Categorisation of new classes of digital interaction. Leonardo, 44(1), 90-91. http://www.jstor.org.gatekeeper.chipublib.org/stable/20869409

Young, M. R. (2010). The art and science of fostering engaged learning. Academy of Educational Leadership Journal, 14(1), 1–18. https://www.researchgate.net/publication/266862573_The_art_and_science_of_fostering_engaged_learning

Zhang, D., & Zhou, L. (2003). Enhancing e-learning with interactive multimedia. Information Resources Management Journal, 16(4), 1-14. https://doi.org/10.4018/irmj.2003100101

Appendix A: Survey

  1. What is your age?
  • Less than 20
  • 20-29
  • 30-39
  • 40-49
  • 50-59
  • Greater than 60
  • I would prefer not to answer
  2. What is your gender?
  • Male
  • Female
  • I would prefer not to answer
  3. What is the level of school you have completed? If currently enrolled, please select your highest degree received.
  • Less than high school degree
  • High school degree or equivalent (e.g. GED)
  • Some college but no degree
  • Associate degree
  • Bachelor degree
  • Graduate degree
  4. How long have you worked in health care?
  • Less than 1 year
  • 1 year to less than 3 years
  • 3 years to less than 5 years
  • 5 years to less than 10 years
  • 10 years or more
  5. Overall, I would rate this Member Journey course as . . .
  • Excellent
  • Good
  • Fair
  • Poor
  • Very poor
  6. Please rate how much you agree or disagree with the following statement: The activities (e.g. drag and drop, tic-tac-toe, etc.) in the Member Journey course helped build my understanding of foundational concepts.
  • Strongly agree
  • Agree
  • Undecided
  • Disagree
  • Strongly Disagree
  7. Please rate how much you agree or disagree with the following statement: Gil and Gidget enhanced my understanding of my role in the customer experience.
  • Strongly agree
  • Agree
  • Undecided
  • Disagree
  • Strongly disagree
  8. Please rate how much you agree or disagree with the following statement: This Member Journey course made me engaged in learning.
  • Strongly agree
  • Agree
  • Undecided
  • Disagree
  • Strongly disagree
  9. Please rate how much you agree or disagree with the following statement: I’d prefer to learn this content through the learning game approach than other approaches (e.g. lecture, instructional videos, assigned reading, etc.)
  • Strongly agree
  • Agree
  • Undecided
  • Disagree
  • Strongly disagree
  10. Do you have any suggestion(s) for improving the Member Journey course?

Appendix B: Interview Protocol

  1. Tell me about your understanding of health insurance prior to attending the first day of training.
  2. Describe the learning activities in the training program that stood out to you.
  3. Discuss your perceptions of the Member Journey course in terms of understanding how your role facilitates the member experience. (Show image of course.)
  4. Talk to me about your impressions of the Member Journey course in comparison to other e-learning courses, such as the scenario-based courses. (Show images of different courses, such as a piece of the introduction section and/ or a key topic section.)
  5. In the Member Journey course, you completed a pre-test, games/exercises, and a bonus round; how did you feel when you reached the bonus round?
  6. Do you have any suggestions to improve the Member Journey course?
Nicole Buras

HUB International

Nicole Buras, Ed.D. is a scholar-practitioner and a Senior Manager in learning and development at HUB International with the Operations and IT department. At HUB she drives the life cycle of education programs ranging from small scale, single course implementations to larger enterprise-wide program initiatives.
Lauren Merrild

GoHealth

Lauren Merrild is Learning Strategy Manager at GoHealth, where she manages the learning portfolio and is responsible for the analysis and evaluation of learning. She received her Master's degree in Learning Design and Leadership from the University of Illinois. Her practitioner focus includes digital interactivity and scenario-based learning.
WooRi Kim

Blue Cross Blue Shield

WooRi Kim is a Sr. People Analytics Consultant at Blue Cross Blue Shield, and an adjunct faculty in the Learning Design and Technology (LDT) program at Purdue University. She received her Ph.D. degree in LDT from Purdue in 2015. She’s interested in online learning and measurement/evaluation in teaching and learning.
