What Matters? Graduate Students’ Perspectives of Engagement and Learning Based on the QM Rubric
Introduction
Online education has emerged as a prevalent instructional method in higher education.
Traditional and non-traditional students often opt for this class format because of its flexibility and convenience (Barreto et al., 2022). Total graduate enrollment increased by 10 percent between fall 2009 and fall 2020, from 2.8 million to 3.1 million students (Irwin et al., 2022). As online offerings have grown, institutions have turned to quality standards to guide course design; a popular and widely used benchmark is the Quality Matters (QM) Rubric, a well-regarded standard for online course design (Quality Matters, 2023). Several institutions have implemented the QM Rubric to guide course design and improve educational outcomes (Conklin et al., 2020).
Some of the benefits of QM in online courses include explicit instructor expectations for students, a clear grading policy (Bento & White, 2010), and greater involvement and learning from the student's perspective (Gaston & Lynch, 2019; Lynch & Gaston, 2020). Although these benefits are known, there is limited research on students' perceptions of the impact of QM standards on their learning and engagement (Sadaf et al., 2019). Investigating these perceptions is therefore essential to extend the body of knowledge in the field.
Literature Review
QM Rubric
QM is a nationally recognized, faculty-centered peer review process designed to certify the quality of online and blended courses (Legon & Runyon, 2007). The QM process provides an institutional toolset to meet quality expectations in the following areas: (a) online course design, (b) student learning, (c) improved instruction, (d) assessment and feedback loops, and (e) professional development. The QM program implements a rigorous peer review process using the QM Rubric, which is based on standards of best practice, research, and instructional design principles (Legon, 2015; Shattuck et al., 2014). Additionally, the endorsement and use of QM rubrics can support ongoing improvement, foster conversations about what effective, high-quality online learning looks like, and promote course design consistency in higher education (Martin et al., 2019).
It is important to note that QM Rubrics are grounded in evidence-based research on online learning. QM regularly reviews and updates its Rubrics to incorporate new findings from ongoing literature reviews, ensuring each Rubric reflects the most current research in the field (Quality Matters, n.d.). In July 2023, the QM Higher Education (HE) Rubric was updated to the seventh edition. The seventh edition of the QM HE Rubric is organized by eight General Standards, under which there are 44 Specific Review Standards (SRS). Each SRS on the QM HE Rubric has an assigned value of 3 points (essential), 2 points (very important), or 1 point (important).
Previous research has examined student engagement and learning in relation to the QM Rubric (Sadaf et al., 2019; Conklin & Barreto, 2023). Since then, the Rubric has been revised from the sixth to the seventh edition. These changes are highlighted here because comparisons are made to previous research. The revisions most relevant to this study include: (a) clarifications and adjustments to Specific Review Standards (e.g., SRS 3.6 has been revised to help learners understand how academic integrity policies apply to assessments); (b) an increased focus on inclusion and belonging throughout the Rubric; and (c) improvements that more clearly distinguish activities from assessments (Quality Matters, 2023). For example, although there are still eight General Standards, the number of SRSs has increased from 42 to 44.
The following are the eight General Standards and their objectives:
Course Overview and Introduction: This standard has nine SRSs that focus on clarifying how to start the course. Research has shown that students need clear instructions on how to get started in the course and clear expectations for course activities, such as discussions and communication with the instructor and peers (Conklin & Garrett Dikkers, 2022).
Learning Objectives: This General Standard has five SRSs that focus on measurable learning objectives for the course and modules. The module learning objectives are aligned with the course learning objectives or goals. Clear objectives written in a measurable format from the learners’ perspective improve learning outcomes (Edge et al., 2022).
Assessment and Measurement: This standard has six SRSs associated with ensuring that assessments are aligned with learning outcomes and allow students to track their progress. Research has shown that students can achieve learning outcomes when the assessments are aligned with the learning outcomes and provide clear descriptive criteria for success, such as rubrics (Karunanayaka & Naidu, 2021).
Instructional Materials: This standard has five associated SRSs that focus on supporting the course objectives. Instructional materials must be aligned with learning outcomes, and students value the relationship between instructional materials and learning objectives (Rice & Ortiz, 2021).
Learning Activities and Learner Interaction: This standard includes four SRSs focused on promoting active engagement in the course. Designing courses for active engagement can increase student motivation (Conklin & Garrett Dikkers, 2022).
Course Technology: This standard focuses on course technologies that enhance learning rather than impede it. Four SRSs are associated with this General Standard, which refers to the use of various technologies to promote active learning.
Learner Support: Four SRSs under Learner Support ensure that students can access institutional support services such as technical, accessibility, academic, and student services.
Accessibility and Usability: Seven SRSs support Accessibility and Usability, ensuring that all students can easily navigate and interact with the course components.
Although the QM Rubric was created to establish standards for the quality of online courses, it is important to examine students' perceptions of those standards as more learners enroll in online courses. Specifically, students' perceptions can provide insights for instructors and instructional designers on which areas of the standards to prioritize in online course design and development. The section below covers the research on students' perceptions of online courses.
Community of Inquiry Framework & Quality Matters
Community of Inquiry (CoI) is a well-known theoretical framework, grounded in collaborative and constructivist perspectives, that guides the practice of online and blended learning in higher education (Garrison, 2016). Within this framework, there are three core dimensions by which interactions can be examined within the learning context: (1) cognitive presence, which can be defined "as the extent to which learners are able to construct and confirm meaning through sustained reflection and discourse in a critical community of inquiry" (Garrison et al., 1999, p. 89); (2) social presence, which involves designing a space where trust and open communication occur and learners can present themselves to others as "real people" (Garrison et al., 1999; Garrison, 2016); and (3) teaching presence, which involves the instructor's "design, facilitation, and direction of cognitive and social processes" to support student learning (Anderson et al., 2001, p. 5). The CoI framework and its presences have been connected to the QM Rubric at the program level, helping programs maintain high-quality, inquiry-based courses (Bogle et al., 2009). Table 1 below presents the CoI components and the connected QM General Standards.
Table 1
Connection Between CoI Components and QM General Standards
| CoI components | QM General Standard and Number |
|---|---|
| Social presence | |
| Teaching presence | |
| Cognitive presence | |
As shown in Table 1, the QM Rubric's General Standards connect to the CoI framework. Still, researchers have noted limitations of the QM Rubric in relation to the CoI framework. For instance, some suggest that the CoI model provides guidelines beyond course design, which is the focus of QM, including instruction and facilitation through collaborative learning methods grounded in a constructivist approach (Markowitz, 2021; Thornton, 2020), while others emphasize that CoI focuses on critical inquiry in comparison to QM (Dickison, 2021).
Student Perceptions of Online Courses
Online education is highly researched, but perceptions of online courses vary depending on whose perspective is examined (Otter et al., 2013; Tanner et al., 2009). Most students care about their learning but do not always enroll in online classes because of their quality (Marks et al., 2005; O'Neill & Sai, 2014); instead, students often enroll for convenience and flexibility (Lee et al., 2017; Mann & Henneberry, 2012).
Student perspectives on online courses provide critical information and first-hand insights into their experiences and expectations (Dawson et al., 2019). Van Wort and colleagues (2020) conducted a study to determine the significant factors in creating high-quality online learning experiences from the students' perspective, finding that students highly valued the technologies integrated into LMS platforms, such as online grading, the gradebook, announcements, and navigation. The techniques instructors used for feedback and evaluation were also highly valued, as was clear, focused communication from the instructor. These findings corroborate research supporting the importance of the instructor's role in online courses (Conklin & Barreto, 2023; Fedynich et al., 2015; You et al., 2014).
In addition to the aforementioned factors, learning objectives are important to the course design process and the learning experience. In a systematic review of students' learning preferences, Konstantinidou and Nisiforou (2022) found that most students want to be aware of the learning objectives of the course. Students find it important that the learning outcomes and expectations of the course are communicated, along with how the content and activities align with them (Conklin & Barreto, 2023; David & Frederick, 2020; Mykota, 2018; Sadaf et al., 2018). Furthermore, many students prefer collaborative activities throughout the online course (Konstantinidou & Nisiforou, 2022). This can be achieved through student-led discussions (Vlachopoulous et al., 2019), group work, or peer-review tasks where students provide feedback (Bolliger & Martin, 2018).
Student Engagement and Learning
Institutions have often adopted QM because of its best practices for course design. Such practices include promoting navigability, alignment, and course accessibility. Quality online courses can improve student learning outcomes and retention rates and promote learner engagement. Indeed, learners’ satisfaction is affected by their motivation to participate, feelings about the learning process, and engagement in applying the knowledge (Lee et al., 2019).
Courses that have been QM certified have been found to improve student learning and allow faculty to spend more time building their instructor presence, which in turn increases learner motivation and performance (Roig, 2021). Previous studies using the 6th edition of the QM Rubric found that students rated Assessment and Measurement and Learning Objectives as the highest-rated General Standards (Conklin & Barreto, 2023), while other studies found that students most valued Learning Objectives and Course Activities and Learner Interaction (Sadaf et al., 2018). Although there was a discrepancy between Assessment and Measurement and Course Activities and Learner Interaction, students in the Conklin and Barreto (2023) study rated Course Activities and Learner Interaction highly in the open-ended questions.
Purpose of the Study
Researchers have studied student perceptions of courses designed to QM standards. Still, research on students' perceptions of the impact of specific QM standards on their learning and engagement is limited. Therefore, this study examined students' perceptions of the impact of the QM standards (7th edition) on their engagement and learning. The following research questions guided this study:
What are online students' perceptions of QM standards’ impact on their learning?
What are online students’ perceptions of the impact of QM standards on their engagement?
How well do the student impact ratings for QM items align with the eight QM standards?
Materials and Methods
This mixed-methods study was designed to examine students' perspectives on the impact of QM standards on their learning and engagement, using both quantitative survey data and qualitative analysis of open-ended responses. Data were collected via an online survey from graduate students enrolled in three programs: Healthcare Administration, Instructional Technology, and Nurse Educator. The survey was adapted from Sadaf et al. (2019) and revised to reflect the new QM Rubric, 7th edition.
Participants
A cross-sectional survey was disseminated to students in three fully online graduate programs at a university in the southeastern United States. These programs were selected because they are fully online and their courses were designed based on the QM Rubric. Although the programs are not QM certified, many faculty members are familiar with the QM Rubric and design courses to align with it. Forty-two graduate students who took online courses in these programs in Fall 2023 completed the survey. Respondents were 41 (79%) female, 8 (15%) male, and 3 (6%) who preferred not to say. Most participants were full-time employees (88%) who took courses part time. Additionally, 20 (38%) were from Healthcare Administration, 16 (31%) from Instructional Technology, and 13 (25%) from the Nurse Educator program.
Procedure
This study used a descriptive survey-based research design with closed and open-ended questions. The questionnaire was disseminated through Qualtrics to graduate students from three online graduate programs. Two reminders were sent approximately one week apart through Qualtrics.
Instrument
The 44 QM SRSs within the eight General Standards were used to create the survey. Each SRS was rated on a scale from 0 to 3 (no impact, a little impact, some impact, and a lot of impact) for the two constructs of learning and engagement. Students were given the following prompt: "Please think about each standard and rate how much impact this standard has on your online learning (engagement)." Cronbach's alpha indicated high internal consistency across all survey items (.96) and for each construct separately (.92 for learning and .93 for engagement). There were also two open-ended questions: (a) "Which strategy(ies) impacted you the most for your learning?" and (b) "Which strategy(ies) impacted you the most for your engagement?"
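As a point of reference for this reliability estimate, the short sketch below shows one standard way to compute Cronbach's alpha from a respondents-by-items rating matrix. It is a minimal illustration only: the `cronbach_alpha` helper and the randomly generated `responses` array are our own placeholders, not the study's data or analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of survey items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Placeholder data: 42 respondents rating 44 SRS items on the 0-3 impact scale.
rng = np.random.default_rng(seed=1)
responses = rng.integers(low=0, high=4, size=(42, 44))
print(round(cronbach_alpha(responses), 2))
```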
Data Analysis
Descriptive statistics were used to answer Research Questions 1 and 2 by reporting the participants' perceptions of the impact of the QM standards on their learning and engagement. The percentage of the highest responses (above 75%) was reported for each SRS, and a grand percentage was computed for each General Standard. Means and standard deviations are provided to support comparison of individual items. To answer Research Question 3, an exploratory factor analysis was conducted to examine how well students' perceptions of the items aligned with the eight General Standards to which QM assigns them.
Responses to the open-ended survey questions were analyzed using Miles and Huberman's (1994) constant comparative method. First, the data were imported into NVivo 14 and coded separately by two of the researchers. Participants' responses were initially coded based on the presence of language, such as words or sentences, that was prominent in the literature. These codes were provisional and progressively changed as the researchers engaged in the constant comparative process. Codes were then categorized into the eight pre-structured QM General Standards to gain further insight into the survey results. Code frequencies were also noted to identify the categories with the greatest explanatory potential. Each category (General Standard) was reanalyzed to determine the relationships among codes and further explain student perceptions of the SRSs that most impacted their learning and engagement. The initial intercoder agreement for the learning and engagement constructs was 20% and 45%, respectively. The researchers met and went through each participant response and its assigned codes to reach agreement. They noted that some responses could be coded under multiple QM SRS categories based on the examples described, reviewed each of these, and decided whether to keep or modify the codes. After the meeting, intercoder agreement for the learning and engagement constructs was 85% and 90%, respectively.
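Intercoder agreement here refers to simple percentage agreement between the two coders. The toy example below, using hypothetical category labels rather than the study's coding data, shows how such a percentage can be computed.

```python
# Hypothetical labels: each coder assigned one QM General Standard to each response.
coder_a = ["Accessibility and Usability", "Course Activities and Learner Interaction",
           "Instructional Materials", "Assessment and Measurement"]
coder_b = ["Accessibility and Usability", "Instructional Materials",
           "Instructional Materials", "Assessment and Measurement"]

# Percentage agreement: share of responses receiving the same code from both coders.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
print(f"{agreement:.0%}")  # 75% in this toy example
```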
Results
Research Question 1
This research question focused on students’ perceptions of the QM SRSs’ impact on learning. Table 2 lists the ratings for the overall General Standards. Most of the students rated each SRS as impacting their learning A lot. The top three General Standards for learning were Accessibility and Usability (76.80% A lot), Assessment and Measurement (72.21% A lot), and Instructional Materials (72.21% A lot). Learner Support was the lowest-ranked General Standard to impact learning (45.82% A lot).
Table 2
Percentage of response options for QM General Standards impact on learning
General Standard | N | A lot | Some | None |
|---|---|---|---|---|
Course Overview and Introduction | 42 | 65.67 | 27.51 | 6.61 |
Learning Objectives | 42 | 68.56 | 29.06 | 1.92 |
Assessment and Measurement | 42 | 72.21 | 25.8 | 1.98 |
Instructional Materials | 42 | 72.21 | 24.2 | 2.78 |
Course Activities and Interaction | 42 | 67.25 | 26.78 | 5.95 |
Course Technology | 42 | 67.25 | 26.78 | 5.95 |
Learner Support | 42 | 45.82 | 43.45 | 0 |
Accessibility and Usability | 42 | 76.80 | 17.0 | 5.18 |
Within the General Standards, twelve SRSs were rated A lot by more than 80% of students. Half of these SRSs fell under the Accessibility and Usability General Standard (see Table 3). Overall, students highly valued the accessibility and usability of course design for learning. The lowest-rated SRS under Accessibility and Usability was the one related to providing vendor accessibility statements for course technologies (35.7% A lot).
Within the Accessibility and Usability General Standard, the highest-rated item was 8.2: “The course design facilitates readability” (88.10% A lot, mean 3.92). The highest-rated item within the Instructional Materials General Standard was 4.5: “Varied instructional materials” (83.3% A lot, mean 2.81). Within Assessment and Measurement, the highest-rated item was SRS 3.3: “Specific and descriptive criteria are provided for the evaluation of learners’ work and their connection to the course grading policy is clearly explained” (88.1% A lot, mean 2.88).
Table 3
Percentage, mean, and standard deviation of response options for the highest-rated SRSs’ impact on learning
| Item | N | A lot (%) | Some (%) | None (%) | Mean | Std. Dev. |
|---|---|---|---|---|---|---|
*1.1 Clear instructions for getting started | 42 | 97.6 | 2.4 | 0 | 2.98 | 0.154 |
*1.2 Purpose and structure of the course | 42 | 88.7 | 11.9 | 0 | 2.88 | 0.328 |
*3.3 Specific and descriptive criteria are provided | 42 | 88.1 | 11.9 | 0 | 2.88 | 0.328 |
4.5 Varied instructional materials | 42 | 83.3 | 14.3 | 2.4 | 2.81 | 0.455 |
*5.1 Learning activities consistent with objective | 42 | 88.1 | 11.9 | 0 | 2.88 | 0.328 |
5.4 Requirements for learner interaction | 42 | 85.7 | 11.9 | 2.4 | 2.92 | 0.437 |
*8.1 Course navigation is easy to use | 42 | 82.9 | 7.1 | 0 | 3.93 | 0.261 |
*8.2 Course design facilitates readability | 42 | 88.1 | 9.5 | 2.4 | 3.92 | 0.537 |
*8.3 Text in the course is accessible | 42 | 83.3 | 11.9 | 4.8 | 3.76 | 0.701 |
8.4 Images are accessible | 42 | 81 | 14.3 | 4.8 | 3.71 | 0.708 |
8.5 Video and audio content is accessible | 42 | 83.3 | 14.3 | 0 | 3.79 | 0.565 |
8.6 Multimedia is easy to use | 42 | 83.3 | 14.3 | 0 | 3.85 | 0.358 |
Note. * denotes essential standards.
In response to the open-ended question, students reported the strategies that impacted their learning the most (see Table 4). The strategies were coded using the QM General Standards. Many students noted Accessibility and Usability (32%) as the most impactful for their learning. Students mentioned “accessible video and audio content” and “a clear outline of the course in each module.” One student stated, “Accessibility and making presented information neurodivergent-friendly, including content presented in video, multimedia, or text.”
Course Activities and Learner Interaction strategies were the next highest (22.7%). One student cited “seeing how all activities relate to and help achieve the learning objectives.” Another student stated, “Instructional congruence – anything that supported clarity and transparency in learning outcomes and why we were doing a thing was really helpful for my learning.” Students also noted that timely feedback was important to their learning.
Table 4
Most Effective Online Strategies for Student Learning
Survey categories | Frequency | Percentage |
|---|---|---|
Accessibility and Usability | 7 | 32 |
Course Activities and Learner Interaction | 5 | 22.7 |
Instructional Materials | 4 | 18 |
Assessment and Measurement | 3 | 13.6 |
Course Overview and Introduction | 2 | 9 |
Learner Support | 1 | 4.5 |
Note. n= 22. Participants may have mentioned multiple strategies.
Research Question 2
This research question focused on students’ perceptions of the QM SRSs’ impact on engagement. Table 5 lists the ratings for the overall General Standards. The top two General Standards for engagement were Course Activities and Learner Interaction (58.9% A lot) and Accessibility and Usability (58.49% A lot). Learner Support was the lowest-ranked General Standard to impact engagement (34.52% A lot).
Table 5
Percentage of response options for QM General Standards impact on engagement
General Standard | N | A lot | Some | None |
|---|---|---|---|---|
Course Overview and Introduction | 31 | 48.42 | 18.24 | 7.13 |
Learning Objectives | 31 | 51.92 | 20.52 | 0.96 |
Assessment and Measurement | 31 | 53.17 | 17.05 | 3.18 |
Instructional Materials | 31 | 53.17 | 18.25 | 2.38 |
Course Activities and Interaction | 31 | 58.9 | 13.7 | 1.6 |
Course Technology | 31 | 44.65 | 22 | 7.15 |
Learner Support | 31 | 34.52 | 26.8 | 12.5 |
Accessibility and Usability | 31 | 58.49 | 10.53 | 4.44 |
Within the General Standards, thirteen SRSs were rated A lot by more than 59% of students for impacting engagement. Four of these SRSs fell under the Accessibility and Usability General Standard (see Table 6). Overall, students highly valued the accessibility and usability of course design for engagement. Within the Accessibility and Usability General Standard, the highest-rated item was 8.1: “Course navigation facilitates ease of use” (69% A lot, mean 3.94). The lowest-rated SRS under Accessibility and Usability was the one related to providing vendor accessibility statements for course technologies (35.7% A lot).
The highest-rated item within the Learning Activities and Learner Interaction General Standard was 5.4: “The requirements for learner interaction are clearly stated” (61.9% A lot, mean 3.77). Another highly ranked area was under General Standard 1, Course Overview and Introduction, SRS 1.1: “Instructions make clear how to get started and where to find various course components” (66.7% A lot, mean 3.80).
Table 6
Percentage, mean, and standard deviation of response options for the highest-rated SRSs’ impact on engagement
| Item | N | A lot (%) | Some (%) | None (%) | Mean | Std. Dev. |
|---|---|---|---|---|---|---|
*1.1 Clear instructions for getting started | 31 | 66.7 | 7.1 | 0 | 3.8 | 0.301 |
*1.2 Purpose and structure of the course | 31 | 59.5 | 14.3 | 0 | 3.81 | 0.402 |
1.3 Etiquette expectations for communication | 31 | 59.5 | 14.3 | 0 | 3.81 | 0.402 |
*3.1 Assessment measured the learning objectives | 31 | 59.5 | 14.3 | 0 | 3.81 | 0.402 |
3.5 The types and timing of assessments provide learners with multiple opportunities to track their learning progress with timely feedback | 31 | 59.5 | 9.5 | 4.8 | 3.68 | 0.791 |
*4.2 Clearly explained use of learning materials | 31 | 59.5 | 11.9 | 2.4 | 3.74 | 0.631 |
4.4 Current instructional materials | 31 | 59.5 | 14.3 | 0 | 3.91 | 0.402 |
*5.2 Opportunities for learner interaction | 31 | 59.5 | 14.3 | 0 | 3.81 | 0.402 |
5.4 Requirements for learner interaction | 31 | 61.9 | 9.5 | 2.4 | 3.77 | 0.617 |
*8.1 Course navigation is easy to use | 31 | 69 | 4.8 | 0 | 3.94 | 0.25 |
*8.2 The course design facilitates readability | 31 | 64.3 | 9.5 | 0 | 3.87 | 0.341 |
8.4 Images in the course are accessible | 31 | 61.9 | 7.1 | 4.8 | 3.71 | 0.783 |
8.5 Video and audio content in the course is accessible | 31 | 59.5 | 7.1 | 4.8 | 3.7 | 0.794 |
Note. * denotes essential standards.
In response to the open-ended question, students reported the strategies that impacted their engagement the most (see Table 7). The strategies were coded using the QM General Standards. Many students noted Course Activities and Learner Interaction (42.8%) as the most effective engagement strategies. Students also pointed to accessible design, mentioning “accessible video and audio content” and “a clear outline of the course in each module.” One student stated, “Accessibility and making presented information neurodivergent-friendly, including content presented in video, multimedia, or text,” indicating the importance of accessible content for adult learners.
Students elaborated on how course activities and learner interaction shaped their engagement. One student stated, “Professor engagement and level of care for the course. The more engagement and quality of feedback from them, the more I want to be engaged in the course. If I am only receiving an email once a week with announcements that I can find on my own in the modules (i.e. what assignment are due), that doesn’t feel genuine, and I feel less desire to be engaged in the course.” Another student stated, “opportunities to discuss or interact were great for my engagement. I like all those things and am intrinsically motivated by the content + love discussing it.” Additionally, another student stated, “Project-based learning is the highest impact for me and something I like to see more. Moving beyond readings and papers. Projects allow for challenging applications that also grow skills and have a holistic approach to course content and applications.” Students emphasized the importance of course activities going beyond papers and quizzes.
Table 7
Most Effective Online Strategies for Student Engagement
Survey categories | Frequency | Percentage |
|---|---|---|
Course Activities and Learner Interaction | 9 | 42.8 |
Instructional Materials | 6 | 28.5 |
Assessment and Measurement | 4 | 19 |
Accessibility and Usability | 3 | 14.2 |
Course Overview and Introduction | 1 | 4.7 |
Note. n= 21. Participants may have mentioned multiple strategies.
Research Question 3
Exploratory Factor Analysis (EFA) was chosen to examine how student perceptions clustered around the Quality Matters standards. This method was appropriate given the study’s exploratory goals and the limited existing research grouping QM elements based on learner perspectives. Although the sample size (n = 42) is below traditional thresholds, prior work (e.g., de Winter et al., 2009) supports the use of EFA with small samples, particularly when communalities are high and the factor structure is strong.
Two EFAs were conducted using principal components extraction to answer the third research question. EFA was used to provide insight into the structure of the data and to understand how well students’ perceptions of the QM survey items aligned with the QM General Standards. A preliminary review of the factor correlation matrix revealed no correlations that reached the suggested threshold of 0.32; therefore, we used a varimax rotation (Brown, 2009). We selected factors with eigenvalues greater than 1 (Guttman, 1954; Kaiser, 1960) and examined the scree plot to identify the number of factors for the solution. A threshold of 0.35 was used to identify viable items for each factor (Sadaf et al., 2019).
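For readers who want to see how such an analysis can be scripted, the sketch below follows the steps described above (principal components extraction, the eigenvalue > 1 rule, varimax rotation, and the 0.35 loading threshold) using the open-source factor_analyzer package in Python. The file name and workflow are illustrative assumptions, not the authors’ actual analysis code.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical respondents-by-items table of 0-3 impact ratings (one column per SRS).
df = pd.read_csv("qm_learning_ratings.csv")

# A preliminary fit exposes the eigenvalues of the correlation matrix, which drive
# the Kaiser criterion (eigenvalue > 1) and would also feed a scree plot.
fa = FactorAnalyzer(rotation=None, method="principal")
fa.fit(df)
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

# Re-fit with the retained number of factors and an orthogonal varimax rotation,
# then keep only loadings at or above the 0.35 threshold used in the study.
fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax", method="principal")
fa.fit(df)
loadings = pd.DataFrame(fa.loadings_, index=df.columns)
print(loadings.where(loadings.abs() >= 0.35).round(2))
```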
The first EFA retained 12 factors that explained 84.94% of the variance in the impact of QM standards on learning (Table 8). The second EFA also retained 12 factors, explaining 92.73% of the variance in the impact of QM standards on engagement (Table 9). In both analyses, some survey items were complex and loaded on multiple factors, and some factors included only one or two items.
Table 8
Eigenvalues, Variance Explained, and Cumulative Variance for QM Impact on Learning
Factor | Eigenvalue | Variance explained | Cumulative variance |
|---|---|---|---|
1 | 12.74 | 28.32 | 28.32 |
2 | 4.96 | 11.03 | 39.35 |
3 | 3.43 | 7.63 | 46.98 |
4 | 2.83 | 6.29 | 53.27 |
5 | 2.62 | 5.81 | 59.08 |
6 | 2.50 | 5.56 | 64.65 |
7 | 2.16 | 4.81 | 69.46 |
8 | 1.70 | 3.78 | 73.23 |
9 | 1.65 | 3.67 | 76.90 |
10 | 1.32 | 2.93 | 79.83 |
11 | 1.19 | 2.64 | 82.46 |
12 | 1.11 | 2.48 | 84.94 |
Table 9
Eigenvalues, Variance Explained, and Cumulative Variance for QM Impact on Engagement
Factor | Eigenvalue | Variance explained | Cumulative variance |
|---|---|---|---|
1 | 14.54 | 32.32 | 32.32 |
2 | 5.45 | 12.11 | 44.42 |
3 | 4.45 | 9.88 | 54.31 |
4 | 3.59 | 7.98 | 62.28 |
5 | 2.74 | 6.10 | 68.38 |
6 | 2.37 | 5.26 | 73.65 |
7 | 1.93 | 4.28 | 77.93 |
8 | 1.71 | 3.79 | 81.72 |
9 | 1.45 | 3.23 | 84.95 |
10 | 1.29 | 2.87 | 87.81 |
11 | 1.20 | 2.66 | 90.47 |
12 | 1.02 | 2.26 | 92.73 |
We reviewed the factor loadings to identify a common theme for each factor. Tables 10 and 11 display the theme for each factor (Factors), the survey items that loaded onto each factor (Survey Items), the factor loadings for each survey item (Factor Loadings), and the connection of each survey item to the QM General Standards (Connection to QM Standards) for learning and engagement, respectively. Items that loaded on multiple factors were assigned to the factor with the highest loading. For the first EFA, two factors contained only one survey item at or above the 0.35 threshold and were eliminated from the factor solution, resulting in a 10-factor solution for “online learning.” The factors for online learning were: (1) expectations clearly communicated (9 items), (2) student support and accessibility (8 items), (3) introductions and information sharing (4 items), (4) learner-centeredness (3 items), (5) active learning and achievement (4 items), (6) measurability and alignment (4 items), (7) course orientation (3 items), (8) instructional materials (2 items), (9) assessment variety to support objectives (3 items), and (10) learning materials, grading, and assessment (3 items).
Table 10
Factors and Factor Loadings for the QM Items’ Impact on Learning
| Factors | Survey Items | Factor Loadings | Connection to QM Standards |
|---|---|---|---|
| Factor 1: Expectations clearly communicated | 1.3 | 0.71 | Course Overview and Introduction |
| | 1.4 | 0.69 | Course Overview and Introduction |
| | 1.7 | 0.55 | Course Overview and Introduction |
| | 3.6 | 0.65 | Assessment and Measurement |
| | 4.3 | 0.74 | Instructional Materials |
| | 6.1 | 0.52 | Course Technology |
| | 6.3 | 0.52 | Course Technology |
| | 6.4 | 0.74 | Course Technology |
| | 7.2 | 0.70 | Learner Support |
| Factor 2: Student support and accessibility | 7.1 | 0.69 | Learner Support |
| | 7.3 | 0.71 | Learner Support |
| | 7.4 | 0.60 | Learner Support |
| | 8.2 | 0.50 | Accessibility and Usability |
| | 8.3 | 0.80 | Accessibility and Usability |
| | 8.4 | 0.90 | Accessibility and Usability |
| | 8.5 | 0.90 | Accessibility and Usability |
| | 8.7 | 0.63 | Accessibility and Usability |
| Factor 3: Introductions and information sharing | 1.6 | 0.43 | Course Overview and Introduction |
| | 1.8 | 0.79 | Course Overview and Introduction |
| | 1.9 | 0.88 | Course Overview and Introduction |
| | 4.2 | 0.58 | Instructional Materials |
| Factor 4: Learner-centeredness | 2.3 | 0.57 | Learning Objectives (Competencies) |
| | 5.3 | 0.81 | Learning Activities and Learner Interaction |
| | 5.4 | 0.88 | Learning Activities and Learner Interaction |
| Factor 5: Active learning and achievement | 3.1 | 0.66 | Assessment and Measurement |
| | 5.1 | 0.88 | Learning Activities and Learner Interaction |
| | 5.2 | 0.77 | Learning Activities and Learner Interaction |
| | 6.2 | 0.51 | Course Technology |
| Factor 6: Measurability and alignment | 2.2 | 0.63 | Learning Objectives (Competencies) |
| | 2.4 | 0.60 | Learning Objectives (Competencies) |
| | 2.5 | 0.56 | Learning Objectives (Competencies) |
| | 4.1 | 0.77 | Instructional Materials |
| Factor 7: Course orientation | 1.1 | 0.93 | Course Overview and Introduction |
| | 1.2 | 0.49 | Course Overview and Introduction |
| | 8.1 | 0.54 | Accessibility and Usability |
| Factor 8: Instructional materials | 4.4 | 0.62 | Instructional Materials |
| | 4.5 | 0.86 | Instructional Materials |
| Factor 9: Assessment variety to support objectives | 2.1 | 0.49 | Learning Objectives (Competencies) |
| | 3.4 | 0.66 | Assessment and Measurement |
| | 3.5 | 0.63 | Assessment and Measurement |
| Factor 10: Learning materials, grading, and assessment | 3.2 | 0.47 | Assessment and Measurement |
| | 3.3 | 0.90 | Assessment and Measurement |
| | 4.6 | 0.77 | Instructional Materials |
Table 11
Factors and Factor Loadings for the QM Items’ Impact on Engagement
| Factors | Survey Items | Factor Loadings | Connection to QM Standards |
|---|---|---|---|
| Factor 1: Technology, support, and learning outcomes | 1.5 | 0.75 | Course Overview and Introduction |
| | 1.6 | 0.72 | Course Overview and Introduction |
| | 2.3 | 0.68 | Learning Objectives (Competencies) |
| | 2.4 | 0.53 | Learning Objectives (Competencies) |
| | 2.5 | 0.52 | Learning Objectives (Competencies) |
| | 6.4 | 0.60 | Course Technology |
| | 7.1 | 0.78 | Learner Support |
| | 7.2 | 0.85 | Learner Support |
| | 7.3 | 0.92 | Learner Support |
| | 7.4 | 0.81 | Learner Support |
| | 8.7 | 0.49 | Accessibility and Usability |
| Factor 2: Introduction and course orientation | 1.7 | 0.53 | Course Overview and Introduction |
| | 1.8 | 0.76 | Course Overview and Introduction |
| | 1.9 | 0.50 | Course Overview and Introduction |
| | 2.1 | 0.76 | Learning Objectives (Competencies) |
| | 2.2 | 0.80 | Learning Objectives (Competencies) |
| | 4.6 | 0.78 | Instructional Materials |
| | 5.3 | 0.93 | Learning Activities and Learner Interaction |
| | 5.4 | 0.91 | Learning Activities and Learner Interaction |
| Factor 3: Learning activities, resources, and accessibility | 5.2 | -0.45 | Learning Activities and Learner Interaction |
| | 8.3 | 0.93 | Accessibility and Usability |
| | 8.4 | 0.89 | Accessibility and Usability |
| | 8.5 | 0.90 | Accessibility and Usability |
| | 8.6 | 0.76 | Accessibility and Usability |
| Factor 4: Course navigation | 1.1 | 0.75 | Course Overview and Introduction |
| | 4.2 | 0.74 | Instructional Materials |
| | 8.1 | 0.80 | Accessibility and Usability |
| Factor 5: Assessment and evaluation of performance on learning activities | 3.1 | 0.89 | Assessment and Measurement |
| | 3.3 | 0.70 | Assessment and Measurement |
| | 5.1 | 0.72 | Learning Activities and Learner Interaction |
| Factor 6: Instructional materials and course tools | 4.4 | 0.90 | Instructional Materials |
| | 6.1 | 0.59 | Course Technology |
| | 6.2 | 0.56 | Course Technology |
| Factor 7: Clear guidelines for grading and assessment | 3.2 | 0.70 | Assessment and Measurement |
| | 3.4 | 0.75 | Assessment and Measurement |
| Factor 8: Variety of technology and learning materials to achieve objectives | 4.1 | 0.50 | Instructional Materials |
| | 6.3 | 0.87 | Course Technology |
| Factor 9: Clear communication and guidance | 1.3 | 0.69 | Course Overview and Introduction |
| | 1.4 | 0.60 | Course Overview and Introduction |
| | 3.6 | 0.44 | Assessment and Measurement |
| | 4.3 | 0.57 | Instructional Materials |
| Factor 10: Connecting learning materials, assessments, and feedback | 3.5 | 0.58 | Assessment and Measurement |
| | 4.5 | 0.84 | Instructional Materials |
Likewise, for the second EFA, two factors were eliminated, resulting in a 10-factor solution for “engagement.” The factors for engagement were: (1) technology, support, and learning outcomes, (2) introduction and course orientation, (3) learning activities, resources, and accessibility, (4) course navigation, (5) assessment and evaluation of performance on learning activities, (6) instructional materials and course tools, (7) clear guidelines for grading and assessment, (8) variety of technology and learning materials to achieve objectives, (9) clear communication and guidance, and (10) connecting learning materials, assessment, and feedback.
Discussion
The findings from this study provide insights into graduate students' perceptions of the impact of Quality Matters (QM) standards on their learning and engagement in online courses. The results highlight the significance of certain QM standards, particularly Accessibility and Usability, Assessment and Measurement, and Instructional Materials, in enhancing students' learning experiences. These findings align with existing literature that underscores the importance of clear course design, structured learning activities, and accessible materials in promoting effective online learning (Conklin & Barreto, 2023; Sadaf et al., 2018).
Impact of QM Standards on Learning
The results indicate that students perceive Accessibility and Usability as the most critical General Standard in their learning, with the majority rating it as having "A lot" of impact. This finding aligns with extant research that underscores the significance of accessible course design in fostering an inclusive learning environment. Through the implementation of an accessible course, the instructor mitigates barriers to learning and facilitates equitable learning opportunities (Al-Azawei et al., 2016). The high ratings for specific SRSs, such as course navigation (SRS 8.1) and readability (SRS 8.2), suggest that students value the ease with which they can access and interact with course content. This aligns with the findings of Van Wort et al. (2020), who noted that students highly value the integration of user-friendly technologies in online courses.
These findings underscore the need for instructors and course designers to prioritize accessibility and usability when developing online learning materials. Educators can enhance student engagement and potentially improve learning outcomes by ensuring that course content is easily navigable and readable.
Assessment and Measurement also emerged as a crucial factor, with students particularly valuing clear, descriptive criteria for evaluations (SRS 3.3). This finding reinforces the importance of aligning assessments with learning objectives, a key principle in instructional design that has been shown to improve student learning outcomes (Karunanayaka & Naidu, 2021). The high rating of SRS 4.5, which pertains to varied instructional materials, further supports the idea that diverse and relevant materials contribute significantly to learning, echoing the findings of Gaston and Lynch (2019).
Impact of QM Standards on Engagement
Regarding engagement, the study found that Course Activities and Learner Interaction were highly rated, particularly SRS 5.1, which deals with aligning learning activities with objectives. This finding suggests that students are more engaged when course activities are clearly tied to learning outcomes, which is consistent with previous research highlighting the role of meaningful interaction in online learning (Vlachopoulous et al., 2019). The emphasis on feedback and instructional congruence also points to the importance of transparent and consistent communication between instructors and students, enhancing student motivation and engagement (Conklin & Barreto, 2023).
Alignment with QM Standards
An EFA was used to examine how well student perceptions of the impact of the QM items aligned with the QM standards to which they had been assigned. The retention of 12 factors in both EFAs, accounting for 84.94% of the variance in learning and 92.73% in engagement, underscores the multifaceted nature of QM standards in online education. These results align with prior research suggesting that a broad range of factors contributes to successful online learning environments (Sadaf et al., 2019).
However, the complexity of some survey items, which loaded onto multiple factors, highlights the nuanced and interconnected aspects of QM standards. For example, certain factors included only one or two items, which may suggest the need for more refined or targeted survey items in future research to capture distinct constructs more effectively.
For the learning-related EFA, the final 10-factor solution encompassed critical elements such as clearly communicated expectations, student support, active learning, and the alignment of instructional materials with learning objectives. These factors emphasize the importance of clear communication and robust support systems in fostering an effective learning environment for students. The elimination of two factors due to insufficient item loadings suggests that while these aspects may be important, they were not strongly represented in the data, potentially indicating areas for improvement in the survey design or the need for further exploration.
Similarly, the engagement-related EFA also resulted in a 10-factor solution, highlighting essential components like technology integration, course orientation, and assessment variety. The emergence of factors related to technology and communication reinforces the idea that engagement in online learning is deeply intertwined with the tools and methods used to facilitate interaction and feedback. The removal of two factors in this analysis suggests that certain engagement elements may require more robust measurement tools or alternative approaches to better capture their impact.
Overall, these findings contribute to a deeper understanding of how QM standards influence both learning and engagement in online courses. The identified factors provide a framework for educators and instructional designers to focus on specific aspects of course design that are most likely to enhance student outcomes. Future research should continue to refine these factors and explore their applicability across different educational contexts, considering the limitations observed in this study, such as the complex and overlapping nature of some survey items.
Implications for Instructional Design
The results of this study have important implications for instructional designers and educators aiming to improve the quality of online courses. The strong emphasis on Accessibility and Usability suggests that ensuring all students can easily navigate and access course materials should be a priority in course design. Incorporating universal design (UD) principles is essential for creating a rich rhetorical user experience for diverse populations (Hitt, 2018). Instructors should consider seven usability criteria: Information Quality, System Navigation, System Learnability, Visual Design, Instructional Assessment, System Interactivity, and Responsiveness (Vlasenko et al., 2020; Vlasenko et al., 2023). These criteria can guide the development of online courses and ensure a more engaging and effective learning environment. To improve accessibility and usability, instructors should focus on combining instructional design models with broader theoretical frameworks (Abuhassna & Alnawajha, 2023). Additionally, they should consider implementing universal instructional design (UID) principles tailored to distance education, such as providing multiple means of representation, expression, and engagement (Elias, 2010).
The importance of interactive and engaging activities in online learning environments cannot be overstated. These activities serve multiple purposes, including fostering student motivation, promoting active learning, and enhancing knowledge retention. By incorporating diverse interactive elements such as discussion forums, group projects, virtual simulations, and gamified learning experiences, instructors can create a dynamic and stimulating online classroom that encourages students to actively participate in their own learning process. Such activities not only break the monotony of passive content consumption but also provide opportunities for students to apply theoretical concepts to practical scenarios, thereby deepening their understanding of the subject matter.
The lower ranking of learner support by graduate students may be attributed to their increased self-reliance and academic maturity compared to undergraduate students. Graduate students typically have already completed a bachelor's degree and have developed essential study skills, time management techniques, and research abilities. This prior experience in higher education equips them with a stronger foundation for independent learning and problem-solving, reducing their need for extensive support services.
Furthermore, graduate students often exhibit higher levels of intrinsic motivation and commitment to their chosen field of study. They have likely selected their graduate program based on specific career goals or research interests, leading to a more focused and self-driven approach to their education. This increased motivation, combined with their prior academic experience, may result in graduate students feeling more confident in navigating the challenges of their programs without relying heavily on external support systems. However, it is important to note that while graduate students may require less overall support, they may still benefit from specialized assistance in areas such as advanced research methodologies, professional development, or specific technical skills relevant to their field of study. As online learning continues to grow, these insights can guide the development of courses that not only meet quality standards but also resonate with students' needs and preferences.
Conclusion
This study offers insights into graduate students' perceptions of the impact of Quality Matters (QM) standards on their learning and engagement in online courses. The findings underscore the critical role of specific QM standards, particularly Accessibility and Usability, Assessment and Measurement, and Instructional Materials, in enhancing the online learning experience. The exploratory factor analyses revealed distinct factor structures that clarify the multifaceted nature of QM standards, contributing to a deeper understanding of how these standards influence both learning and engagement.
The emphasis on Accessibility and Usability highlights the necessity of ensuring that course materials are not only accessible but also easy to navigate, reinforcing the importance of inclusive course design in promoting effective learning. The significance of clear alignment between learning objectives, assessments, and instructional materials, as well as the value of interactive and engaging activities, suggests that instructional designers and educators must prioritize these elements to foster a supportive and engaging learning environment.
Moreover, the results point to the importance of ongoing refinement in survey design and measurement tools to better capture the complex and interconnected aspects of QM standards. Future research should continue to explore these factors in diverse educational contexts, with a focus on addressing the limitations identified in this study.
In conclusion, the insights gained from this research provide a robust framework for educators and instructional designers to enhance the quality of online courses. By focusing on the QM standards most valued by students, such as accessibility, usability, and alignment of instructional materials, educators can better meet the evolving needs of online learners and contribute to their academic success.
Limitations and Future Research
This study is subject to several limitations that should be considered when interpreting the findings. First, students were not enrolled in courses that were formally Quality Matters (QM) certified. As a result, the extent to which the QM standards were implemented may vary, potentially influencing students’ perceptions and limiting the generalizability of the results to QM-certified programs. Second, the sample size was relatively small, comprising only 42 graduate students from three online programs. This limited sample may not adequately represent the diverse population of online learners, thereby restricting the ability to generalize the findings across different contexts and programs. This limitation may be of particular concern when interpreting the findings of the EFA. EFA can reveal valuable insights about the structure of the data, and it can yield reliable results when the data are well-conditioned (i.e., when the factors are clearly defined or few in number, or when the number of variables is high) (de Winter et al., 2009). However, the small sample size limits the generalizability of the study findings, and the factor solutions may not be applicable to all contexts. Third, the participants lacked a formal educational background in instructional design or related fields, which may have affected their understanding of the QM Rubric items. The use of specialized terminology or jargon within the survey could have led to misinterpretations, potentially impacting the accuracy of the students’ responses. Finally, another potential limitation is response bias, as students who chose to respond may have had stronger opinions, positive or negative, about their online learning experiences. Social desirability bias may also have influenced responses, particularly in the open-ended questions. To mitigate bias, the survey was anonymous, and the researchers used independent coding with interrater reliability checks to reduce subjectivity in qualitative data interpretation.
To address these limitations and enhance the robustness of future research in this area, several strategies could be employed. Future studies should aim to include larger and more diverse samples to better represent the online learning population and improve generalizability. Incorporating QM-certified courses into the study design would provide a more standardized basis for evaluating the implementation of QM standards and their impact on student perceptions. Researchers should also consider providing clear definitions or explanations of QM terminology to improve participants' comprehension and the reliability of their responses. Furthermore, exploring specific design elements that contribute most significantly to students' perceptions of accessibility and usability in online learning environments could yield valuable insights for improving online course design and delivery. By addressing these limitations and expanding the scope of investigation, future research can build upon this study's findings to provide more comprehensive and actionable insights into the effectiveness of QM standards in online education.
Declarations
This work was supported by the Cahill Grant from the University of North Carolina Wilmington.
The authors report there are no competing interests to declare.
References
Abuhassna, H., & Alnawajha, S. (2023). Instructional design made easy! Instructional design models, categories, frameworks, educational context, and recommendations for future work. European Journal of Investigation in Health, Psychology and Education, 13(4), 715–735. https://doi.org/10.3390/ejihpe13040054
Al-Azawei, A., Serenelli, F., & Lundqvist, K. (2016). Universal Design for Learning (UDL): A content analysis of peer reviewed journals from 2012 to 2015. Journal of the Scholarship of Teaching and Learning, 16(3), 39-56. https://doi.org/10.14434/josotl.v16i3.19295
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Online Learning, 5(2). https://doi.org/10.24059/olj.v5i2.1875
Barreto, D., Oyarzun, B., & Conklin, S. (2022). Integration of cooperative learning strategies in online settings. E-Learning and Digital Media, 19(6), 1-21. https://doi.org/10.1177/20427530221104187
Bento, R. F., & White, L. F. (2010). Quality measures that matter. Issues in Informing Science and Information Technology, 7, 61-72.
Bogle, L., Cook, V., Day, S., & Swan, K. (2009). Blended program development: Applying the Quality Matters and Community of Inquiry frameworks to ensure high quality design and implementation. Journal of the Research Center for Educational Technology, 5(2).
Bolliger, D. U., & Martin, F. (2018). Instructor and student perceptions of online student engagement strategies. Distance Education, 39(4), 568–583. https://doi.org/10.1080/01587919.2018.1520041
Brown, J. D. (2009). Choosing the right type of rotation in PCA and EFA. JALT Testing and Evaluation Newsletter, 13(3), 20-25. https://teval.jalt.org/test/PDF/Brown31.pdf
Conklin, S., & Barreto, D. (2023). Student perceptions of Quality Matters standards on engagement and learning. American Journal of Distance Education, 1–15. https://doi.org/10.1080/08923647.2023.2242734
Conklin, S., & Garrett Dikkers, A. (2022). Use the FORCE to create sociability and connect with online students. The Journal of Applied Instructional Design, 11(2). https://dx.doi.org/10.51869/112/scagd
Conklin, S., Morgan, Z., Easow, G., & Hanson, E. (2020). Impact of QM professional development on course design and student evaluations. Journal of Educators Online, 17(2).
Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education, 44(1), 25-36. https://doi.org/10.1080/02602938.2018.1467877
David, T., & Frederick, T. V. (2020). The impact of multimedia in course design on students’ performance and online learning experience: A pilot study of an introductory educational computing course. Online Learning Journal, 24(3), 147–162. https://doi.org/10.24059/olj.v24i3.2069
de Winter, J. C., Dodou, D., & Wieringa, P. A. (2009). Exploratory factor analysis with small sample sizes. Multivariate Behavioral Research, 44(2), 147–181. https://doi.org/10.1080/00273170902794206
Dickison, C. (2021). Reducing cognitive load: Applying the Community of Inquiry (CoI) framework to LMS discussion boards. Nineteenth-Century Gender Studies, 17(1).
Edge, C., Monske, E., Boyer-Davis, S., Vanden Avond, S., & Hamel, B. (2022). Leading university change: A case study of meaning-making and implementing online learning quality standards. The American Journal of Distance Education, 36(1), 53-69. https://doi.org/10.1080/08923647.2021.2005414
Elias, T. (2010). Universal instructional design principles for Moodle. The International Review of Research in Open and Distributed Learning, 11(2), 110. https://doi.org/10.19173/irrodl.v11i2.869
Fedynich, L., Bradley, K. S., & Bradley, J. (2015). Graduate students’ perceptions of online learning. Research in Higher Education Journal, 27, 1–13.
Garrison, D. R. (2016). E-learning in the 21st century: A community of inquiry framework for research and practice. Routledge.
Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. https://doi.org/10.1016/s1096-7516(00)00016-6
Gaston, T., & Lynch, S. (2019). Does using a course design framework better engage our online nursing students? Teaching and Learning in Nursing, 14(1), 69–71. https://doi.org/10.1016/j.teln.2018.11.001
Guttman, L. (1954). Some necessary conditions for common factor analysis. Psychometrika, 19, 149-161. https://doi.org/10.1007/bf02289162
Hitt, A. (2018). Foregrounding accessibility through (inclusive) Universal Design in professional communication curricula. Business and Professional Communication Quarterly, 81(1), 52–65. https://doi.org/10.1177/2329490617739884
Irwin, V., De La Rosa, J., Wang, K., Hein, S., Zhang, J., Burr, R., … Parker, S. (2022). Report on the condition of education 2022 (NCES 2022-144). National Center for Education Statistics.
Kaiser, H. F. (1960). The application of electronic computers to factor analysis. Educational and Psychological Measurement, 20, 141-151. https://doi.org/10.1177/001316446002000116
Karunanayaka, S. P., & Naidu, S. (2021). Impacts of authentic assessment on the development of graduate attributes. Distance Education, 42(2), 231-252. https://doi.org/10.1080/01587919.2021.1920206
Konstantinidou, A., & Nisiforou, E. (2022). Assuring the quality of online learning in higher education: Adaptations in design and implementation. Australasian Journal of Educational Technology, 38(4), 127-142. https://doi.org/10.14742/ajet.7910
Lee, J., Song, H. D., & Hong, A. J. (2019). Exploring factors, and indicators for measuring students’ sustainable engagement in e-learning. Sustainability, 11. https://doi.org/10.3390/su11040985
Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal, 9(2), 97–102.
Legon, R. (2015). Measuring the impact of the Quality Matters Rubric™: A discussion of possibilities. American Journal of Distance Education, 29(3), 166–173. https://doi.org/10.1080/08923647.2015.1058114
Legon, R., & Runyon, J. (2007, August). Research on the impact of the Quality Matters course review process. Paper presented at the 23rd Annual Conference on Distance Teaching & Learning. Madison, WI.
Lynch, S., & Gaston, T. (2020). Quality Matters impact on student outcomes in an online program. Journal of Educators Online, 17(2).
Markowitz, S. S. (2021). Implementing a community of inquiry professional development course to improve adjunct facilitation of online courses (Order No. 28494852). Available from ProQuest Dissertations & Theses Global. (2530047336). Retrieved from https://www.proquest.com/dissertations-theses/implementing-community-inquiry-professional/docview/2530047336/se-2
Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education, 29(4), 531–563. https://doi.org/10.1177/1052562904271199
Martin, F., Ritzhaupt, A., Kumar, S., & Budhrani, K. (2019). Award-winning faculty online teaching practices: Course design, assessment and evaluation, and facilitation. The Internet and Higher Education, 42, 34-43. https://doi.org/10.1016/j.iheduc.2019.04.001
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Sage.
Mykota, D. (2018). The effective affect: A scoping review of social presence. International Journal of E-Learning & Distance Education, 33(2). https://www.ijede.ca/index.php/jde/article/view/1080
National Center for Education Statistics. (2022). Postbaccalaureate Enrollment. Condition of Education. U.S. Department of Education, Institute of Education Sciences. Retrieved May 31, 2022, from https://nces.ed.gov/programs/coe/indicator/chb.
O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education, 68(1), 1–14. https://doi.org/10.1007/s10734-013-9663-3
Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education, 19, 27–35. https://doi.org/10.1016/j.iheduc.2013.08.001
Rice, M. F., & Ortiz, K. R. (2021). Evaluating digital instructional materials for K-12 online and blended learning. TechTrends, 65(6), 977–992. https://doi.org/10.1007/s11528-021-00671-z
Roig, C. (2021, March 23). Understanding the QM rubric. FIU Online Insider. https://insider.fiu.edu/understanding-the-qm-rubric/
Sadaf, A., Martin, F., & Ahlgrim-Delzell, L. (2019). Student perceptions of the impact of Quality Matters-certified online courses on their learning and engagement. Online Learning, 23(4), 214–233. https://doi.org/10.24059/olj.v23i4.2009
Shattuck, K., Zimmerman, W. A., & Adair, D. (2014). Continuous improvement of the QM rubric and review processes: Scholarship of integration and application. Internet Learning, 3(1), 5. https://doi.org/10.18278/il.3.1.3
Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education, 20(1), 29.
Vlasenko, K., Sitak, I., Bobyliev, D., Lovianova, I., Volkov, S., Semerikov, S., Danylchuk, H., Sakhno, S., Osadchyi, V., Bondarenko, O., Nechypurenko, P., Striuk, A., Chukharev, S., Vakaliuk, T., & Solovieva, V. (2020). Usability analysis of on-line educational courses on the platform “Higher school mathematics teacher.” E3S Web of Conferences, 166, 10012. https://doi.org/10.1051/e3sconf/202016610012
Vlasenko, K. V., Bohdanova, N. H., Volkov, S. V., Sitak, I. V., Lovianova, I. V., & Chumak, O. O. (2023). Exploring usability principles for educational online courses: A case study on an open platform for online education. Educational Technology Quarterly, 2023(2), 173–187. https://doi.org/10.55056/etq.602
You, J., Hochberg, S. A., Ballard, P., Xiao, M., & Walters, A. (2014). Measuring online course design: A comparative analysis. Internet Learning, 3(1), 35–52. https://doi.org/10.18278/il.3.1.4