Analysis of Associated Factors that Influence the Accessibility of Online Higher Education

DOI:10.59668/722.13021
Accessibility, Higher Education, Exploratory Factor Analysis, Online Education, Distance Education
Online distance education is one of the fastest-growing sectors of postsecondary enrollment, and accessibility is becoming a more prominent issue. This study used a descriptive quantitative survey methodology to explore the characteristics of institutions and individuals who are responsible for implementing accessibility. Overall, the findings indicate growth in the implementation of accessible course design practices. The results revealed the association of several factors focused on institutional accessibility support and accessibility compliance support. Although no models or inferences can be made from these associations, they do suggest that institutional accessibility practices may have a key role in accessible online course design.

Introduction

Online distance education is one of the fastest-growing sectors of postsecondary enrollment. As more students take advantage of these opportunities, online course content has increasingly been found to be inaccessible to students with disabilities. As accessibility case law quickly changes the expectations for online course content, postsecondary institutions struggle to shift to a proactive and systemic approach to accessible design practices.

Although there have been calls for institutions to take a holistic, coordinated, and collaborative approach to accessibility that is founded on pedagogical practices such as Universal Design for Learning, research on how institutional procedures may be associated with, and to what extent they may contribute to, the current state of accessibility practices has not been conducted (Ascough, 2002; Galusha, 1998; Linder et al., 2015; Online Learning Consortium [OLC] & WICHE Cooperative for Educational Telecommunications [WCET], 2019; Rowland et al., 2014). Accessible online education is essential to supporting students with disabilities and ensuring equity in the opportunity to benefit from higher education.

Literature Review

Often designated as a nontraditional pathway to instruction, distance education programs have experienced a steady increase in enrollments throughout the last 20 years (Allen et al., 2016; Seaman et al., 2018). Distance education is defined as instruction delivered through one or more technologies to students separated from their instructors, while online education is described as instruction facilitated in an online environment (Seaman et al., 2018). The internet has made online education the fastest-growing sector of and primary format for distance education (Carlsen et al., 2016; Scagnoli, 2001).

Online Education as a System

Systems consist of interconnected elements structured to achieve a specific goal (Meadows, 2008). Systems thinking is defined as an approach to determining the relationships between various factors by using a holistic rather than a component view (Reynolds & Holwell, 2010). By using systems thinking to understand the connections within a system, it is possible to engage in a process of assessing what the system was designed to do and how to make changes to reach a different outcome (Stroh, 2015). As a system, online education is an environment that can be organized into several subsystems and be affected by various large-scale forces (Moore & Kearsley, 2012; Tamim, 2020), as shown in Figure 1. As a system that can evolve with its environment (Reigeluth, 2019), online education can respond to conditions that can redefine educational processes (Trilling & Hood, 1999). Eliminating disadvantages is a key method for promoting equity and social justice (Robinson, 2008), which means that redesigning learning environments to be more accessible supports parity for learners.

Figure 1

Representation of Online Education as a System


The physical, virtual, social, political, and economic environments exert influence on the online educational system. The online education system consists of the macrolevel, mesolevel, and microlevel. Aspects of course delivery, course development, curriculum, and support can be placed within each level.



Note. This figure depicts the conceptual elements of distance education systems, particularly online ones, and how they are related. From “Ableism Versus Inclusion: A Systems View of Accessibility Practices in Online Higher Education,” by R. Fennelly-Atkinson, 2022, in Toward Inclusive Learning Design: Social Justice, Equity, and Community, edited by B. Hokanson, M. Exter, M. Schmidt, and A. Tawfik. Copyright 2022 by Rita Fennelly-Atkinson. Used with permission.

1 Concept from “Distance education: A systems view of online learning” by M. Moore and G. Kearsley, 2012.

2 Concepts from “Improving completion in online education systems: An application of systems thinking” by H. Hemphill, L. Hemphill, and R. Runquist, 2019, Learning, Design, and Technology, doi: 10.1007/978-3-319-17727-4_104-1.

3 Concepts from “Analyzing the complexities of online education systems: A systems thinking perspective” by S. Tamim, 2020, TechTrends, doi: 10.1007/s11528-020-00538-9.

Examining Accessibility within Online Education Systems

The historical roots of distance education and disability have led to the current state of accessibility within online higher education. Accessibility is defined as the ability to access and make use of websites, hardware, software, technology, and content that has been designed in a way that allows people to perceive, understand, navigate, interact with, and contribute to online content and environments (Culp et al., 2005; Huss & Eastep, 2016; Web Accessibility Initiative & World Wide Web Consortium, 2019). Currently, many U.S. students with disabilities receive accommodations to support their ability to access and make use of the instruction and educational environments (U.S. Government Accountability Office [U.S. GAO], 2009).

However, this practice is based on the medical model of disability and has posed various barriers at the institutional and individual levels (Andrews et al., 2019; Black et al., 2015; Bogart & Dunn, 2019; Cole & Cawthon, 2015; Grimes et al., 2017; Lindsay et al., 2018; Roberts et al., 2011; Sarrett, 2018; Siebers, 2013; Thompson-Ebanks & Jarman, 2018; Toutain, 2019; U.S. GAO, 2009). The field of disability studies has shifted from a medical model to a social model that normalizes accessibility through universal design (Bogart & Dunn, 2019), a shift that influences how society fundamentally views and treats disability in any environment.

Elements Affecting Accessibility Practices

Although physical access has long been established as a disability right, web accessibility emerged with the internet (Catalano, 2014). The first version of the Web Content Accessibility Guidelines (WCAG) was published in 1999 and has guided web development and accessibility practices since, eventually becoming an enforceable component of disability legislation (Kingman, 2018; Kuykendall, 2017; U.S. General Services Administration, 2017). Experts warn that postsecondary institutions should be concerned about accessibility, as most websites and online courses are reported to be noncompliant with current guidelines (Carnevale, 2005; Roig-Vila et al., 2014). Despite conflicting case law, the general consensus is that postsecondary institutions are expected to ensure that their online content and learning environments are accessible (Burke et al., 2016; Cullipher, 2017; Iglesias et al., 2014; McAfee & Taft, 2019; OLC & WCET, 2019). Students also have a growing expectation that instruction should be designed to meet the diverse needs of learners with different abilities and learning preferences (Quinlan et al., 2012).

Previous research regarding online higher education has been limited to identifying how courses may be inaccessible, who is responsible for course accessibility, and whether practitioners have sufficient knowledge of compliance guidelines (Frey & King, 2011; K. C. Green, 2010a, 2010b, 2019; Huss & Eastep, 2016; OLC & WCET, 2019; WebAIM, 2014). These studies have collectively attributed inaccessibility to the general practices of institutions and designers; however, attributions of responsibility fluctuate due to the high variability in how accessibility is addressed. Although there have been calls for institutions to take a holistic, coordinated, and collaborative approach to accessibility founded on pedagogical practices such as Universal Design for Learning (UDL), research on how institutional practices may be associated with accessibility has not been undertaken to date (Ascough, 2002; Linder et al., 2015; OLC & WCET, 2019). While the literature has identified barriers such as cost, training, policies and procedures, time for the implementation of accessible design practices, and the range of stakeholders involved, the associations between these factors and the extent to which they may contribute to the current state of accessibility practices have not been examined (Galusha, 1998; Linder et al., 2015; Rowland et al., 2014). Therefore, the primary significance of this study was to collect current information about the characteristics and accessibility practices of the institutions and designers delivering online higher education courses.

Research Questions

The purpose of this study was to explore the relationship between various characteristics that may contribute to accessibility knowledge, practices, and support in higher education online courses. Characteristics and practices pertaining to institutions refer to consistent and clearly defined organization-wide systems and processes, while those concerning designers denote the practices and routines undertaken by individuals. The research questions addressed in this study relate to both institutional and individual designer factors:

  1. What are the characteristics of designers who are responsible for implementing accessibility in higher education online courses?

  2. What are the accessibility practices used by designers in higher education online courses?

  3. What are the characteristics of higher education institutions that offer online courses?

  4. What are the accessibility practices of higher education institutions that offer online courses?

  5. What are the associations (factor structure) amongst the surveyed characteristics and accessibility practices?

    1. To what extent do the practices of online course designers contribute to the identified factor structure?

    2. To what extent do the practices of the institution contribute to the identified factor structure?

Methods

This study used a descriptive quantitative survey research design. A descriptive research methodology is useful for exploring multiple variables and determining whether there are any correlations while using quantitative methods (Knupfer & McLellan, 2001). This section will describe the participants, instrumentation, data collection methods, data analysis procedures, and limitations.

Sample and Survey Distribution

Quantitative data were collected from designers, a group that included faculty, instructional designers, and professionals who provide accessibility support for online courses in U.S. higher education. These respondents reported on their own practices and characteristics and those of their institutions. Survey responses were collected in two consecutive 30-day phases through the distribution of a survey link. Phase 1 targeted designers belonging to the Association for Educational Communications and Technology through internal calls for participation, yielding 26 submissions. Phase 2 targeted designers in public social media spaces such as Facebook and Twitter using hashtags related to instructional design and accessibility, yielding 47 submissions. At the end of the 60-day window, a total of 73 surveys had been received. Four submissions were deleted due to incompleteness and seven due to respondent ineligibility, resulting in a final sample of 62 (N = 62).

While this study aimed for a sample size of 300, there was an increased probability of survey fatigue due to constraints on data collection during COVID-19 (United Nations High Commissioner for Refugees, 2020). Based on the literature, the smallest allowable sample size for this study was 58, derived from an observation-to-variable ratio of 2:1 (29 Likert-scale variables × 2 = 58). Ratios as low as 2:1 have been adequate in certain cases (de Winter et al., 2009; Guadagnoli & Velicer, 1988). While there are a host of considerations in determining an adequate sample size, Whitley (2002) suggested that sample sizes as small as 23 can detect large effects.

Research has found that many EFA sample size recommendations produce inconsistent results (Guadagnoli & Velicer, 1988). Moreover, Guadagnoli and Velicer (1988) reported that while any sample size choice can be supported in the literature, the adequacy of the sample is determined by the conditions used in the study. Because the sample size impacted key study parameters, all data analysis decisions were reported and aligned with recommendations from the literature.

Instrumentation

This study employed a descriptive quantitative methodology with a web-based survey consisting of 41 closed-ended and write-in questions. A survey is an effective tool for measuring attitudes and perceptions that can be statistically analyzed (Cohen et al., 2011). Anonymous and self-administered, web-based surveys also increase the validity and level of factual reporting by respondents (Callegaro et al., 2015; Fowler, 2002).

The survey instrument used in this study was adapted from the Quality Matters™ Accessibility Survey by Frey and King (2011) and the survey measuring faculty awareness of accessibility used by Huss and Eastep (2016). The instrument was organized into seven sections which progressed from general descriptive information about the participant and institution to specific information about accessibility practices for each, as shown in Table 1.

Table 1

Distribution of Survey Questions by Thematic Section

| Survey Section | Number of Questions |
|---|---|
| Background information (participant) | 8 |
| Information about the institution | 3 |
| Institutional practices (accessibility) | 4 |
| Institutional responsibility for online instructional content | 9 |
| Institutional training and support | 5 |
| Individual practices (accessibility) | 8 |
| Limitations | 4 |
| Total | 41 |

Adaptation of Survey Instrument

There are many benefits of using an adapted instrument based on existing surveys. According to Fowler (2002), adopting survey questions used in previous studies provides the opportunity to collect comparable data across different samples, which can aid in generalizing results. Because the instruments have already been tested, there is a higher level of confidence in the validity of the questions and quality of data even when these questions are adapted (Hyman et al., 2006).

The question format and wording were modified to ensure unidimensionality and were presented in a standardized format (Cohen et al., 2011; Fowler, 2002). To achieve this, questions were reviewed to ensure clarity and that only one variable was addressed at a time (Cohen et al., 2011; Fowler, 2002; Friedman & Amoo, 1999). Question response formats were also adapted to a 5-point Likert-scale or write-in format as appropriate. The unipolar, ratio-data, Likert-scale response format used in the survey was designed to increase the consistency of respondents’ interpretation of meaning, eliminate forced-choice responses, use an equal scale that excluded extreme language, and align scale labels to a numerical scale (Cohen et al., 2011; Fowler, 2002; Friedman & Amoo, 1999; Schwarz et al., 1991).

The survey instrument was pretested in the form of expert reviews and field testing. This instrument review also provided the opportunity to identify potential areas of bias (Fowler, 2002; Groves et al., 2009; Ornstein, 2013). Any issues of concern were addressed prior to the dissemination of the instrument to the target population.

Instrument Validity and Reliability Analysis

Validity refers to an instrument’s ability to represent what it was designed to measure (Cohen et al., 2011; Kimberlin & Winterstein, 2008). Internal, content, face, construct, and external validity were established using methods described in the literature (Cohen et al., 2011). Validity was primarily established through the survey design, which included ensuring anonymity, question adaptation and presentation, clear instructions and language, and configuration of the Likert scale (Chyung et al., 2018; Fowler, 2002; Hyman et al., 2006; Menold & Bogner, 2016; Nardi, 2014).

Reliability is established when an instrument achieves consistent results over time, which indicates that the instrument has obtained internal consistency (Cohen et al., 2011; Kimberlin & Winterstein, 2008; Taherdoost, 2016). Various survey design measures were used to promote reliability, including question design, clear instructions and language, and Likert scale construction (Bastos et al., 2014; Chyung et al., 2017, 2018; Cohen et al., 2011; Fowler, 2002; Krosnick & Berent, 1993; Menold & Bogner, 2016; Nardi, 2014; Schwarz et al., 1991). Internal consistency was assessed using Cronbach’s alpha (α), which is considered particularly appropriate for Likert scales (Cohen et al., 2011; Field, 2013; Kimberlin & Winterstein, 2008; Whitley, 2002). While the minimum acceptable coefficient is usually .70 or higher, a Cronbach’s α of .60 or above is allowable for exploratory studies (Straub et al., 2004). As reported in Table 2, an α coefficient of .78 was obtained for all Likert-scale items.
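To make the reliability computation concrete, the following minimal sketch shows how Cronbach’s α can be computed directly from an item-response matrix using the standard formula. This is an illustration only, not the SPSS procedure used in the study; the response matrix is a hypothetical set of numerically coded Likert responses with missing values already removed.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) response matrix.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total score))
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses from five participants on four Likert items
responses = np.array([
    [5, 4, 5, 4],
    [3, 3, 4, 3],
    [4, 4, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 4],
])
print(round(cronbach_alpha(responses), 2))
```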

Table 2

Instrument Reliability Established Via Cronbach's Alpha

| Likert scale items | α | n (items) | n (valid cases) |
|---|---|---|---|
| All | .78 | 29 | 20 |
| Institutional practices | .63 | 4 | 45 |
| Institutional responsibility for online instructional content | .71 | 5 | 38 |
| Institutional responsibility for accessibility review | .63 | 3 | 45 |
| Institutional training and support | .84 | 5 | 41 |
| Individual practices | .80 | 8 | 55 |
| Limitations | .78 | 4 | 56 |

Note. Cronbach’s α was computed using Likert-scale survey item data that excluded the responses of “Don’t Know” and “Not Applicable.”

Construct Alignment to Research Questions

Accessibility practices consisted of information about habitual procedures adopted by institutions or designers. These constructs were based on frequently identified factors in the literature regarding accessibility within online higher education, and the questions included in this survey had been previously addressed in the literature in various capacities, as shown in Figure 2.

Figure 2

Constructs (Factors) Considered in This Study



Note. MC = multiple choice; WI = write-in; LS = Likert scale. These factors represent 39 of the 41 questions used in the final survey. The two questions not represented here were filter questions used to determine survey eligibility and were not considered in the analysis.

Data Collection Methods

No identifying information about participants’ locations, institutions, or identities was collected during the survey. This study was conducted during the summer of 2021, and participants were asked to respond based on their experiences in the previous term. Respondents answered demographic and filter questions to determine their eligibility for participation.

Data Analysis Procedures

This study was approved by the IRB prior to data collection, which was conducted through the Qualtrics online survey platform. The quantitative data were entered into Statistical Package for the Social Sciences (SPSS), Version 26.0 for Mac, and prepared for statistical analysis. Descriptive statistics were generated for survey data related to research questions 1–4.

For research question 5, the data were analyzed using an exploratory factor analysis (EFA). The assumptions for an EFA, including normality, linearity, sampling adequacy, and factorability, were checked (Cohen et al., 2011). During this analysis, Cronbach’s alpha (α) was calculated to determine the reliability of the survey (Field, 2013). Suitability for factorization was tested using Bartlett’s test of sphericity and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy (Cohen et al., 2011).
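As an illustration, these two suitability checks can be reproduced with the open-source factor_analyzer Python package; the study itself used SPSS, and the file name and DataFrame below are hypothetical stand-ins for the prepared Likert-item data.

```python
import pandas as pd
from factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical prepared data: numeric Likert responses with
# "Don't know" / "Not applicable" answers already removed.
df = pd.read_csv("likert_items.csv")

# Bartlett's test of sphericity: a significant result (p < .05) indicates
# the correlation matrix is not an identity matrix, so factoring is viable.
chi_square, p_value = calculate_bartlett_sphericity(df)

# Kaiser-Meyer-Olkin measure: overall values of roughly .6 or higher are
# conventionally treated as adequate for factor analysis.
kmo_per_item, kmo_overall = calculate_kmo(df)

print(f"Bartlett: chi2 = {chi_square:.2f}, p = {p_value:.4f}")
print(f"KMO overall = {kmo_overall:.3f}")
```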

Once assumptions were met, an EFA, a statistical technique for identifying associations between clusters of constructs, was run (Field, 2013; Hoyle & Duvall, 2011). While there are no strict guidelines for EFA sample sizes, the adequacy of the sample is heavily dependent on other elements of the analysis, such as communalities, factor loadings, and cross-loadings (Costello & Osborne, 2005). Given the small sample size, the analysis adhered to stricter guidelines for acceptable communalities, factor loadings, and number of factors (de Winter et al., 2009). The results of the initial EFA determined the data analysis procedure, as shown in Figure 3 (Osborne et al., 2008). Because the factors failed to load after the initial EFA, a principal component analysis was used as a data reduction method to eliminate problematic variables (Costello & Osborne, 2005; Osborne et al., 2008). The results from the initial analysis were used to assess the integrity of the data and analysis.

Figure 3

Determination of Analysis Procedure Based on Preliminary Data Analysis



With assumptions met, an initial EFA using principal axis factoring (PAF) and a Promax rotation was conducted. When the assumption of multivariate normality cannot be met, an EFA using PAF is recommended (Osborne et al., 2008). In addition to an extraction method, a rotation of factors should also be considered. In the social sciences, correlations between survey items are expected, so oblique rotations, which assume that variables are correlated, are an appropriate choice for EFAs in this field (Osborne et al., 2008). Promax is a commonly used oblique rotation in EFA (Guadagnoli & Velicer, 1988; Watkins, 2018).

Once the factor extraction loaded, the data were analyzed to determine factor structures. One of the primary concerns with EFA is the number of factors to retain in the analysis (Hoyle & Duvall, 2011). Factor retention was based on Kaiser’s criterion and the scree test: only factors with eigenvalues greater than one were considered for the factor structure (Cohen et al., 2011; Kaiser, 1958). These parameters were used to constrain and rerun the analysis with the number of factors suggested by the data (Whitley, 2002).
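A sketch of this extraction-and-retention sequence using the factor_analyzer package is shown below. It mirrors the reported procedure (PAF extraction, Promax rotation, Kaiser’s criterion) but is not the study’s actual SPSS workflow; the file name and DataFrame are hypothetical.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

df = pd.read_csv("likert_items.csv")  # hypothetical prepared data

# Initial extraction: principal axis factoring with an oblique Promax rotation.
fa = FactorAnalyzer(rotation="promax", method="principal")
fa.fit(df)

# Kaiser's criterion: retain factors whose eigenvalues exceed 1
# (a scree plot of `eigenvalues` would support the same decision).
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

# Rerun the analysis constrained to the retained number of factors.
fa = FactorAnalyzer(n_factors=n_factors, rotation="promax", method="principal")
fa.fit(df)

loadings = pd.DataFrame(fa.loadings_, index=df.columns)
communalities = pd.Series(fa.get_communalities(), index=df.columns)
print(loadings.round(3))
```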

Using EFA, the factor structure of the survey items was determined using the PAF procedure shown in Figure 4. For small sample sizes, it is recommended to retain fewer factors, each with at least five factor loadings above .5 in absolute value (Costello & Osborne, 2005; de Winter et al., 2009). Identified factor structures were then reviewed for thematic associations.
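Continuing the sketch above, the |loading| ≥ .5 retention rule can be applied by suppressing smaller rotated loadings; the toy loadings below are hypothetical values used only to show the mechanics.

```python
import pandas as pd

def suppress_loadings(loadings: pd.DataFrame, cutoff: float = 0.5) -> pd.DataFrame:
    """Blank out rotated loadings whose absolute value falls below the cutoff."""
    return loadings.where(loadings.abs() >= cutoff)

# Toy rotated loadings for three items on two factors (hypothetical values)
toy = pd.DataFrame(
    {"Factor 1": [0.71, 0.12, -0.58], "Factor 2": [0.05, 0.66, 0.31]},
    index=["item_a", "item_b", "item_c"],
)
print(suppress_loadings(toy))  # item_b retains only its Factor 2 loading
```

Note that the rule operates on absolute values, so strong negative loadings (such as item_c’s -.58) are retained and interpreted in the opposite direction, as in Table 7.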

Figure 4

Overview of Data Analysis Procedure Using Exploratory Factor Analysis



Limitations

The primary limitation of this study was its small sample size, which required the use of stricter guidelines for running and interpreting factor loadings and structures (de Winter et al., 2009). While smaller sample sizes are more prone to increased error, an EFA is designed to be exploratory and not inferential (Costello & Osborne, 2005). Though the results of this analysis will not be generalizable, they can be used in future studies with a confirmatory analysis to determine generalizability (Costello & Osborne, 2005).

Participants were recruited from an instructional designer professional organization and through social media using hashtags targeting instructional designers and accessibility. Accordingly, survey respondents likely had some familiarity and interest in the topic of accessibility within online education. Further, participation was limited to qualifying U.S. respondents. Therefore, this sample cannot be considered a representative sample.

The researcher was required to avoid collecting identifying information about individual respondents and institutions. As a result, it is possible that multiple respondents were connected to the same institution. There is no method to determine whether this was the case; however, if it did occur, there is the potential for bias within the results.

Results

Given this study's purpose, a descriptive quantitative methodology was selected as the best approach for addressing the research questions. A descriptive analysis was used for the first four research questions and included the reporting of means, medians, standard deviations, and frequencies. The data were subsequently prepared for the EFA, which was run to determine whether any associations between factors existed. This analysis also included the testing of assumptions and suitability for EFA.
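For illustration, the descriptive statistics reported in Tables 3 through 6 (n, min, max, median, mean, SE, SD) could be generated with a few lines of pandas. The file name and response labels below are hypothetical, and “Don’t know”/“Not applicable” responses are treated as missing, consistent with the tables’ notes.

```python
import pandas as pd

raw = pd.read_csv("survey_responses.csv")  # hypothetical export

# Treat non-substantive responses as missing so frequency statistics are
# computed only over the 1-5 Likert responses, per the table notes.
numeric = (
    raw.replace({"Don't know": pd.NA, "Not applicable": pd.NA})
       .apply(pd.to_numeric, errors="coerce")
)

# One row per survey item: valid n, range, median, mean, SE, and SD
summary = numeric.agg(["count", "min", "max", "median", "mean", "sem", "std"]).T
print(summary.round(2))
```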

Designer Characteristics: Research Question 1

The first research question addressed the characteristics of designers who are responsible for implementing accessibility in higher education settings. The applicable survey questions related to gender, age, role, and years of experience. The respondents predominantly identified as female (71%). Respondents had an average age of 44.07 years (SD = 10.22) and an average of 10 years of experience teaching in higher education (SD = 8.22). Participants had taught online for an average of 6.84 years (SD = 6.44) and had spent an average of 5.3 years (SD = 5.00) supporting online higher education programs non-instructionally. It is important to note that these figures do not account for corresponding experience in other settings. Most respondents’ primary role was that of faculty or instructional designer.

Designers’ Accessibility Practices: Research Question 2

The second research question addressed the accessibility practices used by designers in higher education online courses. The applicable survey questions related to designers’ individual accessibility practices and limitations they had experienced.

Individual Accessibility Practices

Course designers reported that they almost always use some type of multimedia in their online courses (M = 4.66, SD = 0.60). Designers often formatted documents with appropriate text styles (M = 4.25, SD = 0.98), used alt text for images (M = 4.08, SD = 1.19), labelled headers in tables (M = 4.09, SD = 1.17), and included captions with videos (M = 3.95, SD = 1.31). However, including transcripts with videos (M = 3.43, SD = 1.62), complying with overall accessibility guidelines (M = 3.57, SD = 1.26), and using accessibility evaluation tools (M = 3.39, SD = 1.52) were only practiced about half the time. Descriptive statistics are shown in Table 3.

Table 3

Mean Frequency of Designers' Individual Accessibility Practices for Online Courses

| Accessibility Practice | Min | Max | Mdn | Mode | M | SE | SD |
|---|---|---|---|---|---|---|---|
| Using multimedia in courses | 3 | 5 | 5.00 | 5.00 | 4.66 | .08 | 0.60 |
| Formatting documents with text styles* | 1 | 5 | 5.00 | 5.00 | 4.25 | .13 | 0.98 |
| Including alt text with images | 1 | 5 | 4.50 | 5.00 | 4.08 | .15 | 1.19 |
| Identifying headers on tables | 1 | 5 | 5.00 | 5.00 | 4.09 | .15 | 1.17 |
| Including closed captions with videos | 1 | 5 | 5.00 | 5.00 | 3.95 | .17 | 1.31 |
| Including transcripts with videos | 1 | 5 | 4.00 | 5.00 | 3.43 | .21 | 1.62 |
| Complying with accessibility guidelines | 1 | 5 | 4.00 | 5.00 | 3.57 | .17 | 1.26 |
| Using accessibility evaluation tools | 1 | 5 | 4.00 | 5.00 | 3.39 | .19 | 1.52 |

Note. To report accurate means of frequencies, responses of “Not applicable” and “Don’t know” were omitted from this data. The sample was n = 62, except for the item marked with an asterisk (n = 61).

Limitations in Implementing Accessibility Practices

Course designers reported that time (M = 3.28, SD = 1.51) was the most limiting factor in implementing accessibility. While many respondents reported that accessibility knowledge (n = 24, 38.7%), access to tools and software (n = 21, 33.9%), and budgetary resources (n = 21, 33.9%) were almost never a limiting factor, a similar number of respondents found they were limited often or almost always (n = 19, 30.7%; n = 21, 33.9%; n = 18, 29.1%).

Higher Education Institution Characteristics: Research Question 3

The third research question addressed the characteristics of higher education institutions that offer online courses. The applicable survey questions related to the type of institution and student enrollment. Participants provided this information through multiple-choice responses. Most higher education institutions were classified as 4-year (77.4%), followed by 2-year and technical or trade institutions, respectively. Most institutions were described as public (83.9%), followed by private non-profit. Enrollment was classified in ranges, with most institutions reporting more than 10,000 students (63%).

Institutional Accessibility Practices: Research Question 4

The fourth research question addressed the accessibility practices of the higher education institutions that employed the designers who participated in this survey. The applicable survey questions related to institutional online course programming in the following areas: general course practices; responsibility for creating, building, or selecting instructional content; responsibility for reviewing courses for accessibility compliance; and training and support for the development of accessible courses or content.

General Online Course Practices

Institutions often offered online courses as a learning option (M = 4.07, SD = 1.31). Institutional systems or policies were often used to ensure the accessibility of online courses (M = 3.84, SD = 1.21), and courses often included disability statements or policies in course syllabi or materials (M = 4.17, SD = 1.55). However, reviews of online courses for accessibility were only conducted less than half the time (M = 2.80, SD = 1.49). Descriptive statistics are shown in Table 4.

Table 4

Mean Frequency of General Institutional Online Course Accessibility Practices

| Online Course Accessibility Practice | n | Min | Max | Mdn | Mode | M | SE | SD |
|---|---|---|---|---|---|---|---|---|
| Offering of online courses* | 59 | 1 | 5 | 5.00 | 5.00 | 4.07 | .17 | 1.31 |
| Using systems or policies | 57 | 1 | 5 | 4.00 | 5.00 | 3.84 | .17 | 1.29 |
| Requiring disability statements or policies | 60 | 1 | 5 | 5.00 | 5.00 | 4.17 | .20 | 1.55 |
| Reviewing courses | 50 | 1 | 5 | 3.00 | 1.00 | 2.80 | .21 | 1.49 |

Note. To report accurate means of frequencies, responses of “Not applicable” and “Don’t know” were omitted from this data. As a result, n has been reported to account for these omissions. *Online courses refer to courses specifically designed to be delivered online (in whole or part) and not as an emergency response to the COVID-19 pandemic.

Institutional Responsibility for Online Instructional Content

Responses in this section related to how often specific staff or departments were responsible for creating, building, or selecting content for online courses. Overall, institutions usually assigned this responsibility to faculty or instructors (M = 4.36, SD = 0.97). Instructional technologists or designers (M = 2.74, SD = 1.33) and designated online course builders (M = 2.41, SD = 1.50) were responsible for online course content less than half the time. Further, administrators or leaders (M = 1.91, SD = 1.24) and production staff (M = 1.95, SD = 1.27) were the least likely to be responsible for online course content. Descriptive statistics are shown in Table 5.

Table 5

Mean Frequency of Designated Responsibility for Online Instructional Content

| Responsible Staff or Department | n | Min | Max | Mdn | Mode | M | SE | SD |
|---|---|---|---|---|---|---|---|---|
| Faculty or instructor | 61 | 1 | 5 | 5.00 | 5.00 | 4.36 | .12 | 0.97 |
| Instructional technologist or designer | 53 | 1 | 5 | 3.00 | 2.00 | 2.74 | .18 | 1.33 |
| Administrator or leader | 53 | 1 | 5 | 1.00 | 1.00 | 1.91 | .17 | 1.24 |
| Production staff | 43 | 1 | 5 | 1.00 | 1.00 | 1.95 | .19 | 1.27 |
| Designated online course builders | 46 | 1 | 5 | 2.00 | 1.00 | 2.41 | .22 | 1.50 |

Note. To report accurate means of frequencies, responses of “Not applicable” and “Don’t know” were omitted from this data. As a result, n has been reported to account for these omissions. *Online courses refer to courses specifically designed to be delivered online (in whole or part) and not as an emergency response to the COVID-19 pandemic.

Institutional Responsibility for Course Accessibility Review

Responses in this section related to how often specific staff or departments were responsible for reviewing online courses for compliance with accessibility guidelines. Overall, institutions usually assigned this responsibility to individuals more than half the time (M = 3.69, SD = 1.48).

Accessibility Training and Support Offered by the Institution

Overall, respondents indicated that institutions did not require the completion of training to develop or deliver an online course. Responses in this section related to how often specific types of accessibility training or support were offered to respondents in the previous term. Overall, institutions most frequently provided support or assistance (M = 4.02, SD = 1.30) and online resources (M = 3.88, SD = 1.39) to promote online course accessibility. External courses or workshops (M = 2.07, SD = 1.37) were the least frequently provided. Descriptive statistics are shown in Table 6.

Table 6

Mean Frequency of Institutional Accessibility Support and Training

| Accessibility Support or Training | n | Min | Max | Mdn | Mode | M | SE | SD |
|---|---|---|---|---|---|---|---|---|
| Mentoring program | 51 | 1 | 5 | 2.00 | 2.00 | 2.57 | .19 | 1.38 |
| Internal course or workshop | 61 | 1 | 5 | 4.00 | 5.00 | 3.61 | .19 | 1.49 |
| External course or workshop | 45 | 1 | 5 | 2.00 | 1.00 | 2.07 | .20 | 1.37 |
| Online resources | 60 | 1 | 5 | 5.00 | 5.00 | 3.88 | .18 | 1.39 |
| Support or assistance* | 58 | 1 | 5 | 5.00 | 5.00 | 4.02 | .17 | 1.30 |

Note. To report accurate means of frequencies, responses of “Not applicable” and “Don’t know” were omitted from this data. As a result, n has been reported to account for these omissions. *Specifically for the development of accessible online courses or content.

Associations Amongst Surveyed Items: Research Question 5

Research question 5 addressed the associations, or factor structures, amongst the surveyed characteristics and accessibility practices. This analysis included two subquestions that examined the extent to which institutions’ and designers’ characteristics and practices contributed to the identified factor structures. The first factor structure retained nine variables, and the second retained five, based on rotated factor loadings above an absolute value of .5. Table 7 shows the factor loadings and communalities after rotation.

Table 7

Summary of Factor Loadings Based on 24 Likert-Scale Items (N = 62)

| Variables | Factor 1 | Factor 2 | h² |
|---|---|---|---|
| Institution Distributes Responsibility and Provides Support | | | |
| Institutional office or department responsible for reviewing online courses for accessibility* | .711 | | .50 |
| Online course builders responsible for online course development* | .626 | | .47 |
| Institution reviews online courses for accessibility* | .608 | | .39 |
| Internal course or workshop provided as training support* | .577 | | .51 |
| Instructional technologist or designer responsible for online course development* | .536 | | .42 |
| Administrator or leader responsible for online course development* | .517 | | .36 |
| Online resources provided as training support* | .514 | | .40 |
| Mentoring program provided as training support* | .513 | | .35 |
| Individual academic departments, schools, or colleges review online courses for accessibility* | .505 | | .25 |
| Accessibility Compliance Supported by Resources | | | |
| Designers used tables that contained headers** | | .673 | .45 |
| Designers used documents with proper text formatting styles** | | .664 | .44 |
| Access to tools or software was considered a limitation* | | -.581 | .37 |
| Courses complied with accessibility guidelines** | | .560 | .38 |
| Budgets were considered a limitation* | | -.517 | .29 |
| Eigenvalue | 4.66 | 3.21 | |
| % of variance | 19.42 | 13.39 | |

Note. h2 = communality for rotated factors. Loadings < .5 suppressed. Factors converged after three iterations. Extraction used principal axis factoring with a Promax rotation. The absolute value of factor loadings is used to determine inclusion and negative loadings are interpreted in the opposite direction (Asnawi et al., 2012). *Institutional practices **Course designer practices

Thematic Associations for Identified Factors

The variables clustered on factor 1 suggest a thematic association in which accessibility is impacted by the institution’s distribution of responsibility for creating and reviewing accessible online course content while also providing training and support. Therefore, factor 1 has been designated Institutional Accessibility Support. The variables loaded onto factor 2 suggest an association between online courses complying with accessibility guidelines and sufficient resources. As a result, factor 2 has been designated Accessibility Compliance Support (see Figure 5).

Figure 5

Distribution of Designer and Institutional Variables for Each Factor Structure



Discussion

The goal of this study was to provide an updated view of overall accessibility practices and perceptions within higher education online course design and explore the relationships between them.

Designer Characteristics: Research Question 1

The convenience sample of this study consisted of designers who had experience in supporting online programs. Most respondents indicated that their primary role was faculty, instructor, or instructional designer. These results indicated that participants in this sample were more likely to be familiar with accessibility and institutional practices regarding online courses than course designers with less experience. The results of this study support previous research that indicated that faculty and instructional designers are typically responsible for online course accessibility (Green, 2010, 2019; OLC & WCET, 2019).

Designers’ Accessibility Practices: Research Question 2

Course designers overwhelmingly reported using some type of multimedia in their online courses, which highlights the relevance of accessibility. Since many course designers have experience supporting and teaching online courses, it is reasonable to expect them to be familiar with basic accessibility practices, and respondents did report employing fundamental accessible practices related to text formatting, images, and videos. However, it is concerning that other accessible practices, such as including transcripts with videos, complying with overall accessibility guidelines, and using accessibility evaluation tools, were practiced only about half the time. This implies that designers are familiar with accessible practices but tend to use the easier-to-implement ones with greater frequency.

Compared to the findings of the 2016 Huss and Eastep survey, these results indicate that there has been growth in the use of multimedia and accessible practices. The Huss and Eastep (2016) survey indicated that most participants were not or did not know whether they were using accessible media practices. This study indicates that most respondents knew about accessibility practices and that they implemented them with far greater frequency, which means a fundamental awareness of accessible practices has likely been established.

Course designers reported that time was the most limiting factor in implementing accessibility, but only about half the time. Respondents appeared to be split regarding the perceived impact of the other potential limitations, which included accessibility knowledge, access to tools and software, and budgetary resources. Many respondents reported that these limitations rarely affected them, while a similar number felt that their accessibility practices were often or always impacted by them. The findings of this study suggest that these common barriers to accessibility implementation are being reduced, which is a marked change from earlier studies, in which costs, resources, and time were often cited as barriers to accessibility implementation (Frey & King, 2011; Galusha, 1998; Linder et al., 2015; Rowland et al., 2014).

Higher Education Institution Characteristics: Research Question 3

Most participants in this study worked for public, four-year higher education institutions with enrollments of more than 10,000 students. This general pattern is similar to, but not fully consistent with, patterns reported in other recent accessibility surveys (Mancilla & Frey, 2020, 2021a, 2021b). The small sample size of this study, which was conducted during the height of COVID-19, may have affected institutional representation.

Institutional Accessibility Practices: Research Question 4

The results indicate a growing role of the institution in accessibility practices through policy, training, and support. Now, online courses are frequently offered as a learning option, and institutions often use systems or policies to ensure course accessibility. Accessibility training in the form of online resources, internal courses, and internal workshops was frequently provided to designers. In addition, institutions frequently provide support or assistance in developing accessible online courses. However, there is still opportunity for improvement.

Overall, institutions usually assign individuals the responsibility to conduct accessibility course reviews and to design accessible course content. Since instructional content is typically the purview of the instructor, it is not surprising that instructors are responsible for creating, building, or selecting online content. While instructional technologists or designers and designated online course builders were reported to be involved less than half the time, it is encouraging that faculty may have access to additional personnel when building online courses. Moreover, this question solicited the most “Don’t know” answers from respondents, which indicates that university employees may not be aware of institutional practices or support in this area.

Associations Amongst Surveyed Items: Research Question 5

The distribution of institutional and designer practices contributing to the factor solutions is shown in Figure 6. As depicted, institutional practices contribute heavily to the variables associated with accessibility compliance. However, the results also indicated an association between course designers who engage in accessible course design and access to tools, software, or budgetary resources. These findings imply that institutions can positively influence the accessibility practices of individual course designers. Further, this suggests that while designers contribute to the accessibility of individual courses, this accessibility may not always extend to all courses across an institution.

Figure 6

Institutional Versus Designer Practices Contributing to the Factor Solutions



Note. 1Variables contributing to Factor 1: Institutional Accessibility Support. 2Variables contributing to Factor 2: Accessibility Compliance Support.

Accessibility policies have previously signaled an increase in some accessible practices (Thompson et al., 2013). However, no known studies have explored how a variety of institutional and designer practices may combine to affect the accessibility of online courses. This analysis indicates that institutional practices may have a major role in accessible course design. The results of this study suggest that there may be a link between institutions that offer training, support, and resources and designers who develop more accessible online courses. These findings further highlight the interplay between institutional and individual practices. While institutional factors are key to creating a culture that values and prioritizes accessibility, individuals still need to be empowered and supported to develop accessible courses and content.

Implications for Practice

This is the first known study to attempt to determine whether variables measuring accessibility practices are statistically associated in some way. Combined with previous findings, the results of this study identify specific areas in which institutions should consider increasing leadership, collaboration, and resources. As this study was able to show associations between key practices, there are some specific recommendations for how institutions can improve or extend their accessibility efforts. Institutions are encouraged to audit their accessibility practices and to map how accessibility responsibilities are distributed across the organization in order to identify potential areas of action.

Support for These Findings

Many studies have reported on the frequency of institutional and course designer practices regarding accessibility. The findings of this study support those of researchers who have previously measured such practices (Frey & King, 2011; Huss & Eastep, 2016; Mancilla & Frey, 2020, 2021a, 2021b; OLC & WCET, 2019). This study supports the existence of a positive trend in awareness about accessibility and the increased use of accessible design in online courses. Further, the corroboration of previous work in this area supports the validity of the results despite the small sample size.

Recommendations for Future Research

Due to the exploratory nature of this study, a follow-up confirmatory study with a larger sample size is recommended. With additional data and a larger sample, further analyses are also recommended. Specifically, future research should consider the relationship between the experience level of course designers and their accessibility practices, as well as the relationship between institutional characteristics and course designer practices. The impact of accessibility tools should also be considered, as these are a fairly new resource for institutions and designers. Future studies may also explore why some course designers experience more barriers to implementing accessible design than others.

Summary

When considering equity and social justice in instructional design, reducing accessibility barriers for learners in online distance education environments is critical. Previous research identified various institutional and individual practices that contributed to inaccessibility, but none addressed how these factors were connected and how they contributed to designing accessible online instruction. From a systems perspective, understanding these relationships is critical to enacting systemic and sustainable changes that can improve accessibility practices in online education. This study identified several institutional- and designer-level practices that can be implemented to promote systematic and effective accessible instructional design practices.

References

Andrews, E. E., Forber-Pratt, A. J., Mona, L. R., Lund, E. M., Pilarski, C. R., & Balter, R. (2019). #SaytheWord: A disability culture commentary on the erasure of “disability.” Rehabilitation Psychology, 64(2), 111–118. https://doi.org/10.1037/rep0000258

Ascough, R. S. (2002). Designing for online distance education: Putting pedagogy before technology. Teaching Theology and Religion, 5(1), 17–29. https://doi.org/10.1111/1467-9647.00114

Asnawi, A. L., Gravell, A. M., & Wills, G. B. (2012). Factor analysis: Investigating important aspects for agile adoption in Malaysia. In A. L. Asnawi, A. M. Gravell, & G. B. Wills (Eds.), AGILEINDIA ’12: Proceedings of the 2012 Agile India (pp. 60–63). IEEE Computer Society. https://doi.org/10.1109/AgileIndia.2012.13

Bastos, J. L., Duquia, R. P., González-Chica, D. A., Mesa, J. M., & Bonamigo, R. R. (2014). Field work I: Selecting the instrument for data collection. Anais Brasileiros de Dermatologia, 89(6), 918–923. https://doi.org/10.1590/abd1806-4841.20143884

Black, R. D., Weinberg, L. A., & Brodwin, M. G. (2015). Universal Design for Learning and instruction: Perspectives of students with disabilities in higher education. Exceptionality Education International, 25(2), 1–26.

Bogart, K. R., & Dunn, D. S. (2019). Ableism special issue introduction. Journal of Social Issues, 75(3), 650–664. https://doi.org/10.1111/josi.12354

Burke, D. D., Clapper, D., & McRae, D. (2016). Accessible online instruction for students with disabilities: Federal imperatives and the challenge of compliance. Journal of Law & Education, 44(2), 135–181.

Callegaro, M., Manfreda, K. L., & Vehovar, V. (2015). Web survey methodology. SAGE. https://www.google.com/books/edition/Web_Survey_Methodology/A_0aCAAAQBAJ?hl=en&gbpv=0

Carlsen, A., Holmberg, C., Neghina, C., & Owusu-Boampong, A. (2016). Closing the gap: Opportunities for distance education to benefit adult learners in higher education. UNESCO Institute for Lifelong Learning. https://unesdoc.unesco.org/ark:/48223/pf0000243264

Carnevale, D. (2005, August 12). Lawsuit charges online university does not accommodate learning-disabled students. The Chronicle of Higher Education. https://www.chronicle.com/article/lawsuit-charges-online-university-does-not-accommodate-learning-disabled-students/?cid=gen_sign_in

Catalano, A. (2014). Improving distance education for students with special needs: A qualitative study of students’ experiences with an online library research course. Journal of Library & Information Services in Distance Learning, 8(1–2), 17–31. https://doi.org/10.1080/1533290X.2014.902416

Chyung, S. Y. Y., Roberts, K., Swanson, I., & Hankinson, A. (2017). Evidence-based survey design: The use of a midpoint on the Likert scale. Performance Improvement, 56(10), 15–23. https://doi.org/10.1002/pfi.21727

Chyung, S. Y. Y., Swanson, I., Roberts, K., & Hankinson, A. (2018). Evidence-based survey design: The use of continuous rating scales in surveys. Performance Improvement, 57(5), 38–48. https://doi.org/10.1002/pfi.21763

Cohen, L., Manion, L., & Morrison, K. (2011). Research methods in education (7th ed.). Routledge.

Cole, E. V., & Cawthon, S. W. (2015). Self-disclosure decisions of university students with learning disabilities. Journal of Postsecondary Education and Disability, 28(2), 163–179. http://ahead.org/publications/jped/vol_28/no2tc

Costello, A. B., & Osborne, J. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research, and Evaluation, 10(1), Article 7. https://doi.org/10.7275/jyj1-4868

Cullipher, V. (2017). Schools and organizations that serve education: Beware OCR complaints addressing online accessibility. Microassist. www.microassist.com/wp-content/uploads/2017/11/201711MLY-OCRandEducation-FINAL-acc.pdf

Culp, K. M., Honey, M., & Mandinach, E. (2005). A retrospective on twenty years of education technology policy. Journal of Educational Computing Research, 32(3), 279–307. https://doi.org/10.2190/7W71-QVT2-PAP2-UDX7

de Winter, J. C. F., Dodou, D., & Wieringa, P. A. (2009). Exploratory factor analysis with small sample sizes. Multivariate Behavioral Research, 44, 147–181. https://doi.org/10.1080/00273170902794206

Field, A. (2013). Discovering statistics using IBM SPSS Statistics (4th ed.). SAGE.

Fowler, F. J. (2002). Survey research methods (3rd ed.). SAGE.

Frey, B., & King, D. K. (2011). Quality Matters™ accessibility survey: Institutional practices and policies for online courses (ED520903). ERIC. https://files.eric.ed.gov/fulltext/ED520903.pdf

Friedman, H. H., & Amoo, T. (1999). Rating the rating scale. Journal of Marketing Management, 9(3), 114–123. https://www.rangevoting.org/RateRatingScales.html

Galusha, J. M. (1998). Barriers to learning in distance education. (ED416377). ERIC. https://files.eric.ed.gov/fulltext/ED416377.pdf

Green, K. C. (2010, November 13). 2010 managing online education survey. The Campus Computing Project. https://www.campuscomputing.net/content/2010/11/13/2010-managing-online-education-survey

Green, K. C. (2019). 2019 campus computing: The 30th national survey of computing and information technology in American higher education. https://static1.squarespace.com/static/5757372f8a65e295305044dc/t/5da60e02c69e0005bf93690e/1571163656824/Campus+Computing+-+2019+Report.pdf

Grimes, S., Scevak, J., Southgate, E., & Buchanan, R. (2017). Non-disclosing students with disabilities or learning challenges: characteristics and size of a hidden population. Australian Educational Researcher, 44(4–5), 425–441. https://doi.org/10.1007/s13384-017-0242-y

Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). Survey methodology (2nd ed.). John Wiley & Sons.

Guadagnoli, E., & Velicer, W. F. (1988). Relation of sample size to the stability of component patterns. Psychological Bulletin, 103(2), 265–275. https://doi.org/10.1037/0033-2909.103.2.265

Hoyle, R., & Duvall, J. (2011). Determining the number of factors in exploratory and confirmatory factor analysis. In D. Kaplan (Ed.), The SAGE handbook of quantitative methodology for the social sciences (pp. 302–317). SAGE. https://doi.org/10.4135/9781412986311.n16

Huss, J., & Eastep, S. (2016). Okay, our courses are online, but are they ADA compliant? An investigation of faculty awareness of accessibility at a Midwestern University. I.E.: Inquiry in Education, 8(2), 1–22. https://digitalcommons.nl.edu/ie/vol8/iss2/2/

Hyman, L., Lamb, J., & Bulmer, M. (2006). The use of pre-existing survey questions: Implications for data quality. Proceedings of the European Conference on Quality in Survey Statistics, 1–8. https://ec.europa.eu/eurostat/documents/64157/4374310/22-Use-of-pre-existing-survey-questions-implications-for-data-quality-2006.pdf/e953a39e-50be-40b3-910f-6c0d83f55ed4

Iglesias, A., Moreno, L., Martinez, P., & Calvo, R. (2014). Evaluating the accessibility of three open-source learning content management systems: A comparative study. Computer Applications in Engineering Education, 22(2), 320–327. https://doi.org/10.1002/cae.20557

Kaiser, H. F. (1958). The varimax criterion for analytic rotation in factor analysis. Psychometrika, 23(3), 187–200. http://128.174.199.77/psychometrika_highly_cited_articles/kaiser_1958.pdf

Kimberlin, C. L., & Winterstein, A. G. (2008). Validity and reliability of measurement instruments used in research. American Journal of Health-System Pharmacy, 65(23), 2276–2284. https://doi.org/10.2146/ajhp070364

Kingman, A. (2018, May 21). A brief history of WCAG. Last Call Media. https://lastcallmedia.com/blog/brief-history-wcag

Knupfer, N. N., & McLellan, H. (2001). Descriptive research methodologies. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (1st ed., pp. 1196–1212). Lawrence Earlbaum. https://members.aect.org/edtech/ed1/pdf/41.pdf

Krosnick, J. A., & Berent, M. K. (1993). Comparisons of party identification and policy preferences: The impact of question format. American Journal of Political Science, 37(3), 941–964. https://doi.org/10.2307/2111580

Kuykendall, H. (2017, February 22). Section 508 and WCAG: How updated federal accessibility standards map to WCAG 2.0. Microassist Digital Accessibility Digest. www.microassist.com/digital-accessibility/section-508-and-wcag

Linder, K. E., Fontaine-Rainen, D. L., & Behling, K. (2015). Whose job is it? Key challenges and future directions for online accessibility in US institutions of higher education. Open Learning, 30(1), 21–34. https://doi.org/10.1080/02680513.2015.1007859

Lindsay, S., Cagliostro, E., & Carafa, G. (2018). A systematic review of barriers and facilitators of disability disclosure and accommodations for youth in post-secondary education. International Journal of Disability, Development and Education, 65(5), 526–556. https://doi.org/10.1080/1034912X.2018.1430352

Mancilla, R., & Frey, B. (2020). Administrative supports for digital accessibility: Policies and processes. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/QM-Digital-Accessibility-Policy-Process-WP.pdf

Mancilla, R., & Frey, B. (2021a). Course design for digital accessibility: Best practices and tools. https://www.qualitymatters.org//sites/default/files/research-docs-pdfs/QM-Digital-Accessibility-Best-Practices-Tools-WP.pdf

Mancilla, R., & Frey, B. (2021b). Professional development for digital accessibility: A needs assessment. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/QM-Digital-Accessibility-Professional-Development-WP.pdf

McAfee & Taft. (2019, March 1). Municipalities and universities new targets in ADA website accessibility lawsuits. McAfee & Taft EmployerLINC. https://www.mcafeetaft.com/municipalities-and-universities-new-targets-in-ada-website-accessibility-lawsuits/

Meadows, D. H. (2008). Thinking in systems: A primer. Chelsea Green Publishing.

Menold, N., & Bogner, K. (2016). Design of rating scales in questionnaires: GESIS survey guidelines. GESIS - Leibniz Institute for the Social Sciences, 1–13. https://doi.org/10.15465/gesis-sg_en_015

Moore, M. G., & Kearsley, G. (2012). Distance education: A systems view of online learning (3rd ed.). Wadsworth, Cengage Learning.

Nardi, P. M. (2014). Doing survey research: A guide to quantitative methods (3rd ed.). Paradigm Publishers.

Online Learning Consortium, & WICHE Cooperative for Educational Telecommunications. (2019). Accessibility survey of OLC and WCET members. https://wcet.wiche.edu/initiatives/research/accessibility-survey-olc-wcet-2019

Ornstein, M. (2013). A companion to survey research. SAGE.

Osborne, J. W., Costello, A. B., & Kellow, J. T. (2008). Best practices in exploratory factor analysis. In J. W. Osborne (Ed.), Best practices in quantitative methods (pp. 86–99). SAGE.

Quinlan, M. M., Bates, B. R., & Angell, M. E. (2012). “What can I do to help?”: Postsecondary students with learning disabilities’ perceptions of instructors’ classroom accommodations. Journal of Research in Special Educational Needs, 12(4), 224–233. https://doi.org/10.1111/j.1471-3802.2011.01225.x

Reigeluth, C. M. (2019). Chaos theory and the sciences of complexity: Foundations for transforming educational systems. In M. J. Spector, B. B. Lockee, & M. D. Childress (Eds.), Learning, design, and technology: An international compendium of theory, research, practice, and policy (pp. 1–12). https://doi.org/10.1007/978-3-319-17727-4_95-1

Reynolds, M., & Holwell, S. (2010). Introducing systems approaches. In M. Reynolds & S. Holwell (Eds.), Systems approaches to managing change: A practical guide (pp. 1–24). Springer. https://doi.org/10.1007/978-1-84882-809-4

Roberts, J. B., Crittenden, L. A., & Crittenden, J. C. (2011). Students with disabilities and online learning: A cross-institutional study of perceived satisfaction with accessibility compliance and services. Internet and Higher Education, 14(4), 242–250. https://doi.org/10.1016/j.iheduc.2011.05.004

Robinson, R. (2008). Using distance and ICT to improve access, equity and quality in rural teachers’ professional development in Western China. International Review of Research in Open and Distance Learning, 9(1).

Roig-Vila, R., Ferrández, S., & Ferri-Miralles, I. (2014). Assessment of web content accessibility levels in Spanish official online education environments. International Education Studies, 7(6), 31–45. https://doi.org/10.5539/ies.v7n6p31

Rowland, C., Goetze, L., & Whiting, J. (2014). GOALS cost case study: Costs of web accessibility in higher education. National Center on Disability and Access to Education. http://www.ncdae.org/documents/GOALS_Cost_Case_Study.pdf

Sarrett, J. C. (2018). Autism and accommodations in higher education: Insights from the autism community. Journal of Autism and Developmental Disorders, 48(3), 679–693. https://doi.org/10.1007/s10803-017-3353-4

Scagnoli, N. I. (2001). Student orientations for online programs. Journal of Research on Technology in Education, 34(1), 19–27. https://doi.org/10.1080/15391523.2001.10782330

Schwarz, N., Knuauper, B., Hippler, H.-J., Noelle-Neumann, E., & Clark, L. (1991). Rating scales: Numeric values may change the meaning of scale labels. Public Opinion Quarterly, 55, 570–582. https://doi.org/10.1086/269282

Seaman, J. E., Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in the United States. Babson Survey Research Group. http://onlinelearningsurvey.com/reports/gradeincrease.pdf

Siebers, T. (2013). Disability and the theory of complex embodiment: For identity politics in a new register. In L. J. Davis (Ed.), The disability studies reader (5th ed., pp. 313–332). Taylor & Francis. https://www.google.com/books/edition/The_Disability_Studies_Reader/aiQlDwAAQBAJ?q=medical+model&gbpv=1#f=false

Straub, D., Boudreau, M.-C., & Gefen, D. (2004). Validation guidelines for IS positivist research. Communications of the Association for Information Systems, 13(1), 24. https://doi.org/10.17705/1CAIS.01324

Stroh, D. P. (2015). Systems thinking for social change: A practical guide to solving complex problems, avoiding unintended consequences, and achieving lasting results. Chelsea Green Publishing.

Taherdoost, H. (2016). Validity and reliability of the research instrument: How to test the validation of a questionnaire/survey in a research. International Journal of Academic Research in Management, 5(3), 28–36. https://doi.org/10.2139/ssrn.3205040

Tamim, S. R. (2020). Analyzing the complexities of online education systems: A systems thinking perspective. TechTrends, 64(5), 740–750. https://doi.org/10.1007/s11528-020-00538-9

Thompson-Ebanks, V., & Jarman, M. (2018). Undergraduate students with nonapparent disabilities identify factors that contribute to disclosure decisions. International Journal of Disability, Development and Education, 65(3), 286–303. https://doi.org/10.1080/1034912X.2017.1380174

Thompson, T., Comden, D., Ferguson, S., Burgstahler, S., & Moore, E. J. (2013). Seeking predictors of web accessibility in U.S. higher education institutions. Information Technology and Disabilities Journal, 13(1). http://itd.athenpro.org/volume13/number1/thompson.html

Toutain, C. (2019). Barriers to accommodations for students with disabilities in higher education: A literature review. Journal of Postsecondary Education and Disability, 32(3), 297–310.

Trilling, B., & Hood, P. (1999). Learning, technology, and education reform in the Knowledge Age or “We’re wired, webbed and windowed, now what?” Educational Technology Publications, 39(3), 5–18. https://www.jstor.org/stable/pdf/44428527.pdf?refreqid=excelsior%3A4a4e727e05609543326cc7b5504a3c5c

U.S. General Services Administration. (2017, November 10). Accessibility news: The Section 508 update. Section508.Gov. https://www.section508.gov/blog/accessibility-news-the-section-508-Update

U.S. Government Accountability Office. (2009). Higher education and disability: Education needs a coordinated approach to improve its assistance to schools in supporting students. https://www.gao.gov/new.items/d1033.pdf

Watkins, M. W. (2018). Exploratory factor analysis: A guide to best practice. Journal of Black Psychology, 44(3), 219–246. https://doi.org/10.1177/0095798418771807

Web Accessibility Initiative, & World Wide Web Consortium. (2019, June 5). Introduction to web accessibility. https://www.w3.org/WAI/fundamentals/accessibility-intro/

Whitley, B. E. (2002). Principles of research in behavioral science (2nd ed.). McGraw Hill.

Rita Fennelly-Atkinson

Sam Houston State University

Kimberly N. LaPrairie

Sam Houston State University
