A Study of Instructional Design Master's Programs and Their Responsiveness to Evolving Professional Expectations and Employer Demand

DOI: 10.51869/102/igl
Keywords: Instructional Design, Competencies, Employer Demand
As world and job market trends continue to evolve, so does our definition of the instructional design field. Recent definitions of the field, professional standards, and employer demand suggest that beyond designing and developing instructional solutions, other competency domains, from needs assessment to evaluation, are essential for improving learning and performance in a variety of educational and workplace settings. This study sought to elucidate the competencies that instructional design master’s programs consider essential, as illustrated by their core curricula. Findings suggest that while there is strong alignment in some competency areas, there is insufficient evidence of alignment in others.

Evolving Definition of Instructional Design

As the world and job market continue to evolve, the instructional design field has undergone its share of transitions. One indication of its evolution has been the various adaptations of its definition over the years, which have underscored a shift from instructional media (Ely, 1963) to a focus on solving instructional problems in educational settings (Cassidy, 1982; Gentry, 1995; Silber, 1970). While some of these definitions referred to some variation of “systemic” and/or “systematic,” the focus was typically on instructional systems and in some cases (e.g., AECT, 1994) included stages of a systematic process such as design, development, utilization, management, and evaluation of instruction. In his definition, Silber (1970) offered a more concrete system orientation, suggesting elements such as research followed by design, production, evaluation, support-supply, and utilization, and also including the management of system components such as messages, people, materials, devices, techniques, and settings. Over the years, others have written on the essential expansion of our professional lens to the whole performance system and its outcomes (Clark & Estes, 2008; Guerra-López, 2008; Guerra-López & Hicks, 2017; Kaufman, 1996, 2019; Rummler, 2007). More recently, Reiser and Dempsey (2018) have offered the following definition:

The field of instructional design and technology (also known as instructional technology) encompasses the analysis of learning and performance problems, and the design, development, implementation, evaluation and management of instructional and non-instructional processes and resources intended to improve learning and performance in a variety of settings, particularly educational institutions and the workplace. (p. 4)

This recent definition recognizes that the scope of instructional design professionals extends beyond designing and developing instructional solutions, and that they are expected to incorporate a variety of processes, from diagnosis to evaluation, in order to measurably improve learning and performance across a variety of settings.

The Evolving Role of Instructional Designer

The interdisciplinary roots of instructional design have contributed to the diversity of our epistemologies and allowed us to draw from a variety of theories, empirical evidence, and models. They have also contributed to professionals in the field working across a wide range of job titles, roles, and responsibilities. Instructional design professionals work under a variety of job titles including instructional designer or technologist, curriculum developer, training manager, learning and development specialist, and performance improvement consultant, to name a few. Klein and Kelly (2018) noted the difficulty in identifying professionals in the field due to its changing nature; in their review of 393 job announcements for instructional design jobs, they found 35 different job titles across 28 industries. Evidence of this wide array of job titles can also be found in earlier studies: Furst-Bowe (1996) surveyed 147 instructional design practitioners and found that 40 different job titles were used to describe those professionals. Adding another dimension of variability, Larson and Lockee (2008) observed that instructional design professionals work in a range of work environments, such as corporate, higher education, K-12, government, military, non-profit, and healthcare. Work settings, job roles, and varied functions all contribute to the wide range of job titles used for instructional design professionals (Liu et al., 2002).

The diverse and multifaceted nature of work performed by instructional designers across job roles, contexts, areas of specialization, and career trajectories represents a particular challenge for university programs, which are trusted to effectively prepare instructional design graduates to meet evolving professional expectations and employer needs. Thus, this mixed-methods study seeks to elucidate the foundational competencies prioritized by instructional design master’s programs. While instructional design programs vary in the range of core and elective courses they offer to prepare students for the workforce, for the purposes of this study core courses are treated as an indication of the competencies a program considers essential, and electives as optional competencies. This study therefore focuses on an analysis of core courses.

Professional Expectations and Standards

A professional organization supports a particular profession and works to advance the quality, skill level, and interests of professionals working in that specific field. A variety of professional organizations have used professional standards as a framework for guiding professional practice and judging the competence of professionals across key areas. The International Society for Performance Improvement (ISPI, 2016) has proposed the following ten professional standards: (a) focusing on results or outcomes, (b) taking a systemic view, (c) adding value, (d) working in partnership with clients and stakeholders, (e) determining the need or opportunity, (f) determining the cause, (g) designing the solution (including implementation and evaluation), (h) ensuring solutions’ conformity and feasibility, (i) implementing solutions, and (j) evaluating the results and impact.

Others have sought to propose professional competencies as a way to characterize the work of instructional designers. Competencies can be defined as a set of observable knowledge, skills, attitudes, and behaviors (Gupta, Sleezer, & Russ-Eft, 2007; McLagan, 1996) that enable one to effectively perform the activities of a given occupation or function to the standards expected in employment (Koszalka, Russ-Eft, & Reiser, 2013; Richey, Fields, & Foxon, 2001). Similarly, the International Board of Standards for Training, Performance and Instruction (IBSTPI, 2016) defines a competency as “a knowledge, skill, or attitude that enables one to effectively perform the activities of a given occupation or function to the standards expected in employment.” The identification of key competencies is particularly useful for educational institutions because competencies provide guidelines for curriculum development, program revision, and accreditation (Byun, 2000; Klein & Jun, 2014).

In 2012, IBSTPI described the work of instructional designers through 22 competencies categorized into the following domains: professional foundations, planning and analysis, design and development, evaluation and implementation, and management. Meanwhile, the Association for Talent Development (ATD) has defined a set of foundational competencies that include business skills, global mindset, industry knowledge, interpersonal skills, personal skills, and technology literacy. Areas of technical expertise include instructional design, learning technologies, performance improvement, evaluation, training delivery, managing learning programs, coaching, knowledge management, and change management (ATD, 2016). Additionally, the Canadian Society for Training and Development (CSTD, 2011) has proposed five competency areas: assessing performance needs, designing training, facilitating training, supporting the transfer of learning, and evaluating training.

Finally, while the above competency frameworks have focused on the skills of instructional design professionals and closely related roles, in 2012 the Association for Educational Communications and Technology (AECT) adopted five key standards directed at education programs, providing a framework by which programs could determine student competence across five key dimensions: (a) Content knowledge, where students demonstrate the knowledge necessary to create, use, assess, and manage theoretical and practical applications of educational technologies and processes; (b) Content pedagogy, where students develop as reflective practitioners able to demonstrate effective implementation of educational technologies and processes based on contemporary content and pedagogy; (c) Learning environments, where students facilitate learning by creating, using, evaluating, and managing effective learning environments; (d) Professional knowledge and skills, where students design, develop, implement, and evaluate technology-rich learning environments within a supportive community of practice; and (e) Research, where students explore, evaluate, synthesize, and apply methods of inquiry to enhance learning and improve performance.

Prior research has examined the alignment between instructional design program curricula and what professionals, including faculty, consider to be important skills. Fox and Klein (2003) surveyed 24 instructional design faculty members at 11 U.S. universities as well as 45 ISPI and ATD members, asking participants to rate the importance of human performance technology (HPT) competencies for graduates of instructional systems/design/technology programs. The competencies listed in the survey instrument were based on the Handbook of Human Performance Technology (Stolovitch & Keeps, 1999) and HPT courses offered by instructional design programs. Respondents indicated that instructional design graduates should have a broad knowledge of HPT and performance improvement processes, with training needs assessment rated as one of the top competencies. Fox and Klein (2003) also noted that most of the highly rated competencies do not receive extensive coverage in most ID programs. Specifically, they pointed out that many instructional design programs do not include human performance as a core course, and that there are numerous other discrepancies between instructional design program curricula and the competencies their respondents considered important.

Employer Expectations

Prior research has also sought to improve our understanding of employers’ expectations (Byun, 2000; Fox & Klein, 2003; Klein & Fox, 2004; Klein & Jun, 2014; Klein & Kelly, 2018; Larson & Lockee, 2007, 2008; Richey, Fields, & Foxon, 2001; Sugar et al., 2012). Job announcement analysis has been one of the common methods for identifying the competencies expected of professionals in a field (Kang & Ritzhaupt, 2015; Moallem, 1995; Ritzhaupt & Martin, 2013; Ritzhaupt, Martin, & Daniels, 2010; Sugar et al., 2012). An advantage of using job announcements is that they provide first-hand information on which competencies are in demand in the market (Klein & Kelly, 2018), which benefits not only university programs but also human resource consultants and recruiters who want to know what skills employers are seeking (Sodhi & Son, 2009).

Ritzhaupt, Martin, and Daniels (2010) studied 205 job postings and 231 survey responses from instructional design professionals and found that the top four instructional design-related competencies included the ability to create effective instructional products (72%), apply sound instructional design principles (34%), conduct needs assessments (33%), and conduct evaluations (32%). Sugar et al. (2012) analyzed 615 job postings for instructional design and technology positions and identified skills expected of instructional design program graduates. The most sought-after competency areas were knowledge of instructional design and ADDIE, which appeared in more than 90% of the listings. The researchers identified evaluating the effectiveness of training programs and conducting needs analysis as priorities for more than 50% of employers. When the postings were disaggregated into corporate and higher education sectors, the researchers found that while more than 61% of corporate jobs required assessment skills, the figure was far lower for higher education jobs at 33%. E-learning, evaluation, and project management were among the other competencies highlighted by the study.

Carliner et al. (2015) explored the competencies needed by performance consultants. Studying 56 anonymized job descriptions for performance consultant positions in Canada, they used qualitative content analysis techniques to create a profile of the position and compared that profile to the Training and Development Professionals competency model from the CSTD. They observed that project management competencies were in demand by employers but missing from the CSTD model.

In addition to being included in the Competencies for Training and Development Professionals framework, needs assessment appeared as a priority area in 42 of the 56 (75%) job descriptions, making it a core area of responsibility. Design and development of programs and managing client relationships were other competencies prioritized in the job listings. Kang and Ritzhaupt (2015) analyzed 400 job postings for 43 related job titles in the instructional design field and found that along with knowledge of instructional design models and principles (55%), the top competency areas included learning management systems (33.75%) and assessment methods (27.5%).

Most recently, Klein and Kelly (2018) analyzed 393 job announcements and interviewed 20 instructional design managers to identify the instructional designer competencies expected by employers. The required competencies varied by job sector (e.g., corporate, higher education, healthcare, and consulting). Effective collaboration (75%) and application of learning theory and principles (60%) appeared in all sectors. Additionally, ADDIE (67%), e-learning (64%), and needs assessment (53%) were prominent competencies across sectors. From these employer demand studies, a pattern emerges that typically, and not surprisingly, places instructional design, needs assessment, evaluation, and technology-related competencies among the top competency domains.

Methodology

The study sought to answer the question: What foundational competencies are prioritized by instructional design master’s programs? Content analysis techniques were used to systematically review and code existing information available through instructional design program websites (Boettger & Palmer, 2010). Frequency counts were used to estimate the occurrence of particular instances and themes. This study used core courses as an indication of essential or prioritized program competencies because all students are expected to take these courses as part of their preparation; competencies addressed only through electives are optional, and students are not guaranteed to graduate with the competencies developed through those courses. The researchers reviewed core courses offered by 40 instructional design (or comparably named) master’s programs across 34 universities in the United States (U.S.). Given that universities offer programs in educational technology-related fields under different program titles based on their emphasis and course structure (Almaden & Ku, 2017), an instructional design or comparable master’s program was defined as one having some variation of the following titles: “Learning Design and Technology,” “Educational Technology,” “Instructional Design and Technology,” “Instructional Systems Design,” “Curriculum Development and Technology,” “Organizational Performance and Workplace Learning,” etc. This study did not include programs from related fields such as human resource development or organizational development because, while there may be overlap in courses, those programs are rooted in different disciplinary and theoretical perspectives. Programs in our sample were selected from the list of Best Masters in Educational Technology Programs on besteducationdegrees.com and by using online search engines such as Google to search for instructional design programs offered in the U.S.

The researchers reviewed all core courses across the 40 programs (a total of 249 courses) using content published on program websites. In most cases, core courses accounted for half of the total credits required to graduate. For each program, the researchers gathered data about the degree offered, name of the program, total credits, core courses, course requirements, mode of study (online or on campus), and course objectives. Researchers also captured additional requirements for program completion such as a thesis, internship, or portfolio.

The researchers analyzed published program descriptions in order to provide context for the analysis of courses. We noted total credits and their breakdown into core, elective, and specialization courses for each program as part of our study database. We used a grounded theory approach to guide the process of coding initial themes, clustering related themes into subcategories, and finally building researcher consensus through in-depth discussions. We developed a high-level map of the identified priority competency areas in order to facilitate comparisons (Miles, Huberman, & Saldaña, 2014). The research team collectively reviewed and finalized the codes to be used.
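To make the tabulation step concrete, the sketch below illustrates the kind of frequency counting described above. It is a minimal illustration rather than the actual study instrument: the records, field names, and domain codes are hypothetical stand-ins for the study database.

```python
from collections import Counter

# Hypothetical coded records: during content analysis, each core course
# is tagged with a competency-domain code (records shown are illustrative).
coded_courses = [
    {"program": "University A", "course": "Introduction to ID",
     "domain": "Foundations of ID, Design and Development"},
    {"program": "University A", "course": "Evaluation of Learning and Performance",
     "domain": "Research and Evaluation"},
    {"program": "University B", "course": "Emerging Technologies",
     "domain": "Technology"},
    {"program": "University B", "course": "Instructional Needs Assessment",
     "domain": "Needs Assessment"},
]

# Frequency of core courses per competency domain across the sample.
course_counts = Counter(record["domain"] for record in coded_courses)

# Number of programs requiring at least one core course in each domain.
programs_per_domain = {
    domain: len({r["program"] for r in coded_courses if r["domain"] == domain})
    for domain in course_counts
}

for domain, n in course_counts.most_common():
    print(f"{domain}: {n} course(s), {programs_per_domain[domain]} program(s)")
```

Counting both courses per domain and programs per domain mirrors how the findings below report results at both levels.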

Findings

Program Profiles

Various types of master’s degrees are offered. Our sample included 15 Master of Science (MS), 10 Master of Education (M.Ed.), four Master of Arts (MA), three Master of Science in Education (MSEd), two Master of Arts in Education (MAEd), two Master of Educational Technology (MET), two Master of Science in Educational Technology (MSET), one Master of Arts in Educational Technology (MAET), and one Master of Learning Technology (MLT).

While program names varied, they included terms such as instructional, technology, design, systems, learning, and educational. As Figure 1 illustrates, “technology/technologies” was the most commonly used term in program names (appearing in 33 of 40). “Instructional,” “design,” and “learning” were also frequently used terms, followed by “systems” and “education.” Less frequently used terms included curriculum, performance, training, improvement, development, innovation, psychology, sciences, library media, and entrepreneurship.

Figure 1

Frequency of Terms found in Program Names


As Table 1 illustrates, 30 of the programs indicated preparing students to develop effective instruction (including design, development, and implementation) as their focus. Other commonly cited focus areas included preparing students to apply skills in a corporate environment (26 programs) and technology integration (24 programs), followed closely by application of skills in academic environments (22). Other focus areas included evaluation of effective instruction (17), improved performance (10), development of learning environments/systems (8), leadership (6), improved learning processes (5), and a specific emphasis on K-12 environments (4).

Table 1

Focus Area Based on Published Descriptions

Program purpose as stated in program descriptions | Count
1. Effective instruction development (design, develop, implement) | 30
2. Apply skills in corporate environment | 26
3. Technology integration | 24
4. Apply skills in academic environment | 22
5. Evaluate effectiveness of instruction | 17
6. Performance improvement | 10
7. Development of learning environments/systems | 8
8. Leadership | 6
9. Learning processes | 5
10. K-12 | 3

In terms of competency areas, six overarching themes emerged from the content analysis of the 249 courses: (a) Foundations of Instructional Design, Design and Development; (b) Research and Evaluation; (c) Technology; (d) Teaching and Learning; (e) Management and Performance; and (f) Needs Assessment. These overarching domains in turn consisted of a number of more specific subdomains, as illustrated in Table 2.

Table 2

Program Requirements – Core Courses

Competency Domain | Sub-Domains (Topics) with frequency | Sample Course Titles | Total number of courses
Foundations of ID, Design and Development | Overview of instructional design field (21); Design instructional interventions (20); Introduction to instructional design & development process and skills (13); Current issues and trends (8); Advanced ID (5); Design thinking (3); Curriculum development (1); Text-based instruction (1) | Introduction to ID; Trends and Issues in Instructional Design; Current Trends in Instructional Technology; Instructional Design; Design Thinking and Knowledge; Advanced Instructional Design Theory; Message Design and Display | 72
Research and Evaluation | Evaluation (24); Educational research (17); Research methodology (14); Inquiry and measurement (5); Learner assessments (4); Learning analytics (3) | Measurement and Evaluation in Education; Inquiry and Measurement; Data-Driven Decision-Making for Instruction; Evaluation of Learning and Performance; Research Methods in Education; Research in Instructional Technology | 67
Technology | Integration of technology (21); E-learning (13); Emerging technologies (8); Social, ethical, and legal issues in technology (4); Library, database, and digital information (3); Computer hardware (1); Multimedia in K-12 (1) | Emerging Technologies; Technology and Design; Introduction to E-learning; Learning Management Systems; Interactive Course Design; Library and Digital Information Management; Foundations of Distance Education; Selection and Integration of Multimedia for K12 Schools | 51
Teaching and Learning | Learning theories (23); Adult learner (3); Educational psychology (2); Roles of teachers (2) | The Adult as Learner; Learning Theory | 30
Management and Performance | Performance improvement (6); Project management (5); Human performance technology (4); Communication (2); Diversity (2); Organizational learning (1); Financial impact of training and performance improvement (1); Leadership (1) | Foundations of Project Management; Leadership and Education; Social/Cultural Issues in Educational Technology; Intro to HPT; Foundations of Performance Improvement; Return on Investment in Training and Performance Improvement | 22
Needs Assessment | Needs assessment (5); Strategic assessment (2) | Instructional Needs Assessment and Analysis; Strategic Assessment in Education | 7

The strongest theme to emerge from core courses was Foundations of Instructional Design, Design and Development (72 courses), followed by Research and Evaluation (67 courses) and Technology (51 courses), while Needs Assessment (seven courses) was at the bottom of the list. Among the sub-domains (i.e., specific course topics), evaluation was the most common, appearing in 24 required courses, followed closely by learning theories (23 courses). Overview of ID and technology integration followed with 21 courses each.
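As a quick arithmetic check, the domain shares cited in the Discussion follow directly from the Table 2 totals over the 249 core courses. The short sketch below reproduces that computation; the dictionary simply restates the published counts.

```python
# Domain totals from Table 2; their shares of the 249 core courses
# correspond to the percentages cited in the Discussion (e.g., 72/249 ≈ 29%).
domain_totals = {
    "Foundations of ID, Design and Development": 72,
    "Research and Evaluation": 67,
    "Technology": 51,
    "Teaching and Learning": 30,
    "Management and Performance": 22,
    "Needs Assessment": 7,
}

total = sum(domain_totals.values())  # 249 core courses in the sample
for domain, n in domain_totals.items():
    print(f"{domain}: {n}/{total} = {n / total:.0%}")
```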

Foundations of Instructional Design, Design and Development

The Foundations of Instructional Design, Design and Development theme includes an overview of the field of instructional design and technology emphasizing its history and guiding principles; the knowledge, skills, and abilities it requires; its ethical foundations; and current trends and future implications. This category also includes design and development courses offering an overview of the instructional design process, along with topics such as design thinking, curriculum development, and learning strategies. Thirty-three programs required at least one core course covering a topic in this domain, and our sample contained 72 such courses, with some programs covering more than one topic. The most commonly offered course titles were Introduction to ID, Trends and Issues in Instructional Design, Current Trends in Instructional Technology, Instructional Design, Advanced Instructional Design Theory, and Message Design and Display.

Research and Evaluation

Courses covering the evaluation of instructional interventions and building skills for making data-based decisions related to learning and human performance, as well as courses on research methodology and educational research, are included under the Research and Evaluation domain. It is a widely covered competency area, with 33 university programs requiring at least one course from this domain. The term evaluation appeared in 24 required courses with titles such as Measurement and Evaluation in Education and Evaluation of Learning and Performance. Research courses such as Research Methods in Education and Research in Instructional Technology were also included. Overall, 67 required courses fell under this domain.

Technology

Technology-focused courses had diverse titles. Topics making up the Technology domain include the integration of technology in instructional design, e-learning, emerging technologies and their potential impact on classrooms, technology applications as effective tools in teaching and learning, and learning management systems. Technology was the focus of 51 required courses across 28 programs, with course titles such as Emerging Technologies, Technology and Design, and Introduction to E-learning.

Teaching and Learning

This domain includes course topics related to learning theory and its application to the instructional design process, adult learners, the role of teachers, and an introduction to theories emerging from research in educational psychology. A total of 30 courses in our sample are captured in this domain, with learning theories being the most common topic, appearing in 23 required courses.

Management and Performance

This theme includes courses that address a variety of environmental variables that affect behavior and/or performance, covering courses such as Project Management, Human Performance Technology (HPT), and Performance Improvement. A total of 22 required courses fell under this domain, with course titles such as Foundations of Project Management, Intro to HPT, and Foundations of Performance Improvement.

Needs Assessment

This domain consists of courses that focus on a variety of front-end or needs assessment processes, covering some variation of strategic assessment, needs assessment models, validating programs, and assessing the continuing validity of ongoing programs based on needs. A total of seven required courses fell under this category, with course titles such as Instructional Needs Assessment and Analysis and Strategic Assessment in Education.

Discussion

A Primary Focus on Designing Instruction?

While our review of the research literature suggests that professional standards and employer demand continue to evolve, the extent to which the current core courses of instructional design programs are aligned to these expectations is variable. As suggested by Reiser and Dempsey (2018), the role of instructional design professionals includes the resolution of learning and performance problems in a variety of settings. Yet the majority of the program descriptions in our sample (75%) indicated their focus was to prepare students to develop effective instruction, with an additional five programs indicating a focus on improving the learning process. Design was the strongest theme to emerge, with 72 (29%) of the core courses across the 40 programs in our sample. This is perhaps not surprising given our core identity as an instructional design field, and it would be sensible to expect design to remain a core competency domain for instructional design professionals in the future. The question is whether we are preparing instructional design professionals with a complementary set of foundational skills that enables them to address learning and performance problems effectively and sustainably. Instruction may or may not address performance problems, and if we define our goal as developing instruction, we neglect to develop and apply other skills that are essential for effectively addressing performance problems.

In contrast, one-fourth (25%) of program descriptions alluded to performance improvement as part of their focus. Fox and Klein (2003) previously observed that, due to the traditional focus on training solutions, instructional design programs may be struggling with the extent to which they should incorporate performance improvement or human performance technology into their curricula. Based on survey responses from instructional design faculty members and members of local ISPI and ATD chapters, they noted that instructional design graduates should have a broad knowledge of performance and performance improvement processes. The authors went on to assert that most instructional design programs do not require performance improvement courses, indicating discrepancies between the curricula offered by the programs and the competencies that professionals working in the field consider important.

A Systemic Approach Is Vital to Assessing Needs and Improving Performance

Properly preparing students to address performance issues requires more than a program description page or an introductory course on performance improvement. If we are serious about addressing learning and performance problems, or leveraging opportunities, we must begin with a real diagnosis, or needs assessment. A needs assessment has the greatest utility and impact when we take a systems approach, because such an approach provides a holistic view of reality, helps us distinguish assumptions from facts, validates evidence-based needs, and reduces the risk of wasting precious resources on designing solutions (particularly instruction) that will not address underlying issues or bring us much closer to expected outcomes.

Identifying learning and performance issues requires us to understand the system, which is made up of the interrelated factors and dynamics that create and sustain those recurring issues. A systems approach to needs assessment allows us to clearly define the outcomes that the system should deliver, the root causes or barriers that are getting in the way of achieving those outcomes, and the requirements that must be met by the solution(s); this, in turn, gives us a strong foundation with which to judge the appropriateness of proposed solutions (Guerra-López, 2018; Guerra-López & Hicks, 2017; Kaufman & Guerra-López, 2013). Based on this description of a systems approach to needs assessment, one might reasonably see the importance of this competency in enabling ID professionals to address learning and performance problems, as stated in Reiser and Dempsey’s (2018) definition, and in adding the type of measurable value that employers expect.

Interestingly, needs assessment represents the weakest area of instructional design professionals’ preparation, as indicated by only seven (3%) core courses on some variation of needs assessment across the 40 master’s programs. Yet across many professional competency frameworks and standards (i.e., ISPI, IBSTPI, CSTD, ATD) and employer demand research (Carliner et al., 2015; Klein & Kelly, 2018; Sugar et al., 2012), needs assessment has repeatedly ranked as a priority competency. Richey, Morrison, and Foxon (2007) used the Occupational Information Network (O*NET), an online database containing hundreds of occupational definitions and work demand trends in the U.S., and found that conducting needs assessments and strategic learning assessments showed up as among the most important tasks for the instructional technologist. This suggests that it would behoove ID programs to reflect on where and how, in their curricula, they are preparing students to meet employer demands for essential needs assessment approaches and skills.

Setting

Consistent with earlier definitions of the field, which placed boundaries around educational settings, 22 program descriptions indicated their students are prepared to work in academic environments, and another four programs in our sample indicated a specific emphasis on K-12 environments. Yet the research literature suggests that instructional design professionals work primarily in business and industry (Byun, 2000; Heideman, 1991; Klein & Kelly, 2018; Larson & Lockee, 2008; Moallem, 1995; Richey et al., 2007). Our findings suggest that there may be moderate alignment to this target setting, with 26 of 40 (65%) programs indicating they prepare students to work in corporate environments. Setting is important because it influences our view of the system and, in turn, how we define the conditions and objectives of interest. While instructional design professionals use principles, evidence, models, and tools that can be useful across a variety of settings, the solutions we design, the stakeholders we work with, the approach to implementation and management, and many other important details must also be contextualized to the realities and dynamics of specific settings.

Technology

Not surprisingly, technology also emerged as a strong theme, with 24 (60%) of the program descriptions indicating that their focus included technology integration and 51 (20%) of the core courses in the sample fitting into a technology dimension. Berge, de Verneil, Berge, Davis, and Smith (2002) forecast that instructional design professionals will continue to be expected to increase their competence with technology given the technological transformation our society is undergoing. Digital disruption is already a reality. Klaus Schwab (2016), founder and executive chairman of the World Economic Forum, argues that we are undergoing a fourth industrial revolution driven by technology, one evolving at an exponential rather than a linear pace and disrupting practically every industry. Jobs are changing, with some becoming obsolete, others changing in fundamental ways, and entirely new types of jobs emerging. This has significant implications for the evolution of workplace learning and performance, as well as for how we educate learners in educational settings at every level.

Technology has the potential to significantly strengthen the ability of educational institutions to deliver on their core educational mission with greater quality, efficiency, and effectiveness. However, digital tools have been used largely under traditional models of teaching and learning (e.g., drill-and-skill instruction, information transmission) and, not surprisingly, we have not seen significant improvement in results (Fishman & Dede, 2016). Instructional design professionals can provide leadership in instructional strategies and approaches informed by advances in learning science and technology innovations, coupled with a systems approach and a clear focus on improving results. Thirty (12%) core courses in our sample fell under the teaching and learning theme, suggesting that there may be an opportunity to better balance student preparation in technology innovations with advances in learning science. Certainly, without taking a closer look at the actual course syllabi, it is difficult to say with certainty that learning science is not already being incorporated into other courses such as instructional design and technology. But given the variety of disciplines from which instructional design programs draw students, our stated intent to improve learning, and the ongoing advances in the learning sciences, it would seem logical to expect a stronger prevalence of courses in the learning sciences.

Research and Evaluation

Evaluation emerged as the second strongest theme, with 67 (27%) of core courses covering some variation of research and evaluation skills. This gives us a positive outlook on the contributions that instructional design professionals can make and speaks to our field’s strong tradition of applying research and evidence. As a continuous improvement tool, evaluation can help instructional design professionals and those they work with focus on well-defined needs and priorities so that learning and performance improvement solutions are clearly aligned to desired results at the individual, team, institution, and external impact levels (Guerra-López, 2008). It can help us learn how, when, why, and for whom learning and performance improvement solutions work as intended. If clearly aligned to key decision-making needs, evaluation can serve as a source of timely feedback, allowing us to promptly adapt learning and performance improvement solutions and ensure alignment to targeted results (Guerra-López, 2018).

Based on the emphasis on research and evaluation in instructional design programs, it would seem that instructional design professionals tend to be well prepared in evaluation and research skills. As previously mentioned, the lens we use to frame our scope can significantly impact the value we add. Applying these research and evaluation skills within a systems perspective can shape our understanding of desired outcomes and reveal whether we are merely making short-term improvements without a meaningful contribution to long-term results (Clark & Estes, 2008; Guerra, 2008).

Conclusion

While there seems to be some alignment between professional expectations and the aggregate instructional design master’s curriculum, particularly around instructional design, research and evaluation, and technology, this study’s findings suggest there may still be gaps to further validate and address when it comes to needs assessment and systemic approaches. This is particularly important given our stated role in addressing learning and performance problems. Without fully defining what problem should be addressed and understanding the system variables, recurring loops, and dynamics sustaining that problem, we run the risk of applying our design skills to inadequately defined problems. Such front-end problem-definition issues provide a flawed frame for subsequent improvement efforts and ultimately lead to poorly focused evaluations that draw flawed conclusions about the effectiveness of solutions, generating a false sense of accomplishment and value. In an increasingly interconnected world, the demand for systems thinking and systemic approaches that involve interdisciplinary and multi-stakeholder methods to solve complex societal problems will continue to grow. This represents an important opportunity to raise the profile and value of instructional design professionals.

References

Almaden, A., & Ku, H. (2017). Analyzing the curricula of Doctor of Philosophy in educational technology-related programs in the United States. Educational Media International, 54(2), 112-128. https://doi.org/10.1080/09523987.2017.1364210

Berge, Z., de Verneil, M., Berge, N., Davis, L., & Smith, D. (2002). The increasing scope of training and development competency. Benchmarking: An International Journal, 9(1), 43–61.

Boettger, R. K. & Palmer, L. A. (2010, December). Quantitative content analysis: Its use in technical communication. IEEE Transactions on Professional Communication, 53(4), 346-357. https://doi.org/10.1109/TPC.2010.2077450

Byun, H. (2000). Identifying job types and competencies for instructional technologists: A five-year analysis [Doctoral dissertation, Indiana University]. Dissertation Abstracts International, 61, 4346. https://doi.org/10.17232/KSET.17.1.57

Carliner, S., Castonguay, C., Sheepy, E., Ribeiro, O., Sabri, H., Saylor, C., & Valle, A. (2015). The job of a performance consultant: A qualitative content analysis of job descriptions. European Journal of Training and Development, 39(6), 458-483. https://doi.org/10.1108/ejtd-01-2015-0006

Cassidy, M. F. (1982). Toward integration: Education, instructional technology, and semiotics. Educational Communications and Technology Journal, 20(2), 75–89.

Clark, R. E., & Estes, F. (2008). Turning research into results: A guide to selecting the right performance solutions. Information Age.

Ely, D. P. (Ed.). (1963). The changing role of the audiovisual process in education: A definition and a glossary of related terms (TCP Monograph No. 1). AV Communication Review, 11(1), Supplement No. 6.

Fishman, B., & Dede, C. (2016). Teaching and technology: New tools for new times. In D. H. Gitomer & C. A. Bell (Eds.), Handbook of Research on Teaching (5th ed., pp. 1269 - 1334 ). American Education and Research Association.

Fox, E. J., & Klein, J. D. (2003). What should instructional designers and technologists know about human performance technology? Performance Improvement Quarterly, 16(3), 87–98.

Furst-Bowe, J. A. (1996). Competencies needed to design and deliver training using instructional technology. In Proceedings of selected research and development presentations at the 1996 national convention of the Association for Educational Communications and Technology (ERIC Document Reproduction Service No. ED397795). Retrieved April 24, 2021, from https://www.learntechlib.org/p/80106/

Gentry, C. G. (1995). Educational technology: A question of meaning. In G. Anglin (Ed.), Instructional technology: Past, present, and future. Libraries Unlimited.

Guerra, I. (2008). Outcome-based vocational rehabilitation: Measuring valuable results. Performance Improvement Quarterly, 18(3), 65-75. https://doi.org/10.1111/j.1937-8327.2005.tb00342.x

Guerra‐López, I. (2018). Ensuring measurable strategic alignment to external clients and society. Performance Improvement, 57(6), 33-40.

Guerra-López, I., & Hicks, K. (2017). Partner for performance: Strategically aligning learning and development. ATD Press.

Gupta, K., Sleezer, C. M., & Russ-Eft, D. F. (2007). A practical guide to needs assessment (2nd ed.). Pfeiffer/John Wiley & Sons.

Heideman, J. G. (1991). A forecast of the competencies required for effective performance by instructional technology practitioners in the year 2000 (Order No. 9127229) [Doctoral dissertation]. ProQuest Dissertations & Theses Global. (303934240)

IBSTPI. (2016). Instructional Design Competencies. https://ibstpi.org/instructional-design-competencies/

Kang, Y., & Ritzhaupt, A. D. (2015). A job announcement analysis of educational technology professional positions. Journal of Educational Technology Systems, 43(3), 231-256. https://doi.org/10.1177/0047239515570572

Kaufman, R. (1996). Visions, strategic planning, and quality—more than hype. Educational Technology, 36(5), 60-62. Retrieved April 24, 2021, from http://www.jstor.org/stable/44428366

Kaufman, R. (2019). Revisiting mega as the central focus for defining and delivering worthy results. Performance Improvement, 58(7), 20-23. https://doi.org/10.1002/pfi.21885

Kaufman, R., & Guerra-López, I. (2013). Needs assessment for organizational success. ASTD Press.

Klein, J. D., & Fox, E. J. (2004). Performance improvement competencies for instructional technologists. TechTrends, 48(2), 22-25. https://doi.org/10.1007/bf02762539

Klein, J. D., & Jun, S. (2014). Skills for instructional design professionals. Performance Improvement, 53, 41-46. https://doi.org/10.1002/pfi.21397

Klein, J. D., & Kelly, W. Q. (2018). Competencies for instructional designers: A view from employers. Performance Improvement Quarterly, 31(3), 225-247. https://doi.org/10.1002/piq.21257

Koszalka, T., Russ-Eft, D., & Reiser, R. (2013) Instructional design competencies: The standards (4th ed.). Information Age Publishing.

Larson, M. B., & Lockee, B. B. (2007). Preparing instructional designers for different career environments: A case study. Educational Technology Research and Development, 57(1), 1-24. https://doi.org/10.1007/s11423-006-9031-4

Larson, M. B., & Lockee, B. B. (2008). Instructional design practice: Career environments, job roles, and a climate of change. Performance Improvement Quarterly, 17(1), 22-40. https://doi.org/10.1111/j.1937-8327.2004.tb00300.x

Liu, M., Gibby, S., Quiros, O., & Demps, E. (2002). Challenges of being an instructional designer for new media development: A view from the practitioners. Journal of Educational Multimedia and Hypermedia, 11(3), 195-219. Retrieved April 24, 2021 from https://www.learntechlib.org/primary/p/9266/

McLagan, P. (1996, January). Great ideas revisited. Training & Development, 50(1), 60-66.

Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). SAGE.

Moallem, M. (1995). Analysis of job announcements and the required competencies for instructional technology professionals [Paper presentation]. Annual Meeting of the American Educational Research Association, San Francisco, CA, United States.

Reiser, R. A., & Dempsey, J. V. (2018). Trends and issues in instructional design and technology. Pearson Education.

Richey, R. C., Fields, D. C., & Foxon, M. (2001). Instructional design competencies: The standards (3rd ed.). Eric Clearing-House on Information Technology.

Richey, R. C., Morrison, G., & Foxon, M. (2007). Instructional design in business and industry. In R.A. Reiser, & J.V. Dempsey (Eds.), Trends and issues in instructional design and technology (2nd ed., pp. 174–184). Prentice Hall.

Ritzhaupt, A. D., & Martin, F. (2013). Development and validation of the educational technologist multimedia competency survey. Educational Technology Research and Development, 62(1), 13-33. https://doi.org/10.1007/s11423-013-9325-2

Ritzhaupt, A., Martin, F., & Daniels, K. (2010). Multimedia competencies for an educational technologist: A survey of professionals and job announcement analysis. Journal of Educational Multimedia and Hypermedia, 19(4), 421-449.

Rummler, G. A. (2007). Serious performance consulting according to Rummler. John Wiley & Sons.

Silber, K. H. (1970). What field are we in, anyhow? Audiovisual Instruction, 15(5), 21-24.

Stolovitch, H. L. & Keeps, E. J. (1999). Handbook of Human Performance Technology (2nd ed.). Jossey-Bass.

Sugar, W., Hoard, B., Brown, A., & Daniels, L. (2012). Identifying multimedia production competencies and skills of instructional design and technology professionals: An analysis of recent job postings. Journal of Educational Technology Systems, 40(3), 227-249. https://doi.org/10.2190/et.40.3.b

Ingrid Guerra-López

Wayne State University

Dr. Ingrid Guerra-López is Interim Dean of the College of Education and Professor of Learning Design and Technology at Wayne State University. Dr. Guerra-López is an internationally recognized expert in human performance systems with a focus on needs assessment and planning, monitoring and evaluation, and strategic alignment of training and performance improvement interventions.
Ruta Joshi

Wayne State University

Ruta Joshi is a Ph.D. Candidate in Learning Design and Technology at Wayne State University. Her research interest is in assessing needs and exploring the influence of technology on training and performance improvement professionals, focusing on enhanced access and use of data for needs assessment and diagnostic processes.
