
What Are the Skills of an Instructional Designer?

William Sugar, Abbie Brown, Lee Daniels, & Brent Hoard

Editor’s Note

The following article was originally published in the Journal of Applied Instructional Design.

Sugar, W., Brown, A., Daniels, L., & Hoard, B. (2011). Instructional design and technology professionals in higher education: Multimedia production knowledge and skills identified from a Delphi study. Journal of Applied Instructional Design, 1(2), 30–46. Retrieved from https://edtechbooks.org/-PD

As Instructional Design and Technology (ID&T) educators, we have made considerable effort to understand the specific multimedia production knowledge and skills required of entry-level professionals. Our previous studies (Sugar, Brown, & Daniels, 2009) documented specific multimedia production skills, knowledge, and software applications (e.g., Flash) that ID&T students and subsequent graduates need to exhibit. As a result of these efforts, differences can be readily distinguished between instructional designers working in corporate settings and those working in higher education settings (Sugar, Hoard, Brown, & Daniels, 2011). Kirschner, van Merrienboer, Sloep, and Carr (2002) observed that instructional designers in higher education settings focus on identifying alternative solutions for a particular course, whereas instructional designers within a corporate training setting are more customer-oriented. Larson and Lockee (2009) concurred with this assessment by noting “differences in the requirements listed for business and industry versus higher education jobs” (p. 2). Essentially, the organizational culture (e.g., shared beliefs and values) within a corporation is radically different from that found within a college or university setting. Since over 89% of our initial survey respondents (Sugar, Brown, & Daniels, 2009) worked in colleges or universities, we decided to concentrate our efforts exclusively on the multimedia production knowledge and skills of instructional designers working within higher education.

The role of the instructional designer, instructional technologist, and instructional technology consultant within a higher education setting has been well established. Recent studies have documented several quality instructional technology-related projects within higher education settings (e.g., Renes & Strange, 2011). As one might expect, teaching online has been emphasized during the past fifteen years (e.g., Barczyk, Buckenmeyer, & Feldman, 2010), as have mobile learning technologies (e.g., El-Hussein & Cronje, 2010) and online student response systems (e.g., Stav, Nielsen, Hansen-Nygård, & Thorseth, 2010). Other innovative technologies, such as interactive whiteboards (e.g., Al-Qirim, 2011), social networking (e.g., Conole, Galley, & Culver, 2011), Web 2.0 tools (e.g., Kear, Woodthorpe, Robertson, & Hutchison, 2010), and 21st century tools for teacher educators (e.g., Archambault, Wetzel, Foulger, & Williams, 2010), have been integrated into higher education classrooms. Several case studies document the inclusion of instructional technologies in content-specific higher education courses, such as art and design education (e.g., Delacruz, 2009), engineering (e.g., Dinsmore, Alexander, & Loughlin, 2008), and nursing (e.g., Donato, Hudyma, & Carter, 2010). “Soft” technologies, such as mentoring circles (Darwin & Palmer, 2009), also have been successfully integrated in higher education settings.

The prominence of the instructional designer within higher education settings also has been well documented (Shibley, Amaral, Shank, & Shibley, 2011). Incorporating a continuous improvement process (Wolf, 2007), encouraging higher education faculty with innovative reward and recognition structures (Bluteau & Krumins, 2008), and interacting with faculty peers (Nicolle & Lou, 2008) are examples of current best practices in facilitating successful technology adoption and integration. Considerable effort also has recently been devoted to understanding how higher education faculty adopt e-Learning activities (e.g., MacKeogh & Fox, 2009) and Web 2.0 technologies (e.g., Samarawickrema, Benson, & Brack, 2010), as well as faculty members’ perceptions of the roles of Learning Content Management Systems (LCMS) (e.g., Steel, 2009).

Purpose of Study

The intent of this study is to better comprehend the instructional designer’s role in higher education settings. Specifically, we sought to identify the multimedia production knowledge and skills required of Instructional Design and Technology professionals working in higher education. In addition, since we noted a definite interrelationship between multimedia production and instructional design skills in earlier studies (Sugar, Brown, & Daniels, 2009), we also sought to understand the relationship between these two skill sets. To accomplish this goal, we conducted a Delphi study, seeking the opinions and consensus of experienced instructional designers who work in higher education.

Method

We determined that a Delphi research methodology was the best approach to address our questions. In the early 1950s, “Project Delphi” was developed from an Air Force-sponsored Rand Corporation study. This study sought to “obtain the most reliable consensus of opinion of a group of experts . . . by a series of intensive questionnaires interspersed with controlled opinion feedback” (Linstone & Turoff, 2002, p. 10). Delphi panelists remain anonymous to each other in order to avoid the “bandwagon effect” and to ensure individual panelists do not dominate a particular decision (Linstone & Turoff, 2002). Ideally, the Delphi panel is heterogeneous, clearly representing a wide cross-section of the targeted group. Since the inception of Project Delphi, the Delphi technique has been a prescribed methodology for a wide variety of content areas, including government planning, medical issues, and drug abuse-related policy making (Linstone & Turoff, 2002). Several existing Instructional Design and Technology research studies have utilized the Delphi method to examine phenomena such as: determining constructivist-based distance learning strategies for school teachers (Herring, 2004); understanding strategies that promote social connectedness in online learning environments (Slagter van Tryon & Bishop, 2009); best practices for using technology in high schools (Clark, 2006); optimal technology integration in adult literacy classrooms (Dillon-Marable & Valentine, 2006); and forecasting how blended learning approaches can be used in computer-supported collaborative learning environments (So & Bonk, 2010). The Delphi method has also been used to identify priorities from a select group of experts on topics that include K–12 distance education research, policies, and practices (Rice, 2009); mobile learning technologies (Kurubacak, 2007); and educational technology research needs (Pollard & Pollard, 2004).

Standards have also been determined from Delphi studies. Researchers used this method to ascertain effective project manager competencies (Brill, Bishop, & Walker, 2006), biotechnology knowledge and skills for technology education teachers (Scott, Washer, & Wright, 2006), and assistive technology knowledge and skills for special education teachers (Smith, Kelley, Maushak, Griffin-Shirley, & Lan, 2009).

The Delphi research method is an established technique for reaching a consensus decision among experts about a topic that involves examination of a broad and complex problem that could be potentially subjective (Linstone & Turoff, 1975, 2002). The question of which multimedia production knowledge and skills are important among entry-level instructional designers is both complex and subjective; the answer depends on decisions made within organizations and the learner population the organization serves.

The Delphi method provides researchers with the ability to systematically evaluate the expert decision-making process within a prescribed set of phases. This process is particularly advantageous for those participants or Delphi panelists who are in separate physical locations (Linstone & Turoff, 1975), as our participants were.

Delphi Panel

For our Delphi study, fourteen Instructional Design and Technology professionals originally agreed to participate. Ultimately, eleven of the fourteen original panelists completed all three data collection phases of the study; three individuals stopped participating for various personal reasons. The overall goal was to gather responses from a heterogeneous group of panelists (see Table 1) representing higher education work environments in general. The seven female and four male panelists work in a variety of higher education settings, including two-year colleges, four-year universities, public institutions, and private institutions. Eight of our panelists represent public institutions and three represent private institutions. In addition, two panelists represent two-year community colleges and four represent undergraduate-only institutions. Nine of our panelists work in administrative positions (e.g., Director) and two work as instructional designers for their respective institutions. Ten panelists have worked in higher education settings for ten or more years. The average amount of higher education work experience was over sixteen years. The panelists are geographically diverse, representing western, mountain west, midwest, south, southeast, mid-Atlantic, and northeast regions of the United States. One panelist works at a higher education institution in Switzerland.

Table 1. Demographic information of Delphi panelists

Gender Position Years in higher education setting Region Type of institution
Female Instructional Designer 10 West Public; 4-year degree; Undergraduate & graduate
Female Instructional Designer 12 Mountain West Public; 4-year degree; Undergraduate & graduate
Female Coordinator 4 Northeast Public; 4-year degree; Undergraduate & graduate
Female Coordinator 27 Southeast Public; 2-year degree; Undergraduate
Female Vice Provost 25 South Public; 4-year degree; Undergraduate & graduate
Male Director 29 Midwest Private; 4-year degree; Undergraduate
Male Chief Academic Officer 20 South Public; 2-year degree; Undergraduate
Male Director 19 Southeast Private; 4-year degree; Undergraduate & graduate
Female Director 14 Mid-Atlantic Public; 4-year degree; Undergraduate & graduate
Male Director 11 Switzerland Public; 3-year degree; Undergraduate & graduate
Female Team Leader 13 Northeast Private; 4-year degree; Undergraduate & graduate

Overview of Delphi Data Collection Phases

Three Delphi data collection phases were completed during this study. During the first round, panelists responded to three open-ended questions.

The purpose of these questions was to delineate the specific multimedia production knowledge and skills required of these professionals. The questions were open-ended in order to avoid biasing our panelists’ responses (Linstone & Turoff, 1975). The panelists responded to these questions via email.

With the intent of identifying emerging and recurring themes, three evaluators analyzed the panelists’ responses using the category construction data analysis method outlined by Merriam (2009). Questionable items and themes were discussed among the three evaluators, and the evaluators reached consensus on all items. Particular themes from these responses were identified. This initial set of themes was sent to the panelists for their review. Each panelist had the opportunity to respond to the overarching group of themes and the specific themes, and to add additional categories as well. All of these themes were compiled into a summative questionnaire, which was then distributed during the second round.

The intent of the questionnaire was to establish a quantitative appraisal of our panelists’ responses about each item and to seek a common set of responses regarding Instructional Design and Technology graduates’ multimedia production knowledge and skills. The panelists rated each questionnaire item with regard to the importance of each identified knowledge or skill, and the panelists’ responses were compiled and distributed via email to each panel member. Panelists were then given the opportunity to offer feedback about the questionnaire results and make any corrections, as necessary.

During the third round, the eleven panelists reviewed the Round 2 ratings and were given the opportunity to revise their own ratings. Five of the eleven panelists recommended minor incremental changes to their original rankings. None of the eleven panelists suggested either adding another item or removing an existing item. Given this feedback, we determined that these minor modifications indicated consensus among the panel.

Results

During the initial Delphi phase, the eleven panelists generated 289 unique statements in response to the three aforementioned questions. From this first round of responses, 60 distinct multimedia production knowledge and skill items needed by Instructional Design and Technology graduates were identified and organized into seven primary categories. This list was then sent back to our panelists for confirmation. Eight of the eleven panelists recommended ten additional knowledge and skill items, for a total of 70 items.

Table 2. Top-ranked items (M ≥ 1.45)

Rank Item Category f M SD
1 Communication skills Communication and collaboration 11 1.91 .30
2 Social skills Communication and collaboration 11 1.73 .65
3 Web design basics Production 11 1.64 .51
4 Visual communication Visual and graphic design 10 1.60 .70
5 Microsoft Office Suite Applications 11 1.55 .52
6 Online course pedagogy Instructional design and pedagogy 11 1.55 .69
7 Knowledge of learner Instructional design and pedagogy 11 1.55 .82
8 Screencasting Production 11 1.45 .69
9 Pedagogical design expertise Instructional design and pedagogy 11 1.45 1.21
10 Design multimedia content Instructional design and pedagogy 11 1.45 .82
11 Articulate advantages & disadvantages of delivering media formats Delivery and project management 11 1.45 .69
12 Determine delivery venue Delivery and project management 11 1.45 .52
13 Understanding of how disabilities impact multimedia selection Delivery and project management 11 1.45 .69
14 LCMS Online applications 11 1.45 1.21
15 Video production Production 11 1.45 .52

Responses were rated on a scale of -2 to 2, with -2 = unnecessary, -1 = not important, 0 = somewhat important, 1 = important, 2 = essential.

Table 3. Bottom-ranked items (M ≤ .36)

Rank Item Category f M SD
60 XML Online applications 11 .36 .81
61 Online plug-ins Online applications 11 .27 1.27
62 Online quiz tools Online applications 11 .18 1.08
63 Contribute Online applications 10 .10 1.10
64 Photography Production 11 .09 .94
65 Online survey tools Online applications 11 .09 .94
66 Animation Production 11 .00 .63
67 Garageband Applications 11 .00 .63
68 Final Cut Pro Suite Applications 11 -.09 .94
69 Programming (e.g., ActionScript) Production 10 -.10 1.10
70 Green screen Applications 10 -.40 1.27

Responses were rated on a scale of -2 to 2, with -2 = unnecessary, -1 = not important, 0 = somewhat important, 1 = important, 2 = essential.

The panelists also reacted to the seven categories. Four original categories (Visual and Graphic Design, Instructional Design and Pedagogy, Communication and Collaboration, and Delivery and Project Management) did not receive any feedback or edits and were approved. The panelists commented on the other three original categories: Basic Production, Specific Software Tool, and Online. Upon review of these comments, these categories were renamed Production, Applications, and Online Applications, respectively. We distinguished between applications (e.g., Flash) that can create instruction for both online and non-online settings, and applications (e.g., Dreamweaver) that exclusively create instruction for online settings.

In summary, Delphi panelists’ responses were organized into seven categories: Production (10 items), Applications (12 items), Online Applications (15 items), Visual and Graphic Design (6 items), Instructional Design and Pedagogy (15 items), Communication and Collaboration (4 items), and Delivery and Project Management (8 items). See Appendix for a listing of these categories and corresponding items.

During the next Delphi phase, our eleven panelists rated these seventy items on the following scale: Essential, Important, Somewhat important, Not important, Unnecessary. Accordingly, we assigned a 2 to -2 Likert scale to these five response options: Essential items received 2 points, Important items received 1 point, Somewhat important items received 0 points, Not important items received -1 point, and Unnecessary items received -2 points. Thus, the top score any item could receive would be 22 points (i.e., all 11 panelists deemed the item Essential) and the lowest score would be -22 points (i.e., all 11 panelists deemed the item Unnecessary). This rating system also provides the ability to weight and counterweight individual panelists’ responses about a particular item. For example, if a panelist rated one item as Important (1 point) and another panelist rated the same item as Not important (-1 point), the item would receive a combined score of zero points and would be considered Somewhat important.
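To make this scoring arithmetic concrete, the scheme can be expressed in a few lines of Python. This is a minimal illustrative sketch; the function and variable names are ours, and the ratings shown are hypothetical, not the study’s actual data.

```python
# A minimal sketch of the -2 to 2 scoring scheme described above.
# The ratings below are hypothetical, not the study's actual data.

RATING_POINTS = {
    "Essential": 2,
    "Important": 1,
    "Somewhat important": 0,
    "Not important": -1,
    "Unnecessary": -2,
}

def score_item(ratings):
    """Convert panelists' verbal ratings to points; return total and mean."""
    points = [RATING_POINTS[r] for r in ratings]
    total = sum(points)          # ranges from -22 to 22 with 11 panelists
    mean = total / len(points)   # the M reported in Tables 2 through 7
    return total, mean

# One "Important" (+1) and one "Not important" (-1) offset each other,
# illustrating the weighting and counterweighting described above.
ratings = ["Important", "Not important"] + ["Somewhat important"] * 9
total, mean = score_item(ratings)
print(total, mean)  # -> 0 0.0, i.e., Somewhat important overall
```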

The average scores for all of the seventy items ranged from M = 1.91 to M = -.40 (see Appendix). The 15 top-ranked items, each with an average of 1.45 or higher, are found in Table 2. The top two items, Communication skills (M = 1.91, SD = .30) and Social skills (M = 1.73, SD = .65), were within the Communication and Collaboration category. Three Production items, Web design basics (M = 1.64, SD = .51), Video production (M = 1.45, SD = .52), and Screencasting (M = 1.45, SD = .69), were included in this top-ranked list. Visual communication (M = 1.60, SD = .70) was the fourth highest-ranked item, and Microsoft Office Suite (M = 1.55, SD = .52) was the fifth. Four of the fifteen Instructional Design and Pedagogy items and three of the eight Delivery and Project Management items also appeared in this top-ranked listing, as did LCMS (M = 1.45, SD = 1.21). The eleven bottom-ranked items, each with an average of .36 or lower, are found in Table 3. Five Online applications items (XML, Online quiz tools, Online plug-ins, Contribute, and Online survey tools) were located in this list. Three Production items (Photography, Animation, and Programming) and three Applications items (Garageband, Final Cut Pro, and Green screen) received averages near or below zero.

Table 4. Percentage of importance within each category

Category (n) Unnecessary to Not important (-2 ≤ M < -1) % Not important to Somewhat important (-1 ≤ M < 0) % Somewhat important to Important (0 ≤ M < 1) % Important to Essential (1 ≤ M ≤ 2) %
Communication and collaboration (n=4) 0.0 0.0 25.0 75.0
Visual and graphic design (n=6) 0.0 0.0 0.0 100.0
Delivery and project management (n=8) 0.0 0.0 12.5 87.5
Instructional design and pedagogy (n=15) 0.0 0.0 20.0 80.0
Production (n=10) 0.0 10.0 30.0 60.0
Online applications (n=15) 0.0 0.0 66.7 33.3
Applications (n=12) 0.0 16.67 66.66 16.67
Totals (n=70) 0.0 4.3 37.1 58.6

In Table 4, the percentage of importance ratings is listed for each category. Over sixty percent of the items (63.8%) across the seven categories received an “Important” to “Essential” (1 ≤ M ≤ 2) rating. All the Visual and Graphic Design items (n=6) were within this range. Fourteen of the fifteen Instructional Design and Pedagogy items received “Important to Essential” ratings; SCORM received an average score lower than 1 (M = .73, SD = .91). Three of the four Communication and Collaboration items also received “Important to Essential” ratings; Public presentation skills received an average score lower than 1 (M = .91, SD = .94). All but one Delivery and Project Management item (n=7) also received an “Important to Essential” rating; Understanding of budget constraints & funding issues received an average score lower than 1 (M = .64, SD = .81).

Sixty percent of the Production items (n=6) received an “Important” to “Essential” (1 ≤ M ≤ 2) rating (see Table 4). A majority of the Delphi panelists rated Web design basics (M = 1.64, SD = .51), Video production (M = 1.45, SD = .52), Screencasting (M = 1.45, SD = .69), Audio production (M = 1.36, SD = .67), Images production (M = 1.36, SD = .67), and Basic HTML commands (M = 1.00, SD = 1.10) as “Important” to “Essential” items (see Table 5). The remaining four Production items received either a “Somewhat important” to “Important” (0 ≤ M < 1) ranking (i.e., Desktop publishing, Photography, and Animation) or a “Not important” to “Somewhat important” (-1 ≤ M < 0) ranking (i.e., Programming skills).

Table 5. Production category items

Rank Production category items f M SD
3 Web design basics 11 1.64 .51
8 Screencasting 11 1.45 .69
15 Video production 11 1.45 .52
16 Audio production 11 1.36 .67
26 Images production 11 1.36 .67
38 Basic HTML commands 11 1.00 1.10
48 Desktop publishing 11 .91 .75
64 Photography 11 .09 .94
66 Animation 11 .00 .63
69 Programming skills (e.g., ActionScript) 10 -.10 1.10

Responses were rated on a scale of -2 to 2, with -2 = unnecessary, -1 = not important, 0 = somewhat important, 1 = important, 2 = essential.
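Table 4’s percentages follow directly from bucketing each item’s mean into the four intervals given in that table’s header. The following Python sketch is illustrative; the helper names are ours, and the example input is the ten Production item means from Table 5.

```python
# A sketch of how Table 4's percentage bands could be computed from
# item means. The band boundaries follow Table 4; the example means
# are the ten Production items from Table 5.

def band(m):
    """Map an item mean onto one of Table 4's four intervals."""
    if -2 <= m < -1:
        return "Unnecessary to Not important"
    if -1 <= m < 0:
        return "Not important to Somewhat important"
    if 0 <= m < 1:
        return "Somewhat important to Important"
    return "Important to Essential"  # 1 <= m <= 2

def band_percentages(means):
    """Return the percentage of items falling into each band."""
    counts = {}
    for m in means:
        label = band(m)
        counts[label] = counts.get(label, 0) + 1
    return {label: round(100 * n / len(means), 1) for label, n in counts.items()}

# The ten Production item means from Table 5:
production = [1.64, 1.45, 1.45, 1.36, 1.36, 1.00, 0.91, 0.09, 0.00, -0.10]
print(band_percentages(production))
# -> 60.0% Important to Essential, 30.0% Somewhat important to Important,
#    10.0% Not important to Somewhat important (the Production row of Table 4)
```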

Table 6. Application category items

Rank Application category items f M SD
5 Microsoft Office suite 11 1.55 .52
33 Adobe software suite 11 1.09 .94
47 Major operating systems 11 .85 1.08
49 Photoshop 11 .82 .87
51 Audacity 11 .73 .79
56 Adobe Flash 11 .64 .93
57 Adobe Acrobat 11 .55 1.04
58 iMovie 11 .55 .82
59 Fireworks 11 .55 .93
67 Garageband 11 .00 .63
68 Final Cut Pro Suite 11 -.09 .94
70 Green screen 10 -.40 1.27

Responses were rated on a scale of -2 to 2, with -2 = unnecessary, -1 = not important, 0 = somewhat important, 1 = important, 2 = essential.

Only 25% of the Application items (n=3) received an “Important” to “Essential” (1 ≤ M ≤ 2) rating (see Table 6). Two of these three are generic applications with regard to multimedia production: Microsoft Office suite (M = 1.55, SD = .52) and Major operating systems (M = 1.00, SD = 1.08). The other Application item is the overall Adobe software suite (M = 1.09, SD = .94). The remaining nine Application items received either a “Somewhat important” to “Important” (0 ≤ M < 1) ranking (i.e., Audacity, Flash, Photoshop, Acrobat, iMovie, Fireworks, and Garageband) or a “Not important” to “Somewhat important” (-1 ≤ M < 0) ranking (i.e., Final Cut Pro and Green screen).

There was disagreement among the panelists regarding the importance of specific applications. As depicted in Figure 1, the panelists were nearly evenly split on three applications: Flash, Photoshop, and Fireworks. Six panelists perceived Flash as either an Important or an Essential multimedia production item, whereas five panelists perceived Flash as either Somewhat important or Not important. Five panelists perceived Photoshop and Fireworks as either Important or Essential multimedia production items, whereas six panelists perceived them as either Somewhat important or Not important.

Table 7. Online application category items

Rank Online application category items f M SD
14 LCMS 11 1.45 1.21
29 Web 2.0 applications 11 1.27 .79
34 Knowledge of online file structures 11 1.09 .94
39 Camtasia 10 1.00 .82
40 Web page editors 11 1.00 .78
44 Dreamweaver 11 .91 .83
45 CSS 11 .91 .70
50 Wikis 11 .82 .75
53 Captivate 11 .64 .67
55 Blogs 11 .64 .67
60 XML 11 .36 .81
61 Online plug-ins 11 .27 1.27
62 Online quiz tools 11 .18 1.08
63 Contribute 10 .10 1.10
65 Online survey tools 11 .09 .94

Responses were rated on a scale of -2 to 2, with -2 = unnecessary, -1 = not important, 0 = somewhat important, 1 = important, 2 = essential.

Thirty-three percent of the Online application items (n=5) received an “Important” (M ≥ 1) to “Essential” (M ≤ 2) rating (see Table 7). Four of these five are generic applications with regard to multimedia production: LCMS (M = 1.45, SD = 1.21), Web 2.0 applications (M = 1.27, SD = .79), Knowledge of online file structures (M = 1.09, SD = .94), and Web page editors (M = 1.00, SD = .78). The other Online application item is Camtasia (M = 1.00, SD = .82). The remaining ten Online application items received a “Somewhat important” to “Important” (0 ≤ M < 1) ranking.

Similar to the Application items, there was disagreement among the panelists regarding the importance of particular online applications. As shown in Figure 2, the panelists were nearly evenly split on two applications: Camtasia and Online plug-ins. Six panelists perceived Camtasia as either an Important or an Essential multimedia production item, whereas five panelists perceived Camtasia as either Somewhat important or Not important. Five panelists perceived Online plug-ins as either Important or Essential, whereas six panelists perceived these tools as Somewhat important, Not important, or Unnecessary.

Discussion

The Delphi panelists identified specific multimedia production skills and knowledge needed by entry-level Instructional Design and Technology (ID&T) professionals who work in higher education settings. The results point to generalized multimedia production knowledge and skills, an emphasis on online learning skills, and an interrelationship between multimedia production and instructional design skills. After describing these skills and knowledge, we discuss how these results have influenced our own curricular practices, as well as anticipate future research studies that would provide additional understanding of how best to educate instructional designers working in higher education settings.

The Delphi panelists clearly came to consensus that ID&T graduates need to be well versed in a number of general multimedia production skills. Visual design principles, video production, and audio production skills all were ranked highly and were considered Essential by a majority of the panelists. Conversely, more advanced and specialized technologies (e.g., programming and green screen technology) were considered less important and ranked at the bottom of the list. There also was a clear preference among the panelists for online learning applications and skills: Web design basics, online course pedagogy, screencasting, and LCMS skills all were ranked as Essential. It is interesting to note that no specific computer-based instruction application besides Camtasia and Dreamweaver received an Essential or Important ranking. In fact, Delphi panelists were divided on the importance of specific software applications, including Flash, Photoshop, Audacity, Fireworks, and Captivate.

In addition to these essential multimedia production skills, the panelists’ rankings indicate an interrelationship between instructional design skills and multimedia production skills. Even though panelists were asked about ID&T graduates’ multimedia production knowledge and skills, eighty percent of the items from the Instructional Design and Pedagogy category (e.g., Knowledge of learner characteristics, Determining the appropriate delivery venue for particular content, etc.) were ranked as Important to Essential. Furthermore, Communication skills and Social skills were ranked first and second, respectively. This finding implies that ID&T entry-level professionals need a robust combination of general multimedia production skills and knowledge and overall instructional design skills and knowledge.

Implications

As Instructional Design and Technology faculty members, we were intrigued by these results from our panelists and are now considering curricular revisions for our respective courses. The results from our study indicate that multimedia production items cannot be taught in isolation and should not be tied to a particular software application. In previous semesters, our respective multimedia production courses defaulted to a particular software application (e.g., Flash, Authorware, Director, etc.). Our students now use “lowest common denominator” computer-based instruction applications (e.g., PowerPoint) to learn particular computer-based instruction methodologies (e.g., tutorials). Our students are introduced to innovative technologies (e.g., Prezi), but the emphasis is not solely on the particular authoring tool; rather, it is on how to integrate the tool into overall, existing instructional modules. To highlight the interrelationship between multimedia production and instructional design skills, our students are now required to complete instructional design reports when creating a multimedia production project. We view these projects as instructional design “experiments,” and students complete “lab reports” with each project.

The panelists’ rankings and results also indicate additional areas to explore with regard to ID&T graduates’ overall multimedia production and instructional design skills and knowledge. Inquiry into the changing role of the instructional designer with respect to these two skill sets, along the lines of Schwier and Wilson’s (2010) recent study, should take place. A more in-depth understanding of what Willis (2009) refers to as process instructional design, such as a study of best practices in collaboration between instructional designers and clients, is encouraged as well. In addition, case studies on how instructional designers effectively balance multimedia production and instructional design skills should be developed. These case studies could be used as instructional tools to teach novice instructional designers best practices in integrating multimedia production skills within an overall instructional design project.

In summary, the results from this Delphi study indicate that Instructional Design and Technology professionals working in higher education settings need to be educated in overall multimedia production skills and in how these skills interrelate with their instructional design skills. As Instructional Design and Technology educators, we look forward to considering innovative and effective approaches to our respective curricula and to continuing this dialogue with other Instructional Design and Technology educators.

Application Exercises

  • If you were to design a course for students in an instructional design program, what 3-4 areas would you focus on, based on the results of this study?
  • Look at the list of skills that were ranked as Important-Essential by the Delphi panelists. Think of one or two of those skills that you could personally develop more in your life, and make plans to do so.
  • After seeing the results of the study in this article, evaluate your own progress towards becoming an instructional designer. Do you feel like you are learning the soft and hard skills required for the job? How would you adjust your current plan to better align with what is required in the field?

References

Al-Qirim, N. (2011). Determinants of Interactive white board success in teaching in higher education institutions. Computers & Education, 56(3), 827–838.

Archambault, L., Wetzel, K., Foulger, T. S., & Williams, M. (2010). Professional development 2.0: Transforming teacher education pedagogy with 21st century tools. Journal of Digital Learning in Teacher Education, 27(1), 4–11.

Barczyk, C., Buckenmeyer, J., & Feldman, L. (2010). Mentoring professors: A model for developing quality online instructors and courses in higher education. International Journal on E-Learning, 9(1), 7–26.

Bluteau, P., & Krumins, M. (2008). Engaging academics in developing excellence: Releasing creativity through reward and recognition. Journal of Further & Higher Education, 32(4), 415–426.

Brill, J. M., Bishop, M. J., & Walker, A. E. (2006). The competencies and characteristics required of an effective project manager: A web-based Delphi study. Educational Technology Research & Development, 54(2), 115–140.

Clark, K. (2006). Practices for the use of technology in high schools: A Delphi study. Journal of Technology & Teacher Education, 14(3), 481–499.

Conole, G., Galley, R., & Culver, J. (2011). Frameworks for understanding the nature of interactions, networking, and community in a social networking site for academic practice. International Review of Research in Open & Distance Learning, 12(3), 119–138.

Darwin, A., & Palmer, E. (2009). Mentoring circles in higher education. Higher Education Research & Development, 28(2), 125–136.

Delacruz, E. (2009). Old world teaching meets the new digital cultural creatives. International Journal of Art & Design Education, 28(3), 261–268.

Dillon-Marable, E., & Valentine, T. (2006). Optimizing computer technology integration. Adult Basic Education: An Interdisciplinary Journal for Adult Literacy Educational Planning, 16(2), 99–117.

Dinsmore, D., Alexander, P., & Loughlin, S. (2008). The impact of new learning environments in an engineering design course. Instructional Science, 36(5/6), 375–393.

Donato, E., Hudyma, S., Carter, L., & Schroeder, C. (2010). The evolution of WebCT in a baccalaureate nursing program: An Alice in Wonderland reflection. Journal of Distance Education, 24(3). Retrieved from http://www.jofde.ca/index.php/jde/article/view/702/1163

El-Hussein, M., & Cronje, J. C. (2010). Defining mobile learning in the higher education landscape. Journal of Educational Technology & Society, 13(3), 12–21.

Herring, M. C. (2004). Development of constructivist-based distance learning environments. Quarterly Review of Distance Education, 5(4), 231–242.

Kear, K., Woodthorpe, J., Robertson, S., & Hutchison, M. (2010). From forums to Wikis: Perspectives on tools for collaboration. Internet and Higher Education, 13(4), 218–225.

Kirschner, P., Carr, C., van Merrienboer, J., & Sloep, P. (2002). How expert designers design. Performance Improvement Quarterly, 15(4), 86–104.

Kurubacak, G. (2007). Identifying research priorities and needs in mobile learning technologies for distance education: A Delphi study. International Journal of Teaching & Learning in Higher Education, 19(3), 216–227.

Larson, M., & Lockee, B. (2009). Preparing instructional designers for different career environments: A case study. Educational Technology Research & Development, 57(1), 1–24.

Linstone, H. A., & Turoff, M. (Eds.). (1975). The Delphi method: Techniques and applications. Reading, MA: Addison-Wesley.

Linstone, H. A., & Turoff, M. (Eds.). (2002). The Delphi method: Techniques and applications. Retrieved from https://edtechbooks.org/-nQ

MacKeogh, K., & Fox, S. (2009). Strategies for embedding e-learning in traditional universities: Drivers and barriers. Electronic Journal of e-Learning, 7(2), 147–153.

Nicolle, P. S., & Lou, Y. (2008). Technology adoption into teaching and learning by mainstream university faculty: A mixed methodology study revealing the “How, When, Why, and Why not”. Journal of Educational Computing Research, 39(3), 235–265.

Pollard, C., & Pollard, R. (2004). Research priorities in educational technology: A Delphi study. Journal of Research on Technology in Education, 37(2), 145–160.

Renes, S., & Strange, A. (2011). Using technology to enhance higher education. Innovative Higher Education, 36(3), 203–213.

Rice, K. (2009). Priorities in K–12 distance education: A Delphi study examining multiple perspectives on policy, practice, and research. Journal of Educational Technology & Society, 12(3), 163–177.

Samarawickrema, G., Benson, R., & Brack, C. (2010). Different spaces: Staff development for Web 2.0. Australasian Journal of Educational Technology, 26(1), 44–49.

Schwier, R. A., & Wilson, J. R. (2010). Unconventional roles and activities identified by instructional designers. Contemporary Educational Technology, 1(2), 134–147.

Scott, D. G., Washer, B. A., & Wright, M. D. (2006). A Delphi study to identify recommended biotechnology competencies for first-year/initially certified technology education teachers. Journal of Technology Education, 17(2), 44–56.

Shibley, I., Amaral, K. E., Shank, J. D., & Shibley, L. R. (2011). Designing a blended course: Using ADDIE to guide instructional design. Journal of College Science Teaching, 40(6), 80–85.

Slagter van Tryon, P. J., & Bishop, M. J. (2009). Theoretical foundations for enhancing social connectedness in online learning environments. Distance Education, 30(3), 291–315.

Smith, D. W., Kelley, P., Maushak, N. J., Griffin-Shirley, N., & Lan, W. Y. (2009). Assistive technology competencies for teachers of students with visual impairments. Journal of Visual Impairment & Blindness, 103(8), 457–469.

So, H., & Bonk, C. J. (2010). Examining the roles of blended learning approaches in Computer-Supported Collaborative Learning (CSCL) environments: A Delphi study. Journal of Educational Technology & Society, 13(3), 189–200.

Stav, J., Nielsen, K., Hansen-Nygård, G., & Thorseth, T. (2010). Experiences obtained with integration of student response systems for iPod Touch and iPhone into e-Learning environments. Electronic Journal of e-Learning, 8(2), 179–190.

Steel, C. (2009). Reconciling university teacher beliefs to create learning designs for LMS environments. Australasian Journal of Educational Technology, 25(3), 399–420.

Sugar, W., Brown, A., & Daniels, L. (2009). Identifying entry-level multimedia production competencies and skills of instructional design and technology professionals: Results from the 2009–2010 biennial survey. Presented at the annual conference of the Association for Educational Communications and Technology (AECT), Louisville, Kentucky.

Sugar, W., Hoard, B., Brown, A., & Daniels, L. (2011). Identifying multimedia production competencies and skills of Instructional Design and Technology professionals: Results from recent job postings. Presented at the annual conference of the Association for Educational Communications and Technology (AECT), Jacksonville, Florida.

Willis, J. (2009). Pedagogical ID versus process ID: Two Perspectives in contemporary instructional design theory. International Journal of Technology in Teaching & Learning, 5(2), 93–105.

Wolf, P. (2007). A model for facilitating curriculum development in higher education: A faculty-Driven, data-informed, and educational developer–supported approach. New Directions for Teaching & Learning, 112, 15–20.

Please complete this short survey to provide feedback on this chapter: http://bit.ly/IDSkills
William Sugar

East Carolina University

Dr. William Sugar is a professor in East Carolina University’s Instructional Technology program. His research interests focus on delineating professional instructional designer competencies. His current research concentrates on identifying the impact of instructional technology on one’s beliefs, as well as on developing alternative instructional methods based on oral history practices.
Abbie Brown

East Carolina University

Dr. Abbie H. Brown is an award-winning educator and scholar who helps people make the best use of technology innovations in instructional settings. He engages in fieldwork with innovative technologies in educational settings and studies the trends and issues that influence the field of Instructional Design/Technology. He specializes in online learning and provides consultation, presentations, and workshops for school districts, colleges and universities, the U.S. government, and businesses worldwide. He has served as Professor and Chair of the Department of Mathematics, Science, and Instructional Technology Education at East Carolina University; is the author, with Timothy Green, of The Essentials of Instructional Design and the Trends & Issues in Instructional Design, Educational Technology, & Learning Sciences Podcast; and is a former Editor-in-Chief of the journal TechTrends.

Lee Daniels

City of Kingsport

Lee Daniels has extensive experience in instructional design and technology as a professor, project manager, instructional designer, and researcher. He has developed instruction across multiple platforms for the higher education, K-12, medical, and industry sectors, and is currently the Director of Training for the city of Kingsport, Tennessee.

Brent Hoard

Randolph-Macon College

Brent Hoard is an instructional designer and e-learning developer with expert-level knowledge of enterprise project management, solution analysis and development, and human performance support. He possesses more than a decade of managerial experience, including budget and resource management, and summative/confirmative outcome analysis. He is currently the Director of Web Services at Randolph-Macon College.
