Human Performance Technology for Instructional Designers

The overarching goal for instructional designers is to enhance learning and performance outcomes, often achieved through needs assessments, learner analyses, and the development of targeted instructional materials. This chapter delves into the foundational principles of human performance technology (HPT) and differentiates between HPT and instructional design. The role of performance analysis is explored, focusing on organizational, environmental, and cause analyses to identify performance gaps and root causes. The role of non-instructional interventions, such as organizational design, job analysis, and feedback systems, in supporting organizational infrastructure is also discussed. Recommendations for integrating HPT principles into instructional design through a systems approach are provided.

The goal for all instructional designers is to facilitate learning and improve performance, regardless of learning environments and assigned tasks. When working within professional organizations in particular, the goal is often to develop interventions that yield measurable improvements in employee performance. This may be accomplished by conducting needs assessments and learner analyses, designing and developing instructional materials to address a gap in performance, validating instructional materials, developing evaluation instruments to measure the impact of learning, and conducting evaluations to determine to what extent the instructional materials have met their intended use.

Depending on their level of involvement in implementing change within their organization, instructional designers may need to apply concepts from the field of human performance technology. By definition, “human performance technology (HPT) is the study and ethical practice of improving productivity in organizations by designing and developing effective interventions that are results-oriented, comprehensive, and systemic” (Pershing, 2006, p. 6). The purpose of this chapter is to provide an overview of how instructional designers can integrate aspects of human performance technology into their instructional design activities. We differentiate between human performance technology and instructional design and provide examples of HPT applications in the real world. We conclude with an overview of professional resources related to the topic of human performance technology.

Differentiating Between Human Performance Technology and Instructional Design

Human performance technology emerged in the 1960s, with publications and research promoting systematic processes for improving performance gaining traction in the 1970s. The foundations of human performance technology are grounded in behaviorism, with the father of HPT, Thomas Gilbert, being a student of B. F. Skinner. Seminal works of human performance technology include Gilbert’s (1978) Behavioral Engineering Model, Rummler’s (1972) anatomy of performance, Mager and Pipe’s (1970) early introduction of measurable learning objectives, and Harless’ (1973) approach to systematic instruction in the workplace. All of these contributions were grounded in behaviorism and sought to create a systematic approach to measuring employee performance in the workplace. While these concepts can be applied to school settings, the majority of research exploring the application of human performance technology strategies has been conducted in workplace environments.

When differentiating between human performance technology and instructional design, HPT focuses on applying systematic and systemic processes throughout a system to improve performance. Emphasis is placed on analyzing performance at multiple levels within an organization and understanding what processes are needed for the organization to work most effectively. Systemic solutions take into account how the various functions of an organization interact and align with one another. Through organizational analyses, performance technologists are able to identify gaps in performance and create systematic solutions (Burner, 2010).

While instruction may be one of the strategies created as a result of a performance analysis, it is often coupled with other non-instructional strategies. Depending on an instructional designer’s role in a project or organization, they may not be heavily involved in conducting performance assessments. When given the opportunity, it is good practice to understand how performance is being assessed within the organization in order to align the instructional solutions with other solutions and strategies.

While human performance technology and instructional design have two different emphases, they do share four commonalities: (1) evidence-based practices; (2) goals, standards, and codes of ethics; (3) systemic and systematic processes; and (4) formative, summative, and confirmative evaluations (Foshay et al., 2014). Table 1 provides an overview of how these four commonalities are applied in human performance technology and instructional design.

Table 1. Four commonalities shared across human performance technology and instructional design
| Commonalities | Human Performance Technology | Instructional Design |
|---|---|---|
| Evidence-based practices | Organizational analyses are conducted to collect data from multiple sources to evaluate workplace performance. | Emphasis is placed on learner assessment to ensure instruction has been successful. |
| Goals, standards, and codes of ethics | ISPI and ATD are two professional organizations that have created workplace standards and professional certification programs. | AECT and ATD are two professional organizations that have created standards for learning and performance. |
| Systematic and systemic processes | Systematic frameworks have been designed to conduct needs assessments and other performance analyses throughout various levels of an organization. | Systematic instructional design models have been designed to guide the design of instruction for a variety of contexts. |
| Formative, summative, and confirmative evaluations | Multiple evaluation methods are utilized to measure workplace performance throughout the organization. | Multiple assessments are conducted throughout the design phase of instruction as well as afterwards to ensure the instructional solutions have been successful. |

The Role of Systems in Instructional Design Practice

Instructional designers understand that anytime they are designing, they are operating within a system. Many of our instructional design models, for example, promote a systematic process and take into account a variety of elements that must be considered for design (Dick et al., 2009; Merrill, 2002; Smith & Ragan, 2005). Similarly, human performance technology originates not only from behavioral psychology but also from general systems theory. “General systems theory refers to one way of viewing our environment” (Richey et al., 2011, p. 11). Through this theoretical lens, instructional designers or performance technologists must take into account the whole environment and organization in which they are working.

In general terms, a “system is a set of objects together with relationships between the objects and between their attributes” (Hall & Fagen, 1975, p. 52). Systems can be open or closed (Bertalanffy, 1968). Open systems rely on other systems and can be modified by actions occurring outside of the system. Closed systems are contained and demonstrate resistance to changes or actions occurring outside the system in order to preserve their current state (Richey et al., 2011). Examples of systems could include the instructional design or training department within a larger organization. While the department is a system, it is also a subsystem functioning within something much larger. In addition, those receiving human performance training also work within systems. For example, an instructional designer may be asked to provide training based on values espoused by the CEO that may conflict with the culture of an individual department in the organization. Other times, they may be asked to identify other instructional solutions to address performance gaps identified in a needs assessment. Or they may seek to improve employees’ performance in one area when that performance depends on the success of another department in the organization, something outside of the employees’ control. Thus, seeking to improve organizational performance requires a broader understanding of the organization than is sometimes typical in instructional design practice.

Systems thinking impacts instructional design practices by promoting systematic and systemic processes over narrower solutions. A systems view has three characteristics:

  1. “It is holistic.
  2. It focuses primarily on the interactions among the elements rather than the elements themselves.
  3. It views systems as ‘nested’ with larger systems made up of smaller ones” (Foshay et al., 2014, p. 42).

These characteristics affect instructional design practices in a variety of ways. Designers must take into account the holistic nature of the system and consider how all of the elements within it affect learning. This means considering not only the specific instructional tasks that learners are currently completing, but also the various layers of the organization, including the people, politics, organizational culture, and resources; in other words, the inputs and outputs that are driving the development and implementation of a project (Rummler & Brache, 2013). Regardless of their role on a project, instructional designers must be aware of the various components within their system and how those components affect the instruction they create. For example, an instructional designer may be asked by the senior leadership of an organization to develop health and safety training for employees working on the frontline of a manufacturing plant. It would be advantageous to understand the unique tasks and nuances associated with the frontline work responsibilities to ensure the training will be beneficial to the employees. Another example where it would be important for an instructional designer to be aware of an organization’s system or subsystems would be if they were asked to design instruction for a company that has multiple locations across the country or world. The instructional designer should clarify whether there are distinct differences (e.g., organizational culture, politics, processes) among these various locations and how these differences may impact the results of training.

Think About It!

Think about an organization that is familiar to you. What is its purpose? Who are the people that belong to that organization? What are the functions of that organization? How do the various people and functions interact?

In addition, considering that the fundamental goals of instructional design are to facilitate learning and improve performance, instructional designers working within organizations should strive to create design solutions that promote sustainability. As stated in the second system characteristic, it is important not only to be aware of the various elements within a system, but also to develop an understanding of how they interact with each other. The instructional designer should be aware of how their work may influence or affect, positively or negatively, other aspects of the organization. For example, if an organization is preparing to launch training on a new organizational philosophy, how will that be perceived by other departments or divisions within the organization? If an organization is changing its training methods from instructor-led formats to primarily online learning formats, what considerations must the instructional design team be aware of to ensure a smooth transition?

LIDT in the World

Emma is a principal at a public middle school. Teacher turnover has steadily increased over the last three years at Emma’s school. After reviewing personnel files, Emma found that most teachers who left were in their first or second year of teaching. Given that most of the teachers who left Emma’s school were new teachers, what factors might have contributed to teacher turnover? What other systemic factors could Emma consider?

Does the organization have the infrastructure to support online learning for the entire organization? Is the information technology department equipped to upload resources and manage any technological challenges that may arise over time? Does the current face-to-face training provide opportunities for relationship-building that may not seem critical to the learning but are important to the health and performance of the organization? If so, how can this be accounted for online? These are examples of questions an instructional designer may ask in order to take a broader view of their instruction beyond just whether it achieves learning outcomes.

Performance Analysis

Regardless of context or industry, all instructional design projects fulfill one of three needs within organizations: (1) addressing a problem; (2) embracing quality improvement initiatives; and (3) developing new opportunities for growth (Pershing, 2006). The instructional designer must be able to validate project needs by effectively completing a performance analysis to understand the contextual factors contributing to performance problems. This allows the instructional designer to appropriately identify and design solutions that will address the need in the organization—what is often called the performance gap or opportunity.

The purpose of performance analysis is to assess the desired performance state of an organization and compare it to the actual performance state (Burner, 2010; Rummler, 2006). If any differences exist, it is important for the performance improvement consultant (who may sometimes serve as the instructional designer as well) to identify the necessary interventions to remove the gap between the desired and actual states of performance.
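
To make the desired-versus-actual comparison concrete, the following is a minimal sketch in Python of tabulating gaps between the two states; the metric names and values are hypothetical and not drawn from this chapter:

```python
# Minimal sketch: flagging gaps between desired and actual performance states.
# Metric names and values are hypothetical examples.

desired = {
    "orders_processed_per_hour": 40,
    "error_rate_pct": 2.0,
    "safety_incidents_per_quarter": 0,
}
actual = {
    "orders_processed_per_hour": 31,
    "error_rate_pct": 5.5,
    "safety_incidents_per_quarter": 3,
}

for metric, target in desired.items():
    gap = actual[metric] - target
    if gap:  # any nonzero difference marks a candidate for cause analysis
        print(f"{metric}: desired {target}, actual {actual[metric]}, gap {gap:+}")
```

In practice, the desired state comes from the organizational analysis, and each flagged gap feeds the cause analysis described later in this chapter.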

Performance analysis can occur in multiple ways, focusing on the organization as a whole or one specific unit or function. Organizational analysis consists of “an examination of the components that strategic plans are made of. This phase analyzes the organization’s vision, mission, values, goals, strategies and critical business issues” (Van Tiem et al., 2012, p. 133). Items that are examined in close detail when conducting an organizational analysis include organizational structure, centrally controlled systems, corporate strategies, key policies, business values, and corporate culture (Tosti & Jackson, 1997). All of these can impact the sustainability of instructional design projects either positively or negatively.

An environmental analysis not only dissects individual and organizational performance but also expands to assess the impact that performance may have outside the system. Rothwell (2005) proposed a tiered environmental analysis that explores performance through four lenses: workers, work, workplace, and world. The worker level dissects the knowledge, skills, and attitudes required of the employee (or performer) to complete the tasks; it assesses the skill sets that an organization’s workforce possesses. The work lens examines workflow and procedures: how the work moves through the organizational system. The workplace lens takes into account the organizational infrastructure that is in place to support the work and workers. Examples of items taken into consideration at this phase include checking to see whether an organization’s strategic plan informs daily work practices, the resources provided to support work functions throughout the organization, and the tools that employees are equipped with to complete their work (Van Tiem et al., 2012). World analysis expands even further to consider performance outside of the organization, in the marketplace or society. For example, an organization might consider the societal benefits of its products or services.

While instructional designers do not have to be experts in organizational design and performance analysis, they should be fluent enough in these practices to understand how various types of performance analyses may influence their work. Whether an analysis is limited to individual performance, organizational performance, or environmental performance, all of these analyses seek to understand the degree to which elements within the system are interacting with one another. They vary in scope and goals: interactions may involve elements of one subsystem of an organization or multiple subsystems (layers) within an organization. For example, an instructional design program would be considered a subsystem of a department with multiple programs or majors. The department would be another system that falls under a college, and a university would be composed of multiple colleges, each representing a subsystem within a larger system.
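
The nesting described above can be pictured as a simple tree. Below is a minimal sketch in Python, using the university example from this paragraph (the class and method names are illustrative, not a standard model), of walking from a subsystem outward to the largest system that contains it:

```python
# Minimal sketch of "nested" systems: each unit is a subsystem of a larger one.

from dataclasses import dataclass, field

@dataclass
class System:
    name: str
    parent: "System | None" = None
    subsystems: list["System"] = field(default_factory=list)

    def add(self, child: "System") -> "System":
        child.parent = self
        self.subsystems.append(child)
        return child

    def containing_systems(self) -> list[str]:
        """Walk outward from this subsystem to the largest enclosing system."""
        chain, node = [], self
        while node is not None:
            chain.append(node.name)
            node = node.parent
        return chain

university = System("University")
college = university.add(System("College of Education"))
department = college.add(System("Department"))
program = department.add(System("Instructional Design Program"))

print(" -> ".join(program.containing_systems()))
# Instructional Design Program -> Department -> College of Education -> University
```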

Cause Analysis

A large part of human performance technology is analyzing organizational systems and work environments to improve performance. While performance analysis helps to identify performance gaps occurring in an organization, it is important to identify the causes that are contributing to those performance gaps. The goal of cause analysis is to identify the root causes of performance gaps and identify appropriate sustainable solutions.

While conducting a cause analysis, a performance technologist will consider the severity of the problems or performance gaps, examine what types of environmental supports are currently in place (e.g., training, resources for employees), and assess the skill sets of employees (Gilbert, 1978). The performance technologist engages in troubleshooting by examining the problem from multiple viewpoints to determine what is contributing to the performance deficiencies (Chevalier, 2003).
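
As a rough illustration of this troubleshooting step, the sketch below sorts observed causes into the six cells of Gilbert’s (1978) Behavioral Engineering Model: environmental supports versus the person’s repertory, crossed with information, instrumentation, and motivation. The cell descriptions paraphrase the model, and the observed causes are invented for the example:

```python
# Minimal sketch: classifying causes of a performance gap into the six cells
# of Gilbert's (1978) Behavioral Engineering Model. Cell descriptions
# paraphrase the model; the observed causes are hypothetical.

BEM_CELLS = {
    ("environment", "information"): "Data: expectations, feedback, guides",
    ("environment", "instrumentation"): "Instruments: tools, resources, processes",
    ("environment", "motivation"): "Incentives: consequences and rewards",
    ("person", "information"): "Knowledge: training and skills",
    ("person", "instrumentation"): "Capacity: ability to perform",
    ("person", "motivation"): "Motives: willingness to perform",
}

observed_causes = [
    ("environment", "information", "Frontline staff receive no performance feedback"),
    ("environment", "instrumentation", "Outdated equipment slows order processing"),
    ("person", "information", "New hires were never trained on the revised safety procedure"),
]

for level, factor, cause in observed_causes:
    print(f"[{level}/{factor}] {BEM_CELLS[(level, factor)]} -> {cause}")
```

Sorting causes this way makes it easier to see which gaps call for training and which call for the non-instructional interventions discussed next.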

Non-instructional Interventions

Once a performance technologist has identified the performance gaps and opportunities, they create interventions to improve performance. “Interventions are deliberate, conscious acts that facilitate change in performance” (Van Tiem et al., 2012, p. 195). Interventions can be classified as either instructional or non-instructional. As mentioned in the discussion of general systems theory, it is imperative that instructional designers be aware of how they interact with the various elements within their system. In order to maintain positive interactions between these organizational elements, non-instructional interventions are often needed to create a supportive infrastructure. Considering politics within an organization, and promoting an organizational culture that is valued by all departments and individuals within the system and carried out in processes and services, are examples of the infrastructural support needed for an organization (or system) to be successful. While there are a variety of different strategies that may be carried out to promote stability within an organization, the non-instructional strategies most commonly seen by instructional designers include job analysis, organizational design, communication planning, feedback systems, and knowledge management. Table 2 provides examples of how non-instructional strategies may benefit the instructional design process.

Table 2. Non-instructional strategies
| Non-Instructional Strategies | Benefit to the Instructional Design Process |
|---|---|
| Job analysis | Up-to-date job descriptions with complete task analyses will provide a detailed account of how to perform the tasks conveyed in training. |
| Organizational design | A plan that outlines the organizational infrastructure of a company. Details are provided to demonstrate how different units interact and function with one another in the organization. |
| Communication planning | Plans that detail how new initiatives or information are communicated to employees. Examples may include listservs, performance reviews, and employee feedback. |
| Feedback systems | Detailed plans to provide employees feedback on their work performance. This information may be used to identify individual training needs and opportunities for promotion. |
| Knowledge management | Installation of learning management systems to track learning initiatives throughout the organization. Electronic performance support systems are used to provide just-in-time resources to employees. |

Organizational design and job analysis are two non-instructional interventions that instructional designers should be especially familiar with, particularly if they are involved with projects that will result in large-scale changes within an organization. They should have a solid understanding of the various functions and departments within the organization and the interactions that take place among them. Organizational design involves the process of identifying the necessary organizational structure to support workflow processes and procedures (Burton et al., 2015). Examples include distinguishing the roles and responsibilities to be carried out by individual departments or work units, determining whether an organization will have multiple levels of management or a more decentralized approach to leadership, and determining how these departments work together in the larger system.

Job analysis is another area that can have long-term implications for instructional interventions. A job analysis is the process of dissecting the knowledge, skills, and abilities required to carry out the job functions listed under a job description (Fine & Getkate, 2014). Oftentimes, a task analysis is conducted to gain a better understanding of the minute details of the job in order to identify what needs to be conveyed through training (Jonassen et al., 1999). If job analyses are outdated or have never been conducted, there is a very good chance that there will be a misalignment between the instructional materials and performance expectations, thus defeating the purpose of training.
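
The misalignment risk described above can be checked mechanically once a job analysis exists. The following is a minimal sketch in Python, with hypothetical task and skill names, that compares the knowledge and skills a job analysis requires against what the current training covers:

```python
# Minimal sketch: checking alignment between a job analysis and training
# coverage. Task names and skill labels are hypothetical examples.

job_analysis = {
    "Process customer order": {"order-entry software", "product codes"},
    "Resolve billing dispute": {"billing policy", "de-escalation techniques"},
}

training_covers = {"order-entry software", "product codes", "billing policy"}

for task, required_skills in job_analysis.items():
    missing = required_skills - training_covers
    if missing:
        print(f"'{task}' has untrained requirements: {sorted(missing)}")
# -> 'Resolve billing dispute' has untrained requirements: ['de-escalation techniques']
```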

Feedback systems are often put in place by organizations to provide employees with a frame of reference with regard to how they are performing in their respective roles (Schartel, 2012). Feedback, when given properly, can “invoke performance improvement by providing performers the necessary information to modify performance accordingly” (Ross & Stefaniak, 2018, p. 8). Gilbert’s (1978) Behavioral Engineering Model is a commonly referenced feedback analysis tool used by practitioners to assess performance and provide feedback, as it captures data not only at the performer level but also at the organizational level. This helps managers and supervisors determine the degree of alignment between the various elements in the organization impacting performance (Marker, 2007).

The most recognizable non-instructional interventions may be electronic performance support systems (EPSSs) and knowledge management systems. These are structures put in place to support the training and performance functions of an organization. Oftentimes EPSSs are used as a hub to house training and supports for an employee. Examples extend beyond e-learning modules to also include job aids, policies and procedures, informative tools or applications, and other just-in-time supports that an employee may need to complete a task. Knowledge management systems serve as a repository to provide task-structuring support as well as guidance and tracking of learning activities assigned or provided to employees (Van Tiem et al., 2012).
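
The just-in-time idea behind an EPSS amounts to a lookup from the task at hand to the supports available for it. A minimal sketch, with hypothetical tasks and resources:

```python
# Minimal sketch: just-in-time support lookup, as an EPSS might provide.
# Task names and resource titles are hypothetical examples.

epss_resources = {
    "submit expense report": ["Job aid: expense codes", "Policy: travel reimbursement"],
    "onboard new client": ["Checklist: client intake", "Template: kickoff email"],
}

def supports_for(task: str) -> list[str]:
    """Return the supports for a task, or a fallback if none are registered."""
    return epss_resources.get(task.lower(), ["No resources found; contact the help desk"])

print(supports_for("Submit expense report"))
# ['Job aid: expense codes', 'Policy: travel reimbursement']
```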

Other examples of supportive systems include communities of practice and social forums where employees can seek out resources on an as-needed basis. Communities of practice are used to bring together employees or individuals who perform similar tasks or have shared common interests (Davies et al., 2017; Wenger, 2000; Wenger et al., 2002). When selecting an intervention, it is important to select something that is going to solve the problem or address a particular need of the organization. Gathering commitment from leadership to implement the intervention and securing buy-in from other members of the organization that the intervention will work is also very important (Rummler & Brache, 2013; Spitzer, 1992; Van Tiem et al., 2012).

LIDT in the World

Ilona is the Training and Development Manager for a Fortune 500 medical informatics company. In her role, she leads a team of instructional designers that provide training and resource materials for their customers. The company is finalizing a new mobile technology product that they intend to launch in the next six months.

The majority of training materials have been designed for on-site instructor-led tutorials as health care organizations implement the new software package. Ilona’s director of strategic projects has asked Ilona to create a plan for implementing instructor-led and asynchronous training that can be accessed on a mobile platform. What should Ilona consider as she maps out her plan for her team?

Whether the intervention to improve performance is instructional or non-instructional, Spitzer (1992) identified 11 criteria for determining whether an intervention is successful. Table 3 shows these criteria and provides examples of questions instructional designers should ask when selecting or designing interventions.

Table 3. Examples of questions to ask when selecting or designing interventions
Spitzer's (1992) 11 CriteriaQuestions to Ask
Design should be based on a comprehensive understanding of the situation.
  • Will the intervention align with the performance gaps and opportunities identified by the cause analysis?
Interventions should be carefully targeted.Will the intervention target...
  • The right people,
  • In the right setting,
  • At the right time?
An intervention should have a sponsor.
  • What individual or groups within the organization will champion this intervention?
Interventions should be designed with a team approach.
  • Will stakeholders be involved in designing this intervention?
  • Will the intervention consider the expertise of other individuals within the organization?
Intervention design should be cost-sensitive
  • Will the intervention be the most cost-effective option?
  • Will all costs (finances, time, labor-force, etc.) be considered?
Interventions should be designed on the basis of comprehensive, prioritized requirements, based on what is most important to both the individual and the organization.
  • Will there be alignment between the intervention and the priorities of the organization and stakeholders at various levels?
A variety of intervention options should be investigated because the creation of a new intervention can be costly.
  • Will various intervention options be considered before a decision is made?
Interventions should be sufficiently powerful.
  • What will be the short-term effectiveness of the intervention?
  • What will be the long-term effectiveness of the intervention?
Interventions should be sustainable.
  • How will this intervention be embedded in the organizational culture over time?
Interventions should be designed with viability of development and implementations in mind.
  • What human and organizational resources will support this intervention throughout implementation and over time?
Interventions should be designed using an iterative approach.
  • What formative strategies will be used to evaluate the intervention?
  • How many revisions will be necessary?
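
As a rough illustration, these criteria can be treated as a screening checklist when comparing candidate interventions. The sketch below uses hypothetical interventions and yes/no judgments; in practice each judgment would rest on analysis data rather than a simple count:

```python
# Minimal sketch: screening candidate interventions against a checklist
# distilled from Spitzer's (1992) criteria. The candidates and the criteria
# they satisfy are hypothetical examples.

CRITERIA = [
    "comprehensive understanding of the situation",
    "carefully targeted",
    "has a sponsor",
    "designed with a team approach",
    "cost-sensitive",
    "based on comprehensive, prioritized requirements",
    "alternatives investigated",
    "sufficiently powerful",
    "sustainable",
    "viable to develop and implement",
    "designed iteratively",
]

candidates = {
    "Standalone e-learning module": {
        "carefully targeted", "cost-sensitive", "viable to develop and implement",
    },
    "Job aid plus feedback system": {
        "carefully targeted", "has a sponsor", "cost-sensitive", "sustainable",
        "based on comprehensive, prioritized requirements",
        "viable to develop and implement",
    },
}

for name, satisfied in candidates.items():
    print(f"{name}: {len(satisfied)}/{len(CRITERIA)} criteria met")
    for criterion in CRITERIA:
        mark = "x" if criterion in satisfied else " "
        print(f"  [{mark}] {criterion}")
```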

Leveraging Human Performance Technology to Support Inclusive Design

One particular challenge in the field of learning, design, and technology is that not all instructional designers are aware of the impact that HPT strategies can have on their design work. Not all instructional design training programs incorporate human performance technology coursework. Oftentimes, instructional design students’ introduction to needs assessment and evaluation does not extend beyond a learner analysis and an evaluative assessment at the end of an e-learning module.

HPT strategies can be used to help instructional designers support and promote inclusive design. If an instructional designer adheres to Spitzer’s (1992) criteria for designing a performance intervention, they will develop a sufficient understanding of the situation and be able to prioritize their design practices accordingly.

To date, researchers have begun to look at how HPT can be leveraged to support instructional design through the adoption of an organizational justice lens (Giacumo et al., 2021). By applying this lens to typical HPT activities (e.g., cause analysis, needs assessment, environmental analysis, organizational analysis, intervention selection and design, and evaluation), instructional designers are positioned to address issues such as fairness, equity, and ethical behavior in different contexts (Cropanzano & Stein, 2009; Stefaniak & Pinckney-Lewis, in press).

The processes of collecting and analyzing data to understand situations warranting instructional design support are fundamental to HPT practice. For several decades, strategies have been offered to provide recommendations for how best to approach and account for the layers and complexities inherent in most organizations. It is also important that individuals responsible for collecting data gather appropriate data that provide sufficient information pertaining to the cultural needs of their audience (Asino & Giacumo, 2019).

In an effort to triangulate data sources that capture discrepancies within and amongst the different cultures and populations that may exist within the system (organization) being examined, Peters and Giacumo (2020) proffer guidelines to support cross-cultural data collection. They recommend that individuals responsible for data gathering demonstrate respect for cultural beliefs, allow additional time to gather information and understand the situation, build trust with members of the population, and take a participatory approach that includes members of the population in the process. These cross-cultural HPT practices can extend to all facets of instructional design, thus promoting inclusive design environments in which learners have a significant role in the interventions being designed to support their needs.

Conclusion

While it is not necessary for instructional designers to engage in human performance technology, they may frequently find themselves working more like performance technologists than they originally supposed they would. In addition, those who apply human performance technology thinking may be better positioned to design sustainable solutions in whatever organization or system they work within. Human performance technology offers a systems view that allows the instructional designer to consider the broader implications of their design decisions and actions. By recognizing the systemic implications of their actions, they may be more inclined to implement needs assessment and evaluation processes to ensure they are addressing organizational constraints while adding value. With the growing emphasis on design thinking in the field of instructional design, we, as a field, are becoming more open to learning about how other design fields (e.g., graphic design, architecture, and engineering) can influence our practice, and human performance technology, as a design field in its own right, is one more discipline that can improve how we do our work as instructional designers.

Recommended Readings

There are a variety of resources available for instructional designers who are interested in learning more about how they can utilize concepts of human performance technology in their daily practice. This section provides an overview of journals and important books related to the field.

Books

Compared to other disciplines, human performance technology is considered a relatively young field, dating back to the early 1960s. The following is a list of books that may be of interest to individuals who want to learn more about human performance technology:

Journals

While a number of instructional design journals publish articles on trends related to performance improvement, the following is a list of academic journals focused specifically on human performance technology:

Additional Reading

Another useful chapter on performance technology is available in The Foundations of Instructional Technology, available at https://edtechbooks.org/-cx

Stefaniak, J. (2018). Performance technology. In R.E. West (Ed.), Foundations of learning and instructional design technology: The past, present, and future of learning and instructional design technology. EdTech Books. https://edtechbooks.org/lidtfoundations/performance_technology

References

Asino, T. I., & Giacumo, L. A. (2019). Culture and global workplace learning: Foundations of cross-cultural design theories and models. In V. H. Kenon & S. V. Pasole (Eds.), The Wiley handbook of global workplace learning (pp. 395–412). Wiley.

Bertalanffy, L. von. (1968). General system theory: Foundations, development, applications. George Braziller.

Burner, K. J. (2010). From performance analysis to training needs assessment. In K. H. Silber & W. R. Foshay (Eds.), Handbook of improving performance in the workplace: Instructional design and training delivery (vol. 1, pp. 144–183). Pfeiffer.

Burton, R. M., Obel, B., & Håkonsson, D. D. (2015). Organizational design: A step-by-step approach. Cambridge University Press.

Chevalier, R. (2003). Updating the behavior engineering model. Performance Improvement, 42(5), 8–14. https://doi.org/10.1002/pfi.4930420504

Cropanzano, R., & Stein, J. H. (2009). Organizational justice and behavioral ethics: Promises and prospects. Business Ethics Quarterly, 19(2), 193–233. https://doi.org/10.5840/beq200919211

Davies, C., Hart, A., Eryigit-Madzwamuse, S., Stubbs, C., Aumann, K., & Aranda, K. (2017). Communities of practice in community-university engagement: Supporting co-productive resilience research and practice. In J. McDonald & A. Cater-Steel (Eds.), Communities of practice: Facilitating social learning in higher education (pp. 175–198). Springer.

Dick, W., Carey, L., & Carey, J. O. (2009). The systematic design of instruction (7th ed.). Pearson.

Fine, S. A., & Getkate, M. (2014). Benchmark tasks for job analysis: A guide for functional job analysis (FJA) scales. Psychology Press.

Foshay, W. R., Villachica, S. W., & Stepich, D. A. (2014). Cousins but not twins: Instructional design and human performance technology in the workplace. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 39–49). Springer.

Giacumo, L. A., MacDonald, M., & Peters, D. (2021). Promoting organizational justice in cross-cultural data collection, analysis, and interpretation: Towards an emerging conceptual model. Journal of Applied Instructional Design, 10(4). https://dx.doi.org/10.51869/104/lgi

Gilbert, T. F. (1978). Human competence: Engineering worthy performance. McGraw-Hill.

Hall, A. D., & Fagen, R. E. (1975). Definition of system. In B. D. Ruben & J. Y. Kim (Eds.), General systems theory and human communications (pp. 52–65). Hayden Book Company, Inc.

Harless, J. (1973). An analysis of front-end analysis. Improving Human Performance: A Research Quarterly, 4, 229–244.

Jonassen, D. H., Tessmer, M., & Hannum, W. H. (1999). Task analysis methods for instructional design. Routledge.

Mager, R. F., & Pipe, P. (1970). Analyzing performance problems: Or you really oughta wanna. Fearon Publishers.

Marker, A. (2007). Synchronized analysis model: Linking Gilbert's behavioral engineering model with environmental analysis models. Performance Improvement, 46(1), 26–32. https://doi.org/10.1002/pfi.036

Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development, 50(3), 43–59. https://doi.org/10.1007/BF02505024

Pershing, J. A. (2006). Human performance technology fundamentals. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 5–26). Pfeiffer.

Peters, D. J. T., & Giacumo, L. A. (2020). Ethical and responsible cross-cultural interviewing: Theory to practice guidance for human performance and workplace learning professionals. Performance Improvement, 59(1), 26–34. https://doi.org/10.1002/pfi.21906

Richey, R. C., Klein, J. D., & Tracey, M. W. (2011). The instructional design knowledge base: Theory, research, and practice. Routledge.

Ross, M., & Stefaniak, J. (2018). The use of the behavioral engineering model to examine the training and delivery of feedback. Performance Improvement, 57(8), 7–20. https://doi.org/10.1002/pfi.21786

Rothwell, W. (2005). Beyond training and development: The groundbreaking classic on human performance enhancement (2nd ed.). AMACOM.

Rummler, G. A. (1972). Human performance problems and their solutions. Human Resource Management, 11(4), 2–10. https://doi.org/10.1002/hrm.3930110402

Rummler, G. A. (2006). The anatomy of performance: A framework for consultants. In J. A. Pershing (Ed.), Handbook of human performance technology (3rd ed., pp. 986–1007). Pfeiffer.

Rummler, G. A., & Brache, A. P. (2013). Improving performance: How to manage the white space on the organization chart (3rd ed.). Jossey-Bass.

Schartel, S. A. (2012). Giving feedback—An integral part of education. Best Practice & Research Clinical Anaesthesiology, 26(1), 77–87. https://doi.org/10.1016/j.bpa.2012.02.003

Smith, P. L., & Ragan, T. J. (2005). Instructional design (3rd ed.). Wiley.

Spitzer, D. R. (1992). The design and development of effective interventions. In H. D. Stolovitch & E. J. Keeps (Eds.), Handbook of human performance technology (pp. 114–129). Pfeiffer.

Stefaniak, J., & Pinckney-Lewis, K. (in press). A systemic approach toward needs assessment to promote inclusive learning design. In B. Hokanson, M. Exeter, M. Schmidt, & A. A. Tawfik (Eds.), Educational communications and technology: Issues and innovations. Springer. 

Tosti, D., & Jackson, S. D. (1997). The organizational scan. Performance Improvement, 36(10), 2–26. https://doi.org/10.1002/pfi.4140361007

Van Tiem, D., Moseley, J. L., & Dessinger, J. C. (2012). Fundamentals of performance improvement: A guide to improving people, process, and performance (3rd ed.). Pfeiffer.

Wenger, E. (2000). Communities of practice and social learning systems. Organization, 7(2), 225–246. https://doi.org/10.1177/135050840072002

Wenger, E., McDermott, R. A., & Snyder, W. (2002). Cultivating communities of practice: A guide to managing knowledge. Harvard Business Press.

Jill E. Stefaniak

University of Georgia

Jill Stefaniak is an Associate Professor in the Learning, Design, and Technology program in the Department of Workforce Education and Instructional Technology at the University of Georgia. Her research interests focus on the professional development of instructional designers and design conjecture, designer decision-making processes, and contextual factors influencing design in situated environments.

Lauren Bagdy
Dr. Bagdy studies informal learning, instructional design, program evaluation, K-12 technology integration, social media environments as informal learning spaces, digital and media literacy, teacher professional development, and the evaluation of open educational resources. She received her Ph.D. in Instructional Systems and Learning Technologies from Florida State University.

This content is provided to you freely by EdTech Books.

Access it online or download it at https://edtechbooks.org/foundations_of_learn/hpt.