The ongoing evolution of educational research has given rise to various paradigms and methods that can be loosely organized under two broad categories of qualitative (qual) and quantitative (quan) research. Qualitative generally refers to any methods that rely on subjective interpretations of rich textual or experiential data, whereas quantitative generally refers to methods that rely upon objective analyses of measurable data via descriptive or statistical methods.
Recognizing the relative strengths and weaknesses of both qual and quan approaches, researchers and funding agencies have increasingly utilized and encouraged mixed methods approaches, which can be useful for solving educational problems in more authentic and situated ways and can provide evidence that appeals to a wide array of stakeholders. This appeal seems to stem from a philosophical stance that values methodological pluralism (wherein both qual and quan methods are seen as valid ways of understanding the world) and that assumes some level of commensurability, or the belief that alternate paradigms can work together toward achieving desirable goals (as opposed to rigid incommensurability).
Mixed methods refers to any research approach that utilizes both qual and quan data in some form. However, mixed methods as a descriptor is very broad, and the term alone does not guide researchers through specific processes of research or toward an understanding of how qual and quan methods and data should be used together. In fact, there are many ways of doing mixed-methods research, and each exhibits its own benefits, weaknesses, and pitfalls.
In this chapter, I will first explain two ways in which various mixed-methods approaches may differ from one another: temporal order and emphasis. I will then provide specific examples of mixed-methods designs (including triangulated, explanatory, exploratory, embedded, transformative, and multiphase) that may be useful for guiding researchers in planning exactly how qual and quan methods should interact in their mixed-methods studies. I will conclude by providing some guidelines for conducting good mixed-methods research.
Though all mixed-methods studies involve both qual and quan data, how and when these data are collected and analyzed may differ across studies, with researchers engaging in a parallel (+), linear (→), or iterative process (see Figure 1).
Parallel mixed-methods designs involve collecting and analyzing both qual and quan data in tandem, where each process operates independently of the other. An example of this would be conducting surveys and focus groups simultaneously, analyzing the results of each, and reporting findings in a manner that compares and contrasts them. Such approaches typically involve multiple researchers or teams of researchers who work separately to conduct divergent analyses and who then come together for determining what the results mean in relation to one another. In parallel designs, researchers are essentially conducting separate studies simultaneously that are treated as distinct pieces of evidence for answering the guiding research question.
Linear mixed-methods designs, on the other hand, involve sequential steps that alternate between qual and quan methods, where the second step is informed by or expounds upon what is found in the first. An example of this would be conducting an experimental study comparing student scores on a test (quan) and then interviewing some low-performing students to see why they struggled (qual). An inverted example would be first conducting focus groups with students to discover what they anticipate struggling with on a test (qual) and then designing the test with these considerations in mind and administering it to students (quan). In both cases, there are separate, clearly delineated steps in the data collection and analysis process, and the second step builds upon the first. This notably differs from parallel structures, because what is actually done in the second step might not be clear until after the first step is complete (e.g., what questions will be asked on the test or in interviews), allowing some methods to be refined as the study progresses.
Iterative mixed-methods designs are special cases of linear designs that simply take three or more steps in data collection and analysis, where each step informs the subsequent step. An example of this would be conducting interviews with participants to initially understand a phenomenon (qual), using these results to construct a survey that is then administered to a larger sample (quan), and then doing follow-up interviews with participants who answered the survey in particular ways (qual). In this case, the first qual step is used to inform the design of the survey needed for the quan step, which is taken to generalize results. But then another qual step is taken to clarify or home in on specific topics or nuances that were identified in the quan step. This process can then continue as long as needed (e.g., testing an intervention quantitatively) as each step informs the next.
In practice, the difference between a linear and an iterative design often just comes down to where the line is drawn between studies within the larger scope of a research project or agenda. Given that most research studies are reported in academic journals with restrictive word limits, describing the methods and results of three different stages of data collection and analysis (e.g., qual→quan→qual) may not be feasible. Also, it may just not make sense to report on all stages of a research project in the same report. As a result, an iterative design like qual→quan→qual might be broken up into three single-method studies (e.g., a qual study, a quan study, and another qual study) or some combination of mixed-methods and single-method studies (e.g., a qual→quan mixed-methods study and a qual single-method study). That is to say that even if a project might employ mixed methods overall, this does not mean that all reports would need to be framed as mixed methods.
In addition to temporal order, mixed-methods designs also vary in the relative importance they ascribe to qual vs. quan methods, prioritizing qual (QUAL>quan), prioritizing quan (QUAN>qual), or treating the two approaches with equal weight (QUAL=QUAN). For instance, in a study that measures student performance on a test after playing a learning game, researchers might focus their inquiry on the effects of the game upon measurable learning outcomes (QUAN) while also collecting a few responses from students on what they liked about the game (qual). In contrast, another study might conduct in-depth interviews with students about their experiences with educational games (QUAL) while also collecting and reporting on survey results from all students in a school regarding time spent playing games (quan). In both of these cases, one methodology is prioritized as being central to the study while the other is utilized to contribute slightly to the results (e.g., as isolated comments or contextual descriptive statistics).
In cases where one method is prioritized over another, expectations of scholarly rigor will be more heavily emphasized with the prioritized method. That is, if QUAN is prioritized, then researchers will need to more carefully ensure that they are meeting the expectations of quantitative rigor necessary for generalizability, predictive power, validity, etc. but may give limited attention to expectations of qualitative rigor (e.g., negative case analysis). In contrast, if QUAL is prioritized, then researchers will need to carefully ensure that they are meeting expectations of qualitative rigor necessary for transferability, positionality, trustworthiness, etc. but may give limited attention to expectations of quantitative rigor (e.g., reliability). Such prioritization reveals that though both qual and quan data may be used in the same study, each type of data may be used through the lens of a paradigm that will treat one as more important, useful, or meaningful than the other.
Because qual and quan methods are used to answer different kinds of research questions, prioritizing one over the other will also impact the types of questions we can answer. As a result, many mixed-methods studies will have a primary research question, which will be answered with the prioritized method, and one or more secondary research questions, which will be answered with the deprioritized method. An example of this would be asking a primary question like "What percent of U.S. teens experience bullying in schools?" (QUAN) accompanied by a secondary question like "How do some teens describe the trauma associated with these experiences?" (qual). In this case, each question requires different data and analyses to find an answer, and since the first question is the primary research question, its methods will be prioritized over those of the second.
The upshot of this is that in studies where QUAN and QUAL methods are prioritized equally, researchers must carefully attend to the rigor expectations and requirements of each paradigm, attending to potentially contradictory requirements in the same study (e.g., objectivity vs. positionality, generalizability vs. contextual understanding). An example of this would be doing a generalizable survey of U.S. teachers on the topic of teacher attrition (QUAN) while also conducting in-depth interviews with diverse teachers regarding their experiences leaving the profession (QUAL). Doing this in a single study is very difficult because it may require researchers to operate within multiple epistemological paradigms simultaneously and grapple with how to navigate contradictory evidence and values (e.g., how much attention to give to individual teachers' experiences that do not reflect the norm).
Part of the strength of using both qual and quan methods in the same study is that it allows you to approach problems through multiple lenses, drawing upon the benefits of both approaches. However, sloppy mixed-methods studies that fail to meet the rigor requirements of their constituent paradigms may instead be susceptible to the weaknesses of both, meaning that results are neither generalizable nor transferable. All mixed-methods studies should rigorously adhere to the requirements of at least one approach and (ideally) to both if doing so is possible and appropriate for answering the targeted research questions.
Because mixed-methods designs differ in temporal order (parallel, linear, or iterative) and emphasis (QUAL>quan, QUAN>qual, or QUAL=QUAN), there are at least nine different ways of structuring a mixed-methods study (see Table 1). Some of these are more common than others and have been recognized by researchers as useful in many settings, while others might be theoretically possible but are less commonly used in education research.
Table 1. Mixed Methods Designs as Determined by Methodological Priority and Temporal Order
Triangulated mixed-methods design, also known as convergent parallel design, is done when both QUAL and QUAN data are collected concurrently and are equally prioritized (Creswell, 2008). Collection and analysis of each type of data occur separately, and findings are merged during the interpretation phase to allow for robust conclusions that are supported by multiple data sources. An example of triangulation would be a case study on an intervention's effects on a school where both QUAL and QUAN data are collected from students to determine what is happening in terms of academic performance, behaviors, attitudes, and perceptions. The primary benefit of a triangulated design is that it can draw upon the strengths of both QUAL and QUAN data (e.g., complex understanding and predictive power), but wisdom must be exercised in interpreting results that might be contradictory or when results do not easily converge.
Explanatory mixed-methods design is done when QUAN data is collected first and prioritized, and qual data is then later collected to help illustrate or expound upon what the quan methods found (Creswell, 2008). Since collection occurs sequentially and the qual data is collected in a supportive role, standards of quantitative rigor are emphasized as being of utmost importance, and then confirming qual data fills in any conceptual gaps. An example of an explanatory design would be conducting pre- and post-tests on professional development trainings with teachers to determine training efficacy and then collecting short sound bites from participating teachers to draw attention to high and low points, which otherwise might not be apparent in the QUAN data. The primary benefit of an explanatory design is that it can allow researchers to add some meaning or richness to the results of a QUAN study, but a major concern is that any qual data will be collected and analyzed with various a priori assumptions in place that may lead to confirmation biases or oversimplifications.
Exploratory mixed-methods design is done when QUAL data is collected first and prioritized, and quan data is then later collected to test themes or instruments developed from the QUAL process (Creswell, 2008). Since collection occurs sequentially and the quan data is collected in a supportive role, standards of qualitative rigor are emphasized as being of utmost importance, and then testing QUAL data with quan processes helps to verify or validate results. This approach is most commonly used in instrument creation, wherein researchers might interview a sample of participants about a poorly-understood phenomenon and then construct a survey or test based on their findings to determine how accurate their themes were. The primary benefit of an exploratory design is that it can allow researchers to create instruments for measuring novel factors on solid QUAL grounding, but an exploratory design normally only constitutes a first step in instrument validation, and the quan component can lengthen the time and increase the cost of a study considerably.
Embedded mixed-methods design is done in any setting where one method is prioritized and another method is used to supplement it (Creswell, 2008). Though generally done in parallel (qual+QUAN or QUAL+quan), embedded designs can be sequential, and the data collection and analysis processes are separate and are used to answer different questions, such as "RQ1: How were student scores influenced by the intervention?" (QUAN) and "RQ2: What was the intervention like for students as they experienced it?" (qual). In this case, QUAN is prioritized simply because it is used to answer the primary research question, whereas qual is used to answer the secondary question, but the situation could be reversed depending upon the researcher's goal. Similar to triangulation, embedded designs are useful for augmenting results with diverse data, but since one source is deprioritized, the secondary research question may be somewhat trivialized or not as rigorously answered as it could be.
Transformative mixed-methods design builds upon one or more of the aforementioned designs but orients its efforts toward achieving some form of social change or educational improvement (Creswell, 2008). This imposes ideologies or value systems upon the design's activities (typically through the lens of critical theory, emancipation, or liberation) and means that any results of the mixed-methods study will yield suggestions or moral mandates for societal reforms.
Multiphase mixed-methods design applies iteration to one or more of the aforementioned designs (Creswell, 2008), iteratively prioritizing both QUAL and QUAN methods in cycles. This occurs across multiple studies as QUAL and QUAN methods build upon one another to answer progressing research questions within larger research programs and means that multiphase design might be an accurate descriptor for long-term research agendas held by researchers or labs, but the term should rarely (if ever) be used to describe a single study.
As with any approach to research, there are ways of doing mixed methods well and ways to do it poorly. There is also responsible and ethical use of mixed methods as well as irresponsible or unethical use. Here I will touch on a few considerations to help guide you in engaging in mixed methods in a manner that is both effective and responsible.
All good research effectively aligns research questions with methods, because if the methods we are using cannot answer the questions we are asking, then what is the point of doing research? With mixed methods, this means that we need to ask questions that align with both qual and quan methods and that we properly apply the data collected via each method toward answering its specific question.
Most researchers are not deeply trained in both qual and quan methods, which means that when they engage in mixed-methods work they will often adhere to standards of rigor for the method they are familiar with but fail to do so for the other. This leads to sloppy mixed methods, which I explore elsewhere as irresponsible opportunism (Kimmons & Johnstun, 2019), wherein researchers show (at least on the surface) an interest in pluralistic approaches to inquiry that have been adopted by divergent communities without actually adhering to the standards established by those communities.
All this means is that if we are claiming to do qual, then we should actually do qual, and if we are claiming to do quan, we should actually do quan. We should not merely do qual and then sprinkle in a smattering of numbers and call it mixed methods. Similarly, we should not merely do quan and then sprinkle in a few quotes and call it mixed methods. Good mixed methods honors both traditions of inquiry, which requires researchers to abide by certain standards of rigor to ensure that results are valid, trustworthy, etc.
Because the paradigms undergirding qual and quan methods are so divergent, it is difficult for a single researcher (or even a team of similarly-trained researchers) to do mixed methods well, because they will inevitably operate under the assumptions of a dominant paradigm. To combat this, we should build our research teams with complementary skills to ensure that both qual and quan methods are correctly done and that the communities that are drawn to each approach will agree that we have done it justice.
Creswell, J. W. (2008). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (3rd ed.). Upper Saddle River, NJ: Pearson.
Kimmons, R., & Johnstun, K. (2019). Navigating paradigms in educational technology. TechTrends, 63(5), 631-641. doi:10.1007/s11528-019-00407-0