Designing for Productive Disciplinary Engagement and Responsive Assessment with Situated Cognition and Expansive Framing

Keywords: Formative assessment, situated cognition, expansive framing, productive disciplinary engagement
This paper summarizes over two decades of design-based research that insistently draws on situative theories of cognition. This research is consistent with the current focus of AECT and the AECT Research and Theory Division. This focus primarily concerns enduring problems in education and only secondarily concerns the technologies used to help address such problems. The first decade of research consisted of collaborations with leading innovators in multimedia and immersive learning. This resulted in a “multi-level” model of assessment that balanced formative and summative assessment, balanced extrinsic and intrinsic motivators, and boosted performance without “teaching to the test.” The second decade primarily concerned online contexts and included extended research of “open digital badges” and other web-enabled digital credentials. Embedding the design principles from Randi Engle (1965-2012) for productive disciplinary engagement and expansive framing resulted in a comprehensive framework called Participatory Learning and Assessment (PLA). This research qualified widely held assumptions about “authentic” and “real world” instruction to address enduring problems such as online instructor “burnout,” student social isolation, synchronous vs. asynchronous learning, and secure online assessment. Recent and current efforts extended these ideas to address historical and continuing inequities in education and help define a new consensus on our theories of learning transfer.


The Research and Theory Division (RTD) supports the core mission of the Association for Educational Communications and Technology (AECT). The division:

These goals are consistent with the evolving goals of the broader association. The reorganized fifth edition of the Handbook of Research in Educational Communications and Technology (Bishop et al., 2020) highlighted a move away from educational technologies and toward educational problems. In the Foreword to the fifth edition, Reeves and McKenney pointed out that “the bulk of the scholarship in educational technology and communications has been focused on ‘things’ rather than on ‘problems’” (2020, p. vii). In response, most chapters in the latest handbook “focus on difficult problems and how they can be addressed through innovative designs and appropriate technology” (p. vi, emphasis added). Reeves and McKenney pointed to persistent problems such as racial equity, teacher training, sustainability, and external partnerships as areas where AECT members should be directing their efforts.

In this spirit of innovative designs and appropriate technologies, this article summarizes a comprehensive approach to instruction and assessment that my colleagues and I call Participatory Learning and Assessment, or PLA. This approach emerged across two decades of design-based research. As will be elaborated, the first decade consisted of research-practice partnerships with leading innovators in instructional multimedia and educational video games; the second decade of research was mostly carried out in online learning using widely available tools.

This research has focused on enduring problems that transcend technology and that arguably eluded prior theories. Some of these problems have been relatively general:

Other problems targeted by this research have been more specific to online learning.

This research draws intensively on contemporary situative theories of cognition (e.g., Lave & Wenger, 1991; Greeno, 1998). This work goes well beyond widespread characterizations of situated learning in terms of “authentic” and “real-world” contexts for instruction and assessment (e.g., Herrington et al., 2014). In particular, this work draws on the design principles that emerged in the research of Randi Engle (1965-2012) and colleagues. These principles suggest caution when drawing on authentic contexts defined by disciplinary experts. Instead, these principles suggest that learners problematize (i.e., contextualize) instruction from their own perspectives (which experts may or may not deem authentic).

This research addresses a fundamental issue that emerged across two “expert consensus study reports” from the US National Research Council (NRC) and its successor, the National Academies of Sciences, Engineering, and Medicine (NASEM). These reports are written by a carefully selected group of experts following a strict process that captures the consensus of that committee at that time. How People Learn: Brain, Mind, Experience, and School (NRC, 2000) provided a consensus view of learning and transfer that strongly reflected the socio-constructivist orientation of committee co-chair John Bransford and was embraced by many cognitive scientists and educational psychologists at that time. The title and contents of How People Learn II: Learners, Contexts, and Cultures (NASEM, 2018) reflected the “social turn” (e.g., Gee, 1999) that was already underway when “HPL I” was drafted. HPL II revealed a new consensus that (a) contexts and culture matter a lot in learning and (b) contexts and culture matter a lot more than a similar group of experts concluded two decades earlier. It is noteworthy that HPL I included a chapter on the transfer of learning but not a chapter on culture, while HPL II included a chapter on culture but not a chapter on transfer. This suggests a lack of consensus, and perhaps open disagreement, about the implications of this social turn for our theories of transfer. This is important because our theories of transfer have profound implications for designing instruction, assessing learning, and testing achievement. The research summarized here aims to support a new consensus by deeply exploring the implications of situative transfer theories for instruction, assessment, and testing in light of other more established theories (Hickey & Pellegrino, 2005).

Responsive “Multi-Level” Assessment

The first phase of this research aligned formative and summative functions across multiple levels of increasingly formal assessments. This research drew on the assessment levels used in the more naturalistic evaluations of STEM curricula in Ruiz-Primo et al. (2002) and the innovative “conversational” approach to formative assessment in Duschl and Gitomer (1997). This phase consisted of design-based research in partnership with educators and students learning to use the innovations of leading technology developers. The core approach emerged in studies of the GenScope multimedia curriculum for teaching introductory genetics developed by Paul Horwitz (Hickey, Kindfield, et al., 2003). The approach was quasi-experimentally validated in studies of GenScope (Hickey & Zuiker, 2012), the Quest Atlantis videogame (Barab et al., 2007; Hickey, Ingram-Goble, & Jameson, 2009), and STEM multimedia programs from the NASA-funded Classroom of the Future (e.g., Hickey, Taasobshirazi, & Cross, 2012). The approach was further refined in studies of participatory new digital media developed by the media scholar Henry Jenkins (Hickey, McWilliams, & Honeyford, 2011; Hickey, Honeyford, & McWilliams, 2013) and in considerations of vocational education (Hickey, 2005).

At the core of this approach is the idea that a broader situative view of learning allows the same assessment to serve a summative function for one form of learning and a formative function for a different form of learning. Doing so sidesteps socio-constructivist concerns that summative assessment purposes undermine formative purposes (e.g., NRC, 2001; Pellegrino, 2002). The four assessment levels aligned in most of this research included the following:

Thus, for example, the close reflections on engagement can summatively assess prior engagement while also formatively assessing conceptual understanding. Changing knowledge representations across levels requires learners to transform the knowledge gained from formative feedback at the prior level. Directly addressing the widely cited concerns of Messick (1994) regarding construct-irrelevant easiness (i.e., “teaching to the test”), this preserves the validity of scores as meaningful evidence of learning. As long as learners are not directly prepared for each assessment, those scores are reliable estimates of transfer when grading work, measuring achievement, and evaluating curricula.

Productive Disciplinary Engagement and Participatory Learning and Assessment

Our approach was first labeled PLA and further theorized when it moved into online learning around 2009. This approach was deemed “responsive” for responding to the different forms of knowing assessed at each level. As shown in Table 1, the assessments across levels capture (a) increasingly formal knowledge representations, (b) broader curricular orientation, (c) increasingly private interactions (after Hall & Rubin, 2013), and (d) lengthier timescales of learning (after Lemke, 2000).

Table 1

Responsive Assessment in Online Learning

  1. Informal assessment of discourse via instructor comments on learner annotations and artifacts (very informal)

  2. Summative assessment of engagement and formative assessment of understanding via student reflections

  3. Summative assessment of understanding and formative assessment of achievement via self-assessments

  4. Summative assessment of achievement via secure, time-limited multiple-choice achievement tests

On the advice of James Gee, we embedded the design principles for productive disciplinary engagement (PDE; Engle & Conant, 2002) within these multiple assessment levels. These principles suggest that learners (a) problematize content from their own perspectives, (b) be given authority to solve the problems that result, (c) be held accountable to the discourse of the discipline, and (d) be given the resources necessary to accomplish the first three principles. As introduced in Hickey and Rehak (2013), this synergy resulted in the comprehensive PLA framework. This new framework consisted of the following five situative design principles:

  1. Use public contexts to give meaning to knowledge tools.

  2. Recognize and reward productive disciplinary engagement.

  3. Grade artifacts through local reflections on engagement.

  4. Let individuals assess their understanding privately.

  5. Measure achievement discreetly as necessary and appropriate.

In short, the first two PLA principles embrace Engle’s PDE principles and the first assessment level, while the last three PLA principles represent the three other assessment levels.

A central idea in PLA is that public (to the class) contexts for student engagement become possible and reasonable when each student is problematizing content from their own perspective. Once each student has defined an initial context for problematizing course content, peers are ideally positioned to learn from each other, borrowing examples and insights but necessarily transferring them to their own contexts; this also thwarts plagiarism.

This public engagement was initially supported by the wiki tools in the open-source Sakai LMS, in what we called “wikifolios.” Subsequently, we have come to rely on Google Docs (i.e., “G-portfolios”) or the headers of new discussion forums in the Canvas and Blackboard LMSs (“E-portfolios”). Working in Google Docs and discussion forum headers supports discussion via threaded instructor and student comments. But conventional forums are never used for substantive discussions in PLA. This reflects concerns that Thomas (2002) and others have raised about the incoherence and abstraction of traditional discussion forums. Instead, most interaction occurs via threaded comments directly on student artifacts or via social annotations on learning resources. In many cases, regular hyperlinked instructor announcements support additional focused interaction (see Ferguson et al., 2010). This concrete interaction directly involving student artifacts and annotations supports a key insight that emerged early in this work: most students are happy to engage in discussions when their own work and their own ideas are in play. Thus, student posts or discussions are never graded and are seldom even required. This minimizes the dreary discussions that mandatory and graded posts often cause.

Extending PLA with Expansive Framing

The design principles for expansive framing emerged when Engle (2006) re-analyzed the student engagement and assessment performance from the original 2002 study. Engle searched for interactions responsible for “generative” learning that transferred readily and widely to performance assessments that were very different from the learning environment. She concluded that generative learning was best supported when students were (a) pushed to find numerous connections with people, places, topics, and times beyond the boundaries of the course and (b) encouraged to position themselves as accountable authors who were contributing to a larger conversation that extends over time and space. These principles were rigorously validated in the experimental study in Engle et al. (2011) that compared expansively framed tutoring sessions in biology with “bounded framing” where learners were not pushed to make connections or position themselves as authors.

Engle et al. (2012) further elaborated on expansive framing. They explained how expansive framing could support intercontextuality (Bloome et al., 2009). This is where learners form so many connections between their prior experience, current outside goals, and future contexts that the learning process becomes part of a larger “encompassing context.” Experienced writers will recognize this state as the epitome of authorship. In support of this seemingly elusive goal for students, Engle et al. (2012) provided five compelling explanations for why expansive framing and intercontextuality should support generative learning:

  1. More intercontextuality between settings. Helping learners imagine how they will use what they are learning in the future naturally leads them to adopt more productive learning strategies rather than simply completing the assignment.

  2. Recognizing relevance. If learners have successfully imagined using what they are learning, they are more likely to recognize those opportunities for transfer when they present themselves. 

  3. Transfer in prior knowledge. The third explanation is that positioning learners as authors of their own ideas will encourage them to transfer in more of their relevant prior experiences. To the extent that peers find these experiences useful for their own learning, learners will be encouraged to transfer in even more of their prior experiences.

  4. Accountability to content. If students successfully author new content at the intersection of their experiences and course content, they are more likely to feel a sense of “ownership” over that content. This is likely to engender confidence when using that knowledge in potential transfer settings.

  5. Generalization of authorship. If students take up enough opportunities to author new knowledge, this experience should generalize to other knowledge in the domain and ultimately to knowledge in new domains.

Engle and colleagues provided empirical examples from other research to support each of these explanations. Presciently, the 2012 paper, Engle’s last publication before her untimely death, also detailed a potential research agenda for expansive framing in the form of three types of studies that might be used to examine how expansive framing impacts transfer: disentangling experiments, comparative classroom studies, and microgenetic investigations. In key respects, our program of research aims to lay the theoretical and methodological groundwork for pursuing such efforts and moving this research into online settings.

Motivational Implications of PDE, Expansive Framing, and PLA

Before considering our current and future efforts, we point out that embedded within and (sometimes) alongside this assessment-related work is a situative consideration of motivation. This strand is rooted in early efforts to reframe motivation research from the study of individual factors to the study of broader social and cultural factors (e.g., Hickey, 1997; Hickey & McCaslin, 2001; Sivan, 1986). This work aimed to help resolve the acrimonious debates over the consequences of “extrinsic” rewards for “intrinsic” motivation (e.g., Cameron & Pierce, 1994, vs. Ryan & Deci, 1996). Readers are reminded that hundreds of experimental studies of the so-called “overjustification effect” (Lepper et al., 1973) have shown that intrinsic motivation and engagement in free-choice activity are diminished when learners are provided with arbitrary “extrinsic” rewards for their efforts (Tang & Hall, 1995). Readers are also reminded that this seemingly intractable debate illustrates how experimental and meta-analytic methods cannot resolve such fundamental issues rooted in competing theories of learning (see Lepper et al., 1999).

Central to both the motivation and assessment work is one of the most important but least appreciated aspects of situative theory. The so-called situative synthesis concerns the way that one reconciles the tensions between (a) the way humans behave and process information, (b) the way that humans make sense of the world around them, and (c) the way that social and cultural systems function. As introduced in Greeno and Moore (1993) and elaborated in Greeno (1998), the prevailing “aggregative” reconciliation uses the aggregated activity of individuals to explain the functioning of larger sociocultural systems. In motivation, this is perhaps best illustrated by Bandura’s (2000) theory of collective efficacy, whereby the efficacy of the group is entirely made up of the self-efficacy of the members of the group. This aggregative characterization of social and cultural activity assumes that there is no “disembodied group mind” (p. 76) that is independent of the collective beliefs of the members. 

In contrast, the situative synthesis characterizes the activity of individuals as “special cases” of sociocultural activity. In other words, human behavior, information processing, and sense-making are ultimately best explained in terms of participation in social and cultural practices. A worthwhile caveat here highlights the difference between the work of learning scientists and the work of cognitive scientists. As a learning scientist, I have pursued new solutions to enduring problems by assuming that the situative synthesis is true (rather than arguing that it is true). In doing so, I sidestep the more fundamental naturalistic debates about the nature of human cognition, as played out in the widely cited debate between Anderson et al. (1996) and Greeno (1997). Rather, I ask whether assuming that intrinsic and extrinsic sources of motivation are both special cases of socially situated activity reveals a new path forward in this seemingly intractable debate (Hickey, 2003). In pragmatic terms, this means moving beyond the prevailing study of sociocultural influences on individual motivation to explore how sociocultural and situative theories of learning lead to entirely new theories of motivation (summarized in Hickey & Granade, 2004) and related issues such as classroom management (Hickey & Schafer, 2006).

This new way of thinking about extrinsic incentives was explored in a quasi-experimental study of two pairs of carefully matched classrooms playing Quest Atlantis. Students in two classrooms were provided extrinsic incentives (i.e., “badges” for their physical and virtual avatars); students in the other two classrooms were instead encouraged to fulfill their curiosity and interests (following Lepper & Malone, 1987). Multiple levels of assessment and motivational measures revealed that the students in the extrinsic incentive condition showed significantly larger gains in disciplinary knowledge and significantly greater disciplinary engagement and situational interest. Contradicting the overjustification hypothesis, the students who were rewarded with extrinsic incentives also demonstrated slightly larger gains in personal interest in the topics studied (Hickey & Filsecker, 2012; Filsecker & Hickey, 2014).

In a subsequent program of research, the situative synthesis uncovered promising new ways of using and studying digital “badges” and other “micro-credentials.” In contrast to conventional assessments and credentials, these new open digital badges could contain actual claims of competence and links to web-enabled digital evidence supporting those claims. As elaborated in Casilli and Hickey (2016), these developments have profound implications because they downplay the value of traditional approaches to validation (of evidence) and accreditation (of credential value) while elevating the importance of perceived credibility (long dismissed as mere “face” validity; Popham, 1990). Naturally, these new ways of recognizing learning and achievement helped rekindle the debate over extrinsic incentives. Following their widely reported introduction in a national competition, several influential observers argued that such badges were inherently extrinsic (e.g., Jenkins, 2012; Resnick, 2012). But a longitudinal study of the thirty badge systems funded in the national competition (Hickey, Willis, & Quick, 2015) uncovered four very different types of badge systems:

A follow-up study found that the participation-based badge systems were far more likely to result in a thriving badge-based “ecosystem” after the initial funding was exhausted. It turned out that most of the other badge systems were unable to sustain (or, in some cases, even implement) their ambitious plans for individualized assessment of competency or completion (Hickey & Chartrand, 2020).

Reflecting the value of a new situative synergy involving both motivation and assessment, this work resulted in an extensive set of design principles for “where badges work better” (Hickey & Willis, 2017; Hickey, Willis, & Quick, 2015), a new framework for badge functions (Hickey, Uttamchandani, & Chartrand, 2020), and seven “new arguments” about motivating learning with digital badges. As elaborated in Hickey & Schenke (2019), these new arguments are:

  1. Digital badges are inherently meaningful. Relative to traditional incentives and credentials, digital badges can contain specific claims, along with evidence supporting those claims, that make them more meaningful.

  2. Open digital badges are particularly meaningful. Digital badges consistent with the Open Badges Infrastructure metadata system can be readily shared and read by humans and machines, and they can circulate in social networks where they gain additional meaning.

  3. Open digital badges are particularly consequential credentials. Newly available “endorsement” features make it possible for issuers of open digital badges to invite organizations to formally endorse a class of badges before they are issued, and for individuals (e.g., members of those organizations) to endorse a specific badge after examining the claims it contains in light of the provided evidence.

  4. The negative consequences of extrinsic rewards are overstated. While the overjustification effect is real, a focus on it overlooks other relevant factors. These include the potentially positive short-term behavioral consequences of extrinsic incentives, their apparently positive neurological consequences (Hidi, 2016), and their joint potential for fostering “communal” motivation. It seems possible and even likely that these other factors may well overwhelm any negative consequences that follow from the overjustification effect.

  5. Focus primarily on social activity and secondarily on individual activity. This follows directly from the situative synthesis above. Theoretically, this means understanding both extrinsic incentives and intrinsic motivators as special cases of socially engaged participation. Practically, this means studying the consequences of badge systems for learner discourse before considering their consequences for individual behavior or cognition.

  6. Situative models of engagement are ideal for studying digital credentials. Most open digital badge systems are used to recognize learning in digital networks, most of which feature social components. Brown and Adler (2008) pointed out that digital social networks represent a fundamentally new form of engagement that transcends traditional models that focus on individual behavior or cognition.

  7. Study motivation and digital credentials at three increasingly formal levels. Consistent with the study reported in Filsecker and Hickey (2014), this means examining the impact of badges on (a) productive disciplinary engagement, (b) intrinsically motivated engagement and situational interest, and (c) changes in personal interest and the broader educational ecosystem.
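The second and third of these arguments rest on the machine-readable metadata that open badges carry. As a rough illustration only (field names follow the Open Badges 2.0 vocabulary as we understand it; every URL, badge name, and evidence link below is an invented placeholder, not a real issuer or credential), such a badge “assertion” might look like:

```python
import json

# Sketch of an Open Badges 2.0 "Assertion": the metadata that lets a badge
# carry specific claims and link to supporting evidence. All names and URLs
# here are hypothetical placeholders.
assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.org/assertions/123",
    "recipient": {"type": "email", "hashed": False,
                  "identity": "learner@example.org"},
    "badge": {
        "type": "BadgeClass",
        "name": "Peer Feedback Contributor",
        "description": "Commented substantively on peer wikifolios.",
        "criteria": {"narrative": "Post threaded comments on at least "
                                  "three peer artifacts."},
        "issuer": {"type": "Issuer", "name": "Example Course",
                   "url": "https://example.org"},
    },
    # Link from the credential's claim to the learner's actual work
    "evidence": "https://example.org/wikifolios/123",
    "issuedOn": "2020-05-01T00:00:00Z",
    "verification": {"type": "hosted"},
}

# Because the assertion is plain JSON, it can be read by machines, shared in
# social networks, and inspected by anyone weighing its claims and evidence.
print(json.dumps(assertion, indent=2))
```

Endorsement (the third argument) would add further metadata in which an organization or individual vouches for the badge class or for this specific assertion.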

Recent and Current Efforts

Recent years have included efforts to use PLA to design a range of online learning contexts. This includes undergraduate courses in learning theory (Chartrand et al., 2020), graduate courses in education (Hickey, 2015a, 2015b), a fully online secondary school (Itow, 2018), self-paced technical training courses (Hickey, Duncan, et al., 2020), and “big” open online courses completed by hundreds (rather than thousands) of open learners (Hickey & Uttamchandani, 2017).

Another strand of our recent efforts is presenting the PLA framework in a less theoretical manner for broader audiences. Hickey, Chartrand, and Andrews (2020) translated the five PLA design principles into 15 discrete steps for instructional designers who are presumably more grounded in cognitive-constructivist theories of learning. Hickey, Duncan, et al. (2020) presented a subset of those steps for designing G-portfolio courses. These steps were presented without the design principles or theory and were intended for educators who are moving online but have little or no grounding in learning theory. These two papers have been useful in two self-paced professional development courses used by hundreds of secondary teachers who transitioned to online learning during the pandemic (Chartrand & Hickey, 2020; Hickey & Harris, 2020).

One strand of our current efforts is advancing new design principles that are specific to particular audiences. Hickey and Harris (2021) extended PLA into a new set of design principles specifically concerning assessment in online courses: 

  1. Embrace situative reconciliation over aggregative reconciliation.

  2. Focus on assessment functions rather than purposes.

  3. Synergize multiple complementary types of learner interaction. 

  4. Use increasingly formal assessments that capture longer timescales of learning.

  5. Embrace transformative functions and systemic validity.

  6. Position learners as accountable authors.

  7. Reposition minoritized learners for equitable engagement.

  8. Enhance the validity of evidence for designers, evaluators, and researchers.

  9. Enhance the credibility of scores and efficiency for educators.

  10. Enhance the credibility of assessments and grades for learners.

Readers may note that this set combines principles from our prior efforts with a new principle from other current efforts (described next). This synergy illustrates a key point of design-based research—that it evolves continually.

This new strand of work embraces contemporary “asset-based” responses to historical and continuing inequities and discrimination to provide a practical and theoretical alternative to outdated “deficit-based” responses. This work draws directly from the critique of Engle’s PDE principles by Agarwal and Sengupta-Irving (2019). Reviewing the four PDE principles summarized above, they first conceded that (1) problematizing invites students to transfer in their unique cultural and linguistic experiences, (2) granting authority to explore those problems gives students an active role in constructing knowledge, (3) accountability ensures that authority comes with a justification that is open to critique, and (4) providing culturally relevant resources might tap into diverse student interests.

Agarwal and Sengupta-Irving’s (2019) concessions are important because they suggest that PDE is inherently more asset-based than most other modern (e.g., socio-constructivist) curricular frameworks. While acknowledging this progress in the learning sciences, they questioned the extent to which the existing PDE principles will support students who identify with minoritized communities, given the extensive evidence that such students are routinely “positioned out” of classroom discourse and/or positioned as lazy or disruptive by teachers and peers who are more advantaged and/or privileged (e.g., Paris, 2012). This means that implementing PDE (and by extension, expansive framing and PLA) while ignoring the realities of power and privilege limits support for equity. For example, Agarwal and Sengupta-Irving argued that problematizing content in ways that challenge culturally dominant ways of knowing can lead to racialized controversies that harm minoritized learners. In response, they introduced new Connective and Productive Disciplinary Engagement (CPDE) principles for inviting minoritized students to position themselves as having unique experiences and (therefore) expertise. These principles are (1) use sociopolitical uncertainties (SPUs) to help problematize disciplinary knowledge, (2) curb undue social authority, (3) ensure equitable accountability, and (4) treat sociopolitical controversies as resources.

We have been using the CPDE principles to further refine a graduate course on educational assessment that has long served as the testbed for PLA ideas and practices. Over three annual cycles of iterative refinement, we have piloted new course features and then included the effective ones across assignments (Hickey & Lam, 2022, 2023). This work has resulted in a new framework that we are calling culturally sustaining educational assessment (CSEA), which consists of a new set of assessment design principles:

  1. Include optional sociopolitical uncertainties (SPUs) that invite, but do not require, students who identify with minoritized communities to position themselves as having valued relevant experiences and expertise.

  2. Ensure that instructors’ informal assessment of disciplinary discourse positions all students as having valued experiences and expertise and carefully introduces additional SPUs as appropriate.

  3. Encourage students to reflect on how social and cultural factors impacted their engagement without requiring them to speak for their community.

  4. Read students’ cultural reflections and directly respond to particularly compelling ones in public or private feedback, as appropriate.

  5. Include formative self-assessments for all student choice controversies to efficiently foster familiarity with those issues.

  6. Include exam items that assess the achievement of some (but not all) of the controversies and SPUs and ensure that those items function appropriately.

Initial evidence from student engagement and anonymous course evaluations suggests that this approach has indeed been successful. For example, these new features appear to have encouraged students to introduce new SPUs beyond those presented in the assignments and to reveal minoritized identities (e.g., LGBTQ+) that might otherwise have remained hidden.

The other strand of our current work is helping ensure that Randi Engle is credited for the ideas that we are enthusiastically building on. While Engle (2011) summarized applications of the PDE principles through 2010 (along with other related research), the larger body of work on PDE and then expansive framing has yet to be systematically reviewed. To this end, Hickey, Chartrand, and Harris (2021) systematically analyzed the 31 empirical studies that included these exact terms in their titles or abstracts. We are currently conducting a thematic review of the 2,715 peer-reviewed publications that reference these two frameworks. Within this effort, Freedman et al. (2023) thematically analyzed the 32 publications concerning diversity, equity, and inclusion, while Harris et al. (2023) thematically analyzed the subset of this research concerning teacher education.

In this spirit of supporting Engle’s legacy, we recently published an extended entry in an educational encyclopedia as a tribute to her contributions (Hickey, 2022). Finally, a chapter in a forthcoming Handbook of Educational Psychology (Hickey & Lam, in press) argues for a new model of transfer that insistently includes ideas from Agarwal and Sengupta-Irving (2019). This brings equity and justice into discussions of transfer for the first time. In this way, we hope both sets of ideas find their way into a new situative/sociocultural consensus on transfer theory.

Appreciations, Concessions, Conundrum, and Conclusions

My colleagues and I deeply appreciate that these efforts were recognized as the runner-up for the 2022 Theory Spotlight Award from AECT’s Research and Theory Division. We also appreciate the foundations, agencies, and university initiatives (listed below) that supported many of these efforts, as well as the support and (sometimes) patience of the students, departmental colleagues, and administrators at the schools and universities where this research was conducted.

We concede that some of our ideas qualify (and potentially challenge) assumptions about learning theory and practice held by many members of AECT and the broader Learning Design and Technology (LDT) community. We assume there will never be a “sociocultural revolution” that overturns or supplants prevailing theories and associated practices. For complex reasons, this is consistent with our goals and the situative synthesis summarized above.

Though at the margins of this paper and this work, we believe that the situative synthesis can help address another theoretical conundrum in the LDT community and beyond: the divide between proponents of socio-constructivist theories and practices and proponents of the cognitive-associationist theories that underpin artificially intelligent tutors, personalized adaptive learning technologies, and competency-based education. This divide helped fuel the dissolution and rebranding of the International Association for K-12 Online Learning (iNACOL) in 2019 and the establishment of several new organizations. Proponents of personalized competency-based learning within iNACOL formed the new Aurora Institute and broadened their focus beyond technology-based learning, with the continued support of the Gates and Zuckerberg foundations (Barnum & Darville, 2018; Barnum, 2019). iNACOL members who advocated the socio-constructivist Community of Inquiry framework (e.g., Archambault et al., 2022) joined the newly formed National Standards for Quality and helped draft a new set of online teaching standards (National Standards for Quality, 2018) that emphasized collaborative learning and instructor-student relationships over personalized competency-based learning.

On the one hand, skeptics question the theory and evidence behind personalized competency-based learning (e.g., Enyedy, 2014; see also Barnum, 2020; Pane, 2018; Pane et al., 2017). On the other hand, proponents point to supporting evidence (e.g., Connections Academy, 2018; Koedinger et al., 2010), while influential foundations and many policymakers appear convinced of the potential of personalized learning. Indeed, our own research (Hickey, Robinson, et al., 2019) found that one intelligent tutoring system was significantly more effective and dramatically more efficient than a remedial “developmental education” course in preparing students for success in a “gateway” undergraduate STEM course. More pragmatically, hundreds of thousands of U.S. K-12 students already attend full-time virtual schools or take virtual courses from providers such as Stride Inc. or Connections Academy that rely largely on personalized computer-adaptive learning technologies. It seems that our academic programs might be well served to more systematically advance and study these approaches.

In conclusion, we are encouraged by, and encouraging of, the many efforts in LDT and related communities to address the discrimination and injustice that continue to plague our educational systems. While much work remains to be done, recent years have seen a surge of relevant research in LDT (e.g., Glazewski & Ertmer, 2020), the Learning Sciences (e.g., Esmonde & Booker, 2016), and Educational Psychology (e.g., Zusho & Kumar, 2018). A central question that deserves further theoretical and empirical consideration is whether learner “agency,” as supported by personalized competency-based learning, is indeed the most effective and appropriate response to these challenges (as argued by Sturgis & Casey, 2018; see also Aurora Institute, 2023). It seems unlikely that any single class of approaches can address this complex and multifaceted issue on its own. Sustained, concerted, and well-supported effort is needed on all fronts.


References

Agarwal, P., & Sengupta-Irving, T. (2019). Integrating power to advance the study of connective and productive disciplinary engagement in mathematics and science. Cognition and Instruction, 37(3), 349-366.

Anderson, J. R., Reder, L. M., & Simon, H. A. (1996). Situated learning and education. Educational Researcher, 25(4), 5-11.

Archambault, L., Leary, H., & Rice, K. (2022). Pillars of online pedagogy: A framework for teaching in online learning environments. Educational Psychologist, 57(3), 178-191. 

Association for Educational Communication and Technology (2023, March 1). Research and Theory Division Mission Statement [Website]. 

Aurora Institute (2023, March 1). The importance of diversity, equity, and inclusion in education [Website and resource collection]. 

Bandura, A. (2000). Exercise of human agency through collective efficacy. Current Directions in Psychological Science, 9(3), 75-78. 

Barab, S., Sadler, T., Heiselt, C., Hickey, D., & Zuiker, S. (2007). Relating narrative, inquiry, and inscriptions: A framework for socioscientific inquiry. Journal of Science Education and Technology, 16, 59-82. 

Barnum, M., & Darville, S. (2018, September 10). Lifting the veil on education’s newest big donor: Inside Chan Zuckerberg’s $300 million push to reshape schools. Chalkbeat.

Barnum, M. (2019, October 28). Online learning group iNACOL rebrands, says focus is less on tech, more on transformation. Chalkbeat.

Barnum, M. (2020, May 28). Pandemic boosts “mastery-based” learning, though evidence remains thin. Chalkbeat. 

Bishop, M. J., Boling, E., Elen, J., & Svihla, V. (Eds.). (2020). Handbook of research in educational communications and technology (5th ed.). Springer. 

Bloome, D., Beierle, M., Grigorenko, M., & Goldman, S. (2009). Learning over time: Uses of intercontextuality, collective memories, and classroom chronotopes in the construction of learning opportunities in a ninth-grade language arts classroom. Language and Education, 23(4), 313-334. 

Brown, J., & Adler, R. P. (2008). Open education, the long tail, and learning 2.0. EDUCAUSE Review, 43(1), 16-20.

Cameron, J., & Pierce, W. D. (1994). Reinforcement, reward, and intrinsic motivation: A meta-analysis. Review of Educational Research, 64(3), 363-423.

Casilli, C., & Hickey, D. T. (2016). Transcending conventional credentialing and assessment paradigms with information-rich digital badges. The Information Society, 32(2), 117-129. 

Chartrand, G. T., Andrews, C. D., & Hickey, D. T. (2020). Designing for generative online learning: A situative program of research. In B. Hokanson et al. (Eds.), Intersections across disciplines: Educational communications and technology: Issues and innovations (pp. 81-92). Springer.

Chartrand, G. T., & Hickey, D. T. (2020). Responsive engagement and virtual learner assessment (REVLA). An open online self-paced course for educators in Canvas for IU Expand. 

Connections Academy. (2018). New research on the efficacy of connections academy virtual public schools reveals positive academic outcomes. [Press Release]. 

Duschl, R. A., & Gitomer, D. H. (1997). Strategies and challenges to changing the focus of assessment and instruction in science classrooms. Educational Assessment, 4(1), 37-73.

Engle, R. A. (2006). Framing interactions to foster generative learning: A situative explanation of transfer in a community of learners classroom. The Journal of the Learning Sciences, 15(4), 451-498. 

Engle, R. A. (2011). The productive disciplinary engagement framework: Origins, key concepts, and developments. In Y. Dai (Ed.), Design research on learning and thinking in educational settings: Enhancing intellectual growth and functioning (pp. 161-200). Routledge.

Engle, R. A., & Conant, F. R. (2002). Guiding principles for fostering productive disciplinary engagement: Explaining an emergent argument in a community of learners classroom. Cognition and Instruction, 20(4), 399-483. 

Engle, R. A., Lam, D. P., Meyer, X. S., & Nix, S. E. (2012). How does expansive framing promote transfer? Several proposed explanations and a research agenda for investigating them. Educational Psychologist, 47(3), 215-231. 

Engle, R. A., Nguyen, P. D., & Mendelson, A. (2011). The influence of framing on transfer: Initial evidence from a tutoring experiment. Instructional Science, 39, 603-628. 

Enyedy, N. (2014). Personalized instruction: New interest, older rhetoric, limited results, and the need for new direction for computer-mediated learning. National Educational Policy Center. 

Esmonde, I., & Booker, A. N. (Eds.) (2016). Power and privilege in the learning sciences. Taylor and Francis. 

Ferguson, R., Whitelock, D., & Littleton, K. (2010). Improvable objects and attached dialogue: New literacy practices employed by learners to build knowledge together in asynchronous settings. Digital Culture & Education, 2(1), 103-123.

Filsecker, M. K., & Hickey, D. T. (2014). A multilevel analysis of the effects of external rewards on elementary students’ motivation, engagement and learning in an educational game. Computers & Education, 75, 136-148. 

Freedman, E., Hickey, D., Harris, T., Chartrand, G., Schamberger, B., & Luo, Q. M. (2023, June). Supporting diversity, equity, and inclusion through productive disciplinary engagement and expansive framing. In J. Slotta, L. Charles, A. Breulleux, T. Laferrière, & R. Casidy (Eds.), Building knowledge and sustaining our community: Proceedings of the annual meeting of the International Society for the Learning Sciences. Montreal, Canada.

Gee, J. P. (1999). The future of the social turn: Social minds and the new capitalism. Research on Language & Social Interaction, 32(1-2), 61-68. 

Glazewski, K. D., & Ertmer, P. A. (2020). Fostering complex problem solving for diverse learners: Engaging an ethos of intentionality toward equitable access. Educational Technology Research and Development, 68, 679-702. 

Greeno, J. G. (1997). On claims that answer the wrong questions. Educational Researcher, 26(1), 5-17.

Greeno, J. G. (1998). The situativity of knowing, learning, and research. American Psychologist, 53(1), 5-30.

Greeno, J. G., & Moore, J. L. (1993). Situativity and symbols: Response to Vera and Simon. Cognitive Science, 17, 48-55.

Hall, R., & Rubin, A. (2013). There’s five little notches in here: Dilemmas in teaching and learning the conventional structure of rate. In J. G. Greeno & S. V. Goldman (Eds.), Thinking practices in mathematics and science learning (pp. 199-246). Routledge. 

Harris, T., Freedman, E., Jongewaard, R., Hickey, D. T., & Schamberger, B. (2023, June). Productive disciplinary engagement and expansive framing as foundations for teacher education. In J. Slotta, L. Charles, A. Breulleux, T. Laferrière, & R. Casidy (Eds.), Building knowledge and sustaining our community: Proceedings of the annual meeting of the International Society for the Learning Sciences. Montreal, Canada.

Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. Springer. 

Hickey, D. T. (1997). Motivation and contemporary socio-constructivist instructional perspectives. Educational Psychologist, 32(3), 175-219.

Hickey, D. T. (2003). Engaged participation vs. marginal non-participation: A stridently sociocultural model of achievement motivation. Elementary School Journal, 103(4), 401-429. 

Hickey, D. T. (2005). Levels, representations, and iterations: A design-based assessment framework with potential for enhancing vocational education. In H. Gruber, C. Harteis, R. H. Mulder, & M. Rehrl (Eds.), Bridging individual, organizational, and cultural perspectives on professional learning (pp. 327–344). Roderer.

Hickey, D. T. (2015a). A framework for interactivity in competency-based courses. EDUCAUSE Review. 

Hickey, D. T. (2015b). A situative response to the conundrum of formative assessment. Assessment in Education: Principles, Policy & Practice, 22(2), 202-223. 

Hickey, D. T. (2022). Productive disciplinary engagement and expansive framing: The situative legacy of Randi Engle. In M. McCaslin & T. Good (Eds.), Routledge Online Encyclopedia of Education.

Hickey, D. T., & Chartrand, G. T. (2020). Recognizing competencies vs. completion vs. participation: Ideal roles for web-enabled digital badges. Education and Information Technologies, 25(2), 943-956. 

Hickey, D. T., Chartrand, G. T., & Andrews, C. D. (2020). Expansive framing as a pragmatic theory for instructional design. Educational Technology Research and Development, 68(2), 751-782. 

Hickey, D. T., Chartrand, G. T., & Harris, T. (2021, April 24). A systematic review of research on productive disciplinary engagement and expansive framing [Paper presentation]. The Annual Meeting of the American Educational Research Association. 

Hickey, D. T., Duncan, J., Gaylord, C., Hitchcock, C., Itow, R., & Stephens, S. (2020). gPortfolios: A pragmatic approach to online asynchronous assignments. Information and Learning Sciences, 121(5/6), 273-283.

Hickey, D. T., & Filsecker, M. K. (2012). Participatory assessment for organizing inquiry in educational videogames and beyond. In K. Littleton, E. Scanlon, & M. Sharples, (Eds.) Orchestrating inquiry learning (pp. 146-174). Taylor and Francis. 

Hickey, D. T., & Granade, J. B. (2004). The influence of sociocultural theory on our theories of engagement and motivation. In D. McInerney and S. Van Etten (Eds.), Big theories revisited: Sociocultural influences on motivation and learning (Volume 4, pp. 200-223). Information Age Publishing.

Hickey, D. T., & Harris, T. (2020). Responsive engagement and virtual learner assessment (REVLA). An open online self-paced course for educators in Google Classroom. 

Hickey, D. T., & Harris, T. (2021). Re-imagining online grading, assessment, and testing using situated cognition. Distance Education, 42(2), 290-309. 

Hickey, D. T., Honeyford, M. A., & McWilliams, J. C. (2013). Participatory assessment in a climate of accountability. In H. Jenkins & W. Kelly (Eds.), Reading in a participatory culture: Remixing Moby-Dick in the English classroom (pp. 169-184). Teachers College Press.

Hickey, D. T., Ingram-Goble, A., & Jameson, E. (2009). Designing assessments and assessing designs in virtual educational environments. Journal of Science Education and Technology, 18, 187-208. 

Hickey, D. T., Kindfield, A. C., Horwitz, P., & Christie, M. A. (2003). Integrating curriculum, instruction, assessment, and evaluation in a technology-supported genetics environment. American Educational Research Journal, 40(2), 495-538.

Hickey, D. T., & Lam, C. (2022, April 24). Centering situative transfer for culturally-sustaining educational assessment [Paper presentation]. The Annual Meeting of the American Educational Research Association. 

Hickey, D. T., & Lam, C. (2023, April 15). Culturally-sustaining educational assessment: A new approach to enduring challenges [Paper presentation]. The Annual Meeting of the American Educational Research Association. 

Hickey, D. T., & Lam, D. (in press). Emerging perspectives on the transfer of learning. In A. O'Donnell & J. Reeve (Eds.), Handbook of educational psychology. Oxford University Press. Manuscript accepted, June 14, 2021.

Hickey, D. T., & McCaslin, M. (2001). Comparative and sociocultural analyses of context and motivation. In S. Volet & S. Järvelä (Eds.), Motivation in learning contexts: Theoretical and methodological implications (pp. 33-56). Pergamon/Elsevier.

Hickey, D. T., McWilliams, J. T., & Honeyford, M. A. (2011). Reading Moby-Dick in a participatory culture: Organizing assessment for engagement in a new media era. Journal of Educational Computing Research, 44(4), 247-27. 

Hickey, D. T., & Pellegrino, J. W. (2005). Theory, level, and function: Three dimensions of transfer for understanding student assessment. In J. Mestre (Ed.), Transfer of learning from a modern multidisciplinary perspective (pp. 251-293). Information Age Publishing.

Hickey, D. T., & Rehak, A. (2013). Wikifolios and participatory assessment for engagement, understanding, and achievement in online courses. Journal of Educational Media and Hypermedia, 22(4), 229-263.

Hickey, D. T., Robinson, J., Fiorini, S., & Feng, Y. (2019). Internet-based alternatives for equitable preparation, access, and success in gateway courses. The Internet in Higher Education, 44, 1-14.

Hickey, D. T., & Schafer, N. J. (2006). Design-based, participation-centered approaches to classroom management. In C. Evertson & C. Weinstein (Eds.), Handbook of classroom management: Research, practice, and contemporary issues (pp. 887-908). Merrill/Prentice Hall.

Hickey, D. T., & Schenke, K. (2019). Open digital badges and reward structures. In K. A. Renninger & S. E. Hidi (Eds.), The Cambridge handbook on motivation and learning (pp. 209-237). Cambridge University Press. 

Hickey, D. T., Taasoobshirazi, G., & Cross, D. (2012). Assessment as learning: Enhancing discourse, understanding, and achievement in innovative science curricula. Journal of Research in Science Teaching, 49, 1240-1270.

Hickey, D. T., & Uttamchandani, S. L. (2017). Beyond hype, hyperbole, myths, and paradoxes: Scaling up participatory learning in a big open online course. In E. Losh (Ed.), The MOOC moment: Experiments in scale and access in higher education (pp. 13-36). The University of Chicago Press.

Hickey, D. T., Uttamchandani, S. L., & Chartrand, G. T. (2020). Competencies in context: New approaches to capturing, recognizing, and endorsing learning. In M. J. Bishop, E. Boling, J. Elen, & V. Svihla (Eds.), Handbook of research in educational communications and technology (5th ed., pp. 547-592). Springer.

Hickey, D. T., & Willis, J. E. (2017). Where open badges appear to work better: Findings of the Design Principles Documentation Project. Indiana University Center for Research on Learning and Technology.

Hickey, D. T., Willis, J. E., & Quick, J. D. (2015). Where badges work better. EDUCAUSE Learning Initiative Brief. 

Hickey, D. T., & Zuiker, S. J. (2012). Multi-level assessment for discourse, understanding, and achievement in innovative learning contexts. The Journal of the Learning Sciences, 22(4), 1-65. 

Hidi, S. (2016). Revisiting the role of rewards in motivation and learning: Implications of neuroscientific research. Educational Psychology Review, 28(1), 61–93.

Itow, R. C. (2018). Professional development is not a summer job: Designing for teacher learning that is valuable and valued. [Doctoral Dissertation, Indiana University].

Jenkins, H. (2012, March 4). How to earn your skeptic “badge.” Confessions of an Aca-Fan. The official blog of Henry Jenkins [Blog post]. 

Koedinger, K. R., McLaughlin, E. A., & Heffernan, N. T. (2010). A quasi-experimental evaluation of an on-line formative assessment and tutoring system. Journal of Educational Computing Research, 43(4), 489-510. 

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge University Press.

Lemke, J. L. (2000). Across the scales of time: Artifacts, activities, and meanings in ecosocial systems. Mind, Culture, and Activity, 7(4), 273-290. 

Lepper, M. R., Greene, D., & Nisbett, R. E. (1973). Undermining children’s intrinsic interest with extrinsic reward: A test of the “overjustification” hypothesis. Journal of Personality and Social Psychology, 28(1), 129-137.

Lepper, M. R., Henderlong, J., & Gingras, I. (1999). Understanding the effects of extrinsic rewards on intrinsic motivation—uses and abuses of meta-analysis: Comment on Deci, Koestner, and Ryan (1999). Psychological Bulletin, 125, 669–676. 

Lepper, M. R., & Malone, T. W. (1987). Intrinsic motivation and instructional effectiveness in computer-based education. In R. E. Snow & M. J. Farr (Eds.), Aptitude, learning and instruction: III. Conative and affective process analyses (pp. 255-286). Erlbaum.

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.

National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. National Academies Press. 

National Research Council. (2000). How people learn: Brain, mind, experience, and school. National Academies Press.

National Research Council. (2001). Knowing what students know: The science and design of educational assessment. National Academies Press. 

National Standards for Quality. (2018). National standards for quality online teaching. 

Pane, J. F. (2018). Strategies for implementing personalized learning while evidence and resources are underdeveloped: Perspective. RAND Corporation.

Pane, J. F., Steiner, E. D., Baird, M. D., Hamilton, L. S., & Pane, J. D. (2017). Informing progress: Insights on personalized learning implementation and effects [Research report]. RAND Corporation. 

Paris, D. (2012). Culturally sustaining pedagogy: A needed change in stance, terminology, and practice. Educational Researcher, 41(3), 93-97. 

Pellegrino, J. W. (2002). Knowing what students know. Issues in Science and Technology, 19(2), 48-52.

Popham, W. J. (1990). Face validity: Siren song for teacher testers. In W. C. Conoley, J. Y. Mitchell Jr., S. L. Wise, & B. S. Plake (Eds.), Assessment of teaching: Purposes, practices, and implications for the profession (pp. 1-14). Erlbaum.

Reeves, T., & McKenney, S. (2020). Foreword. In M. J. Bishop, E. Boling, J. Elen, & V. Svihla (Eds.), Handbook of research in educational communications and technology (5th ed., pp. v-x). Springer.

Resnick, M. (2012, February 27). Still a badge skeptic [Blog post]. HASTAC.  

Ruiz‐Primo, M. A., Shavelson, R. J., Hamilton, L., & Klein, S. (2002). On the evaluation of systemic science education reform: Searching for instructional sensitivity. Journal of Research in Science Teaching, 39(5), 369-393. 

Ryan, R. M., & Deci, E. L. (1996). When paradigms clash: Comments on Cameron and Pierce’s claim that rewards do not undermine intrinsic motivation. Review of Educational Research, 66(1), 33-38. 

Sivan, E. (1986). Motivation in social constructivist theory. Educational Psychologist, 21(3), 209-233.

Sturgis, C., & Casey, K. (2018, April). Designing for equity: Leveraging competency-based education to ensure all students succeed. Aurora Institute. 

Tang, S. H., & Hall, V. C. (1995). The overjustification effect: A meta‐analysis. Applied Cognitive Psychology, 9(5), 365-404.

Thomas, M. J. (2002). Learning within incoherent structures: The space of online discussion forums. Journal of Computer Assisted Learning, 18(3), 351-366. 

Zusho, A., & Kumar, R. (2018). Critical reflections and future directions in the study of race, ethnicity, and motivation [Special Issue]. Educational Psychologist, 53(2). 


The work described here was generously supported by the U.S. National Science Foundation, the U.S. Department of Education, the MacArthur Foundation, Google, Indiana University, the University of Georgia, the NASA-funded Center for Educational Technologies, the National Endowment for the Humanities, and the Indiana Governor’s Office. I thank the many colleagues who have contributed to the ideas and efforts described here. My Ph.D. advisors, mentors, and committee members included James Pellegrino, John Bransford, Susan Goldman, David Cordray, Ted Hasselbring, and Dan Schwartz. Co-investigators, co-authors, and postdoctoral fellows include Kate Anderson, Sasha Barab, Michael Beam, Elizabeth Bonsignore, Carla Casilli, J Duncan, Eric Freedman, Melissa Gresalfi, Derek Hansen, Paul Horwitz, Rebecca Itow, Ann Kindfield, Diane Lam, Mitzi Lewison, Mary McCaslin, Steven McGee, Nate Otto, James Pellegrino, Jill Robinson, Phillip Piety, Katerina Schenke, James Willis III, and Edward Wolfe. Graduate research assistants and research staff include Chris Andrews, Grant Chartrand, Christina Chow, Dionne Cross, Michael Filsecker, Ellen Jameson, Tripp Harris, Michelle Honeyford, Jeremy Granade, Rebekah Jongewaard, Adam Ingram-Goble, Charmian Lam, Hyejeong Lee, Marina Michael, Jake McWilliams, Qianxu Morgan Luo, Joshua Quick, Andrea Rehak, Nancy Schafer, Bianca Schamberger, Xinyi Shen, Gita Taasoobshirazi, Cathy Tran, Suraj Uttamchandani, and Steven Zuiker.

Daniel Hickey

Indiana University

Daniel T. Hickey is a Professor with the Learning Sciences Program at Indiana University. He studies new approaches to instruction, assessment, testing, and credentialing, mostly using situative theories of learning and mostly in online learning settings. His research has been supported by the U.S. National Science Foundation, the U.S. Department of Education, The MacArthur Foundation, Google, Indiana University, and others.

This content is provided to you freely by EdTech Books.