
Ten principles of alternative assessment

DOI: 10.59668/279.12260
Keywords: Equity, Inclusion, Co-creation, Alternative Assessment, Authentic Assessment
Assessment design is an integral part of learning design. This work came to life due to the demand for alternative ways of assessing students highlighted by the COVID-19 pandemic. This chapter presents a summary of ten principles for alternative assessment in higher education. The chapter is divided into four sections. In this first section, I locate myself and describe the context of the chapter. In the second section, I provide a brief background of why this chapter is needed. In the third section, I describe the theoretical frameworks that inspired my work. The last section contains a description of each of the principles. Within the description, there are concrete examples and resources that will help readers in the process of rethinking their assessment design. This chapter is significant to higher education educators as well as educational developers.

Introduction

In 2020, the move to remote learning imposed by the COVID-19 pandemic reminded educators that we need to explore other options when it comes to assessing students’ learning. Exams are not equitable as they carry the notion that all students have the same opportunities for success (Singer-Freeman & Robinson, 2020). Additionally, the pandemic showed educators that it is possible for students to demonstrate their learning in more meaningful, engaging and equitable ways.

The pandemic induced many changes. On the surface, educators saw concerns about assessment proctoring and academic integrity. On a deeper level, educators noticed changes related to flexibility, timing and equity. Uncertainty about students’ geographical locations and home responsibilities meant that assessments needed to be more flexible, which in turn meant that instructors needed to allow students to take exams at a time that worked for them.

Instructors also had discussions about students’ learning and constructive alignment. Those changes paved the way for deeper changes or a paradigm shift. There was now an opportunity to think about assessment differently and to discuss how to reconceptualise or redefine assessment. Questions arose: What would higher education look like without exams? What could educators do better, and how could we do it?

This chapter presents a summary of ten principles for alternative assessment in higher education. Work on the principles was inspired by my own teaching and educational development experience and designed around evidence-based approaches. The chapter is divided into four sections. In this first section, I locate myself and describe the context of the chapter. In the second section, I provide a brief background of why this chapter is needed. In the third section, I describe the theoretical frameworks that inspired my work. The last section contains a description of each of the principles. I end with a summary of the chapter’s goals.

Locating myself and my interests

As I reflect on my view of the world, the word “journey” comes to mind – a journey across an educational divide and a journey across multiple identities. I grew up in a working-class family in Lebanon. Inside me, I hold both a working-class and an academic mindset, the latter of which was acquired later in life. I started my career working in the K-12 system and now I work in higher education. I am also a woman who is considered a visible minority in Canada. English is my third language. All my life experiences define the way I view and experience assessment.

In my academic career working in universities in Canada, I have worn many hats. In the last 13 years, I have been a teacher, an instructional designer, an educational developer and an assistant professor. In the last two years, as an academic developer, I have worked with instructors from several disciplines to create more meaningful assessment plans for their courses. I have encountered many instructors who are looking for new ways to assess students and many of my discussions with them have revolved around allowing students to demonstrate their learning. The idea to provide instructors with information about how they can assess their students differently brought this chapter to life.

Locating the assessment and student contexts

It is also important to describe the assessment context in which this chapter originated. With the pandemic preventing institutions from conducting face-to-face exams, instructors were searching for alternative ways of assessing students. It is important to note that in this particular assessment context, the use of proctoring software was discouraged and limited; instructors were instead encouraged to find an alternative means of assessment that did not add stress to the students and supported academic integrity.

The instructors I worked with taught a diverse body of students. The diversity of the student body is too large to describe in one paragraph, but I have tried to include some of its characteristics. Students belonged to different demographic groups: undergraduate and graduate, international and domestic, in Canada and abroad, native English speakers and English language learners, neurodiverse and neurotypical, and traditional and non-traditional. Students also came from different socio-cultural backgrounds and had different learning exceptionalities. This diversity affected the choice of assessment, as students did not have the same access to technology and/or other resources.

The changes incurred by the pandemic allowed for conversations on reframed academic rigour, the cultural relevance of assessment, assessment bias and equity in assessment, among other topics. I recall chatting with educators about assessment in science, technology, engineering and mathematics (STEM) and whether bias exists in these settings. The pandemic also brought forth discussions about the mental stresses that equity-minded educators undergo and the need to have these conversations in a community where they feel they are not alone. The pandemic allowed these important conversations that educators were not having before to surface. Undoubtedly, these conversations will change the face of assessment in higher education in years to come.

Traditional exam-based model: Why change is needed

Conventionally, a uniform timed exam has been considered the objective way of measuring learning. Higher education leadership and instructors have used exams to obtain an objective assessment of students’ learning. Although concerns have been raised about inequities in assessment (Montenegro & Jankowski, 2017), exams are still the primary mode of assessment. Curtis and Anderson (2021) summarised the state of assessment, noting that “assessment in the classroom is one of the most highly guarded and protected aspects of higher education and one of the last holdouts of sole faculty ownership” (p. 56). Traditional assessment favours one type of knowledge, one way of demonstrating that knowledge, and one type of language. It is also time- and location-specific.

Studies have highlighted several concerns related to traditional forms of assessment. The first concern is that exams are not as objective as they are claimed to be; both Orr (2007) and Sadler (2009) questioned their objectivity. Struyven et al. (2005) found that exams encouraged surface rather than deep learning, while Kellaghan and Greaney (2019) found that exams failed to measure the various skills included in the curriculum, a finding corroborated by Bloxham (2007).

A second concern is that exams marginalise students for several reasons. Kellaghan and Greaney (2019) found that exams marginalised Indigenous students and students from poor socio-economic backgrounds. Hartley et al. (2007) found that exams presented gender bias, and Burke and Jackson (2007) found that the traditionally recognised structure of a good essay was deeply gendered. Similarly, Hounsell (2007) found that exams marginalised diverse students.

Despite the many examples that I provide in this chapter, alternative assessment remains at the periphery of traditional assessment practices, which continue to occupy the central location. The dominant approach of a single, one-way transmission of knowledge (traditionally from instructor to student) enforces peripheries. Traditional assessment rewards one type of knowledge and views all students as the same, without careful consideration of the many factors that put some of them at a disadvantage.

Such factors leave students at the periphery of traditional assessment, which assumes that all students have the same chance of success in one-size-fits-all exams.

An alternative assessment framework: Overarching theoretical approach

I explored several theoretical frameworks for alternative assessment. No single, overarching framework covers all the aspects I wish to address. I have therefore used multiple frameworks as a basis for articulating my 10 principles.

The most relevant theoretical framework I have drawn on in developing my 10 principles is Universal Design for Learning (UDL) (CAST, 2018). Like UDL, my work is rooted in a social justice framework and I aim to address the dynamics of oppression and privilege in my design. I draw upon theories of multiculturalism, critical pedagogy and anti-oppressive education as social justice frameworks. The principles are also inspired by a feminist approach to education (Noddings, 1984).

I propose a framework for alternative assessment that takes students who are at the periphery into consideration and helps educators design for the success of all students, thus not leaving any student at the periphery. This approach allows instructors to minimise gaps and create a welcoming space to connect all students. I invite instructors to consider the following 10 principles and their applications in the design of alternative assessments. These 10 principles are practice- and evidence-based; incorporate the ideals of equity, diversity, and inclusion; and include elements of innovation in teaching and learning. In addition, the principles are a result of advancements in technology and the possibilities of online education, as well as a recognition of the importance of engaging students. In the next section, I present each of the principles, give a few theoretical and practical examples, and offer recommendations for those who do assessment design.

Ten principles of alternative assessment

Figure 1 presents a graphic visualisation of the 10 principles of alternative assessment. Alternative assessment takes multiple forms and is authentic, equitable, flexible, renewable, interdisciplinary, co-created, continuous, culturally responsive and engaging. Each of these 10 principles is explained in detail below. 

Figure 1

Ten principles of alternative assessment 

Figure 1 presents, in no particular order, the ten principles of alternative assessment: alternative assessment uses a variety of forms and mediums and is authentic, equitable, flexible, renewable, interdisciplinary, co-created, continuous, culturally responsive and engaging.


Principle 1: Alternative assessment uses a variety of forms and mediums

You can check for students’ understanding in a variety of ways apart from the traditional exam or essay. Technology, social media and multimedia have opened doors to a myriad of ways that students can demonstrate their learning. Similarly, projects have taken on a whole new dimension where community focus and service learning are woven into the design. Have you been thinking about removing the exam from your course? Depending on your learning objectives, you could ask your students to interview a classmate or expert, create a blog to reflect on their learning or record a tutorial.

If you are unsure of the appropriate medium to use to assess students, a first suggestion is to give them a list of assessment mediums that they can choose from (Garside et al., 2009; Mogey et al., 2019; O’Neill, 2011, 2017). A second suggestion is to ask the students what assessment would help their learning. Alternatively, you could combine those options, giving students a range of choices and allowing them to suggest one that works for them. For 35 alternatives to traditional assessment, consult A Guide to Alternative Assessments (Elkhoury, 2020).

Principle 2: Alternative assessment is authentic

Authentic assessment replicates real-life situations. These situations include examples learners might encounter in their current lives (Frey et al., 2012) or situations that might come up in their future jobs (Gulikers et al., 2004). In addition, authentic assessment is designed to allow students to use the skills needed in professional life (Mueller, 2005). Authentic assessment also includes activities that are meaningful to the students (Mueller, 2005). Based on these characteristics, the goal of authentic assessment is to prepare students for real life, and part of that goal is to prepare them for a fast-changing world.

When moving to the design of authentic assessments, consider using real, contemporary examples. In your quest for contemporary topics of interest, refer to Google Trends or the trending pages on social media platforms (e.g. Twitter). This information will allow you to gauge contemporary events and interests. To give students more autonomy, one approach is to guide them to choose a topic of interest and use the assessment to explore that topic. This is a great way to motivate learners by allowing them to bring themselves, their life circumstances and their values to the classroom (Wlodkowski & Ginsberg, 2017).

One way to design authentic assessments is to use an experiential learning assessment. For instance, you could allow the students to find a community partner. Are you teaching your students to code? You could ask them to find a community partner, such as a library or museum, that would benefit from a program coded to address its needs. The best approach to design is to ask yourself what evidence would allow you to know that the intended learning has occurred and to design around that expectation (Bull, 2015).

The University of Tennessee, Knoxville (n.d.) has more information on different types of experiential learning. If you wish to learn more about authentic assessment, I recommend exploring Ashford-Rowe et al.’s (2014) eight critical questions as a baseline to evaluate what constitutes authenticity within assessment activities. Finally, Mueller (2018) has also created a helpful framework for designing authentic assessment.

Principle 3: Alternative assessment is equitable

Designing alternative assessment that is equitable is a journey. The conversation on equity and assessment has expanded since I started this work, and one thing I am sure of is that there is much still to be done. In this section, I share insights from my work designing a massive open online course on equity in assessment design.

Equity in assessment is a broad concept that implies designing assessments that give every student the opportunity to succeed. Work on equity starts with oneself: it is important to examine positionality and bias in the design process. In addition, through assessment, you can help students build a community and a sense of belonging. Another approach to equity in assessment is to provide equitable feedback, for example by building feedback literacy in your class early on. Students come from different feedback cultures and respond to feedback differently. You can try to ensure that your feedback is free of cultural references and ambiguous English.

You can read more about equity in assessment in Montenegro and Jankowski’s (2020) report, A New Decade for Assessment: Embedding Equity Into Assessment Praxis.

Principle 4: Alternative assessment is flexible

Flexibility can take many forms at different levels. Flexibility in assessment recognises that students have a wide range of inflexible responsibilities, both academic and personal, and that instructors can provide alternatives to accommodate them. Rumsey (1994) explained that flexible assessment practices “can accommodate the scope of knowledge and skills encompassed by the assessment criteria, the variations in context in which assessment may be conducted, and the range of needs and personal situations of potential candidates” (p. 20). Wood and Smith (1999) distinguished several ways in which instructors can be flexible in assessment, including flexibility in components, style, tools, feedback, grouping, weighting, content and marking. Collis and Moonen (2011) suggested that flexibility consists of offering different options for the “how, what, where, when, and with whom” (p. 15) students partake in learning.

Giving assessment choice to students has been explored in various studies. Adams et al. (2017) found that giving students choice in their assessment increased their motivation and supported inclusion. Other studies found that choice of assessment has great potential to support equity of assessment in diverse student cohorts (Garside et al., 2009; Mogey et al., 2019; O’Neill, 2017). Most recently, O’Neill and Padden (2021) found that giving students choice in their assessment increased engagement and empowerment. In addition, students favoured a wider range of assessments than teachers perceived (Morris et al., 2019). Educators have taken multiple approaches to flexibility in assessment. Mealy (2018, 2019) showed how they applied flexibility in assessment over two consecutive years: in 2018, Mealy asked students to decide in the first week of the course what type of assessment they wanted; in 2019, following students’ feedback, Mealy adjusted the approach so that students could make decisions on a week-by-week basis. Leung and Kier (2017) gave students three grading schemes to choose from, whereas Pretorius et al. (2017) used a combination of allowing students to make decisions and providing them with choices. Smith (2021) redesigned his course with no deadlines during the pandemic to allow students to manage their workload.

If you wish to know more, you can consult Ryerson University’s (n.d.) Flexible Learning Resource; Teesside University’s (n.d.) blog post, “Flexible assessment for a hybrid model”, which offers best practices for flexible assessment; and “Flexibility in Assessment” in de Bie and Brown’s (2017) Forward with Flexibility. Additional practical examples of how to use choice in assessment can be found in UCD Teaching and Learning’s (2010) A Practitioner’s Guide to Choice of Assessment Methods Within a Module.

Principle 5: Alternative assessment is renewable

In a panel I organised for students to give their insights about the alternative assessment they experienced during the pandemic, one of the students indicated that the most meaningful assessment they had experienced was creating a Wikipedia page for the course. When asked why, the student indicated that they felt that the assessment was not forgotten after the submission, and they could build on it throughout the semester. They also felt that the assessment was not written only for the instructor’s eyes. Most importantly, the student said that they could go back to the assessment while taking other courses and make modifications. The student was happy that what they had written could reach a wide audience.

This example demonstrates the idea behind renewable assessments: students are no longer only consumers of knowledge; they are also producers of knowledge. Examples of renewable assessment include creating a website, a tutorial or even an open textbook with your students. You can see multiple examples of how instructors have created renewable assessments in Bruff’s (2013) blog entry on students as producers and in Wallis’s (n.d.) “Renewable assignments” chapter in Structured Renewable Assignments.

Principle 6: Alternative assessment is interdisciplinary

Interdisciplinary learning is defined as a process by which “learners integrate information, data, techniques, tools, perspectives, concepts, and/or theories from two or more disciplines to craft products, explain phenomena, or solve problems, in ways that would have been unlikely through single-disciplinary means” (Mansilla, 2017, p. 289). Interdisciplinary assessment allows students to make connections between the different pieces they are learning about. It helps prepare students for the 21st century (Klein, 2018) and allows them to develop a global view. Ringby and Duus’s (2017) article about using an innovation camp in health education is a good example of interdisciplinarity.

Principle 7: Alternative assessment is co-created

Partnership is an important aspect of alternative assessment. Partnership in this context is twofold, including partnership with and among students. Pedagogical partnership is “a collaborative, reciprocal process [whereby] … all participants have the opportunity to contribute equally, although not necessarily in the same ways, to curricular or pedagogical conceptualization, decision making, implementation, investigation, or analysis” (Cook-Sather et al., 2014, pp. 6–7). In brief, partnership in assessment implies that students must be a part of the entire assessment process. It also implies that students cannot simply be recipients of assessments; they must be central to the practices in the classroom.

Researchers who have explored the concept of students as partners, such as Cook-Sather et al. (2014), provide a wealth of information on the benefits of this approach for both the educator and the students. Hinchcliffe et al. (2021) published a student partnership in assessment guide, in which you can find practical examples. Student Partnerships in Quality Scotland (Sparqs, n.d.) has a participation matrix in its staircase exercise that will help you reflect on the role of students.

Partnership among students is also an important aspect of alternative assessment and allows educators to build a pragmatic sense of community in which students get to know and appreciate one another. This partnership can support students during difficult moments and build connections beyond the assessment.

I have collected a list of examples of what you can do to involve your students as partners in your assessment. One option is a class-constructed assessment rubric, in which students take part in building the rubric. This practice will help you gain an understanding of the different interpretations that students have of the criteria and the descriptors. It also empowers students to feel that their voices are being heard. You can always start by reviewing an existing rubric with your students instead of creating one from scratch.

Another idea is to use partnership in questions; for example, you could ask your students to provide assessment questions. Students can use these questions for practice and you can choose some of them to include in the assessment. If you are asking students to do presentations, you can ask them to provide assessment questions after their presentations. These questions could then be circulated to students for practice and some could be used in the assessment. PeerWise Publications (n.d.) provides a number of examples of co-created assessments on its website.

Decide together with students on the grading scheme, medium of assessment, deadline and the content of the assessment (a topic students are interested in). When giving feedback on assessments, involve students as partners. This involvement requires getting feedback from students on your assessment. You can also seek feedback on the syllabus. For example, provide a shared document link to the syllabus and ask students to provide comments. Another way to co-create assessment with students is to seek anonymous feedback from them on the assessment. A good practice is to curate the information you receive from the feedback, share it with your students and let them know what actions you will take.

Principle 8: Alternative assessment is continuous

According to Hernández (2012), continuous assessment serves a formative function through feedback and a summative function through grades. In addition, there are no additional tasks for a final exam – the assessment and the course tasks are the same. Continuous assessment strengthens the feedback function, as it gives students time to incorporate feedback before the next assessment and avoids the last-minute cramming that usually happens before a final exam (Trotter, 2006).

Continuous assessment reminds me of a discussion I had with an instructor who was worried that students were skipping a whole assignment worth 20% of the final grade. The assessment required them to learn a skill that was used only for that assessment. As a result, students who were not invested in learning that skill skipped the assessment entirely. During my conversations with the instructor, we both realised that it was a standalone assessment; skipping it did not affect the rest of the course. To avoid this situation, you can use continuous or embedded assessment, in which students need to use the knowledge gained in one assessment to continue learning throughout the course.

For more information on continuous assessment, check out Bjælde et al.’s (2017) article on continuous assessment in higher education in Denmark. Coimbra Group (2018) also has an informative White Paper chronicling current trends in assessment in Europe.

Principle 9: Alternative assessment is culturally responsive

Culturally responsive pedagogy acknowledges the significance of culture on student learning (Ladson-Billings, 1994). As Strange and Banning (2015) state, student cultures “can play an important role, for good or otherwise, in introducing students to and maintaining their engagement in the learning process” (p. 53). Culturally responsive assessment also creates opportunities for students to experience deep learning by honouring their prior knowledge and experiences. Adjacent and complementary to culturally responsive assessment are culturally sensitive assessment and culturally safe assessment.

Many significant studies have been conducted on cultural responsiveness in assessment, and I would like to mention a few to ground this discussion. First, it is important to understand the cultural disconnect and the fallacies related to culture and assessment. To learn more about this topic, you can explore Estrin and Nelson-Barber (1995), who outlined three aspects of cultural disconnect in assessment: (1) over-reliance on Eurocentric context and content within assessment instruments; (2) tension between individual versus collaborative strategies for assessment task completion; and (3) fixed versus flexible pacing and timelines for assessment completion. Sedlacek (1994) identified five fallacies related to culture and assessment, stressing that most measures have not been designed with non-traditional or underserved populations in mind.

Alternative assessment is also culturally relevant. The student body is becoming increasingly culturally, linguistically and ethnically diverse. Our students’ diversity influences the way they perceive information and make sense of it. Work on creating culturally responsive assessment is a journey of reflection and action.

Principle 10: Alternative assessment is engaging

There are many recommendations about the importance of making your lessons engaging for all students. Engagement can take many forms, such as creating fun, collaborative learning and interactions with peers, educators and the community. Part of making your lessons engaging includes designing engaging assessment. Given that other aspects of engagement, such as collaboration, interaction and community, have been covered in previous subsections, I focus here on engaging, fun and creative assessment.

A scan of the literature shows that many instructors have used creative assessments to engage and motivate their students. Of the many examples, I describe using social media and gamification to engage learners. Content on social media takes multiple forms: short videos, character-limited posts, longer posts, visual posts or strictly professional posts. You can engage learners with social media content or social media–like content, meaning that learners create the content but do not necessarily post it online. One example I particularly like is asking students to create a meme about the course content. I then take two minutes to go over the memes at the beginning of the next class. This technique engages students in a fun way while also reviewing the content.

Gamification in assessment uses elements of games in the assessment design. Gamification is known to decrease test anxiety (Pitoyo, 2019) and increase learners’ motivation and engagement (Menezes & Bortoli, 2016). Approaches to gamification in assessment are varied, such as choose-your-own-adventure, badges, unique reward systems and checkpoints. Elements of gamification can be seen in everyday life, such as progress bars, the galaxy achievement system in Khan Academy and trophies in Snapchat.

In this subsection, I highlight the choose-your-own-adventure approach to gamified assessment. The concept debuted in the 1970s and became popular in the 1980s; more recently, it was popularised by Netflix’s interactive Black Mirror film, Bandersnatch (Sims, 2018). This interactive, nonlinear approach has been used in books, tutorials, quizzes, presentations and orientations. In education, for example, it has been used in medicine and pharmacy (Scott et al., 2021), anesthesia (Buhl et al., 2021), podiatry (Wilson-Stewart, 2017), engineering (Edington et al., 2020) and theology (Duperon, 2017).

For more concrete examples of how to design choose-your-own-adventure assessments, I recommend Stachowiak’s (2015a; 2015b) two-part blog post on her personal experiences. Tucker (2020) also has an engaging blog post on designing a choose-your-path learning adventure.

Conclusion

The goals of this chapter were threefold. First, I wanted to raise awareness about alternative assessment and stimulate conversation about its characteristics and potential for increased student learning. To that end, I have presented the 10 principles of alternative assessment as a practical blueprint. Second, I wanted to ground the 10 principles in research and praxis: they are based on common standards used by educators and designers, and I have given examples to help educators apply them.

The principles are a work in progress. Most importantly, the principles are stronger when they are contextualised. I expect them to evolve when instructors apply them in their contexts. I call on instructors to explore the strengths and weaknesses of these principles and to share them with the community. I also encourage instructors to seek feedback from students and to reflect on them. Finally, we need to acknowledge the central role that institutional support and institutional resources play in encouraging instructors to explore alternative assessment.

Now it is time to continue the conversation about alternative assessment to meet the third goal: raising awareness of the peripheries that sit at the border of traditional assessment and that disadvantage some students. For this goal, each of us has a part to play.

References

Adams, N., Little, T. D., & Ryan, R. M. (2017). Self-determination theory. In L. M. Wehmeyer, K. A. Shogren, T. D. Little, & S. J. Lopez (Eds.), Development of self-determination through the life-course (pp. 47–54). Springer Netherlands.

Ashford-Rowe, K., Herrington, J., & Brown, C. (2014). Establishing the critical elements that determine authentic assessment. Assessment & Evaluation in Higher Education, 39(2), 205–222. https://doi.org/10.1080/02602938.2013.819566 

Bjælde, O. E., Jørgensen, T. H., & Lindberg, A. B. (2017). Continuous assessment in higher education in Denmark: Early experiences from two science courses. https://stll.au.dk/fileadmin/stll.au.dk/Continuous_assessment_in_higher_education_in_Denmark_-_Early_experiences_from_two_science_courses.pdf

Bloxham, S. (2007, October 26). A system that is wide of the mark. Times Higher Education Supplement. www.timeshighereducation.co.uk/story.asp?storyCode=310924&sectioncode=26 

Bruff, D. (2013, September 3). Students as producers: An introduction. Vanderbilt University. https://cft.vanderbilt.edu/2013/09/students-as-producers-an-introduction/ 

Buhl, L. K., Wong, V., & Jones, S. B. (2021). Teaching medical students to create anesthetic plans using a branched-chain learning module. A & A Practice, 15(4), e01446. https://doi.org/10.1213/xaa.0000000000001446

Bull, B. (2015). Learning beyond letter grades: Exploring the promise, power & possibility of feedback & assessment [Webinar]. Educause. https://events.educause.edu/eli/courses/2015/eli-course-learning-beyond-letter-grades-exploring-the-promise--possibility-of-assessment

Burke, P., & Jackson, S. (2007). Reconceptualising lifelong learning: Feminist interventions. Routledge.

CAST (2018). Universal Design for Learning Guidelines version 2.2. http://udlguidelines.cast.org 

Coimbra Group. (2017). Current trends in assessment in Europe—The way forward. https://www.coimbra-group.eu/wp-content/uploads/WP-Trends-in-assessment-FINAL.pdf 

Collis, B., & Moonen, J. (2011). Flexibility in higher education: Revisiting expectations. Comunicar: Media Education Research Journal, 19(2). https://doi.org/10.3916/c37-2011-02-01 

Cook-Sather, A., Bovill, C., & Felten, P. (2014). Engaging students as partners in learning and teaching: A guide for faculty. Jossey-Bass.

Curtis, N. A., & Anderson, R. D. (2021). Moving toward student–faculty partnership in systems-level assessment: A qualitative analysis. International Journal for Students as Partners, 5(1), 57–75. https://doi.org/10.15173/ijsap.v5i1.4204 

De Bie, A., & Brown, K. (2017). Flexibility in assessment. In Forward with Flexibility: A teaching and learning resource on accessibility and inclusion. McMaster University. https://flexforward.pressbooks.com/chapter/flexibility/

Duperon, M. (2017). A choose‐your‐own‐adventure strategy for unlocking theory in religious studies: Describe a successful classroom teaching tactic that could be replicated by other instructors. Teaching Theology & Religion, 20(3), 263–264. https://doi.org/10.1111/teth.12394

Eberly Center. (n.d.). Give students choice, where appropriate. Carnegie Mellon University. https://www.cmu.edu/teaching/designteach/teach/classroomclimate/strategies/choice.html 

Edington, S., Cameratti-Baeza, C. G., Knudsen, R., & Marsik, F. J. (2020, June). Choose your own adventure: Introducing student choice into a first-year experience course [Paper presentation]. 2020 ASEE Virtual Annual Conference. https://doi.org/10.18260/1-2--34282

Elkhoury, E. (with R. D. Winkler). (2020, November 23). A guide to alternative assessments (Version 1.0). https://www.yorku.ca/teachingcommons/wp-content/uploads/sites/38/2021/07/Guide_alternative_assessments-UPDATED-July-2021.pdf

Estrin, E. T., & Nelson-Barber, S. (1995). Issues in cross-cultural assessment: American Indian and Alaska native students. Far West Laboratory, 12, 18. https://www.wested.org/resources/issues-in-cross-cultural-assessment-american-indian-and-alaska-native-students/

Frey, B. B., Schmitt, V. L., & Allen, J. P. (2012). Defining authentic classroom assessment. Practical Assessment, Research, and Evaluation, 17(1), Article 2. https://doi.org/10.7275/sxbs-0829

Garside, J., Nhemachena, J. Z., Williams, J., & Topping, A. (2009). Repositioning assessment: Giving students the ‘choice’ of assessment methods. Nurse Education in Practice, 9(2), 141–148. https://doi.org/10.1016/j.nepr.2008.09.003

Gulikers, J. T., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(3), 67–86. https://doi.org/10.1007/BF02504676

Hartley, J., Betts, L., & Murray, W. (2007). Gender and assessment: Differences, similarities and implications. Psychology Teaching Review, 13(1), 34–43.

Hernández, R. (2012). Does continuous assessment in higher education support student learning? Higher Education, 64(4), 489–502. https://doi.org/10.1007/s10734-012-9506-7 

Hinchcliffe, T., Bovill, C., & Matthews, K. (2021). Student partnership in assessment (SPiA). Advance HE. https://www.advance-he.ac.uk/knowledge-hub/student-partnership-assessment-spia?s=09

Hounsell, D. (2007). Integrative assessment: Blending assignments and assessments for high-quality learning (Guide No. 3). The Quality Assurance Agency for Higher Education.

Kellaghan, T., & Greaney, V. (2019). Public examinations examined. World Bank Publications.

Khalifa, M. A., Gooden, M. A., & Davis, J. E. (2016). Culturally responsive school leadership: A synthesis of the literature. Review of Educational Research, 86(4), 1272–1311. https://doi.org/10.3102/0034654316630383

Kirkness, V. J., & Barnhardt, R. (1991). First Nations and higher education: The four Rs—respect, relevance, reciprocity, responsibility. Journal of American Indian Education, 30(3), 1–8. https://www.afn.ca/uploads/files/education2/the4rs.pdf

Klein, J. T. (2018). Learning in transdisciplinary collaborations: A conceptual vocabulary. In D. Fam, L. Neuhauser, & P. Gibbs (Eds.), Transdisciplinary theory, practice and education (pp. 11–23). Springer.

Ladson-Billings, G. (1994). What we can learn from multicultural education research. Educational Leadership, 51(8), 22–26.

Leathwood, C. (2005). Assessment policy and practice in higher education: Purpose, standards and equity. Assessment and Evaluation in Higher Education, 30(3), 307–324.

Leung, A., & Kier, C. A. (2017). On students’ perception of a multi-scheme assessment method. Journal for Economic Educators, 17(1), 40–52. https://libjournals.mtsu.edu/index.php/jfee/article/view/1513/1091

Mansilla, V. B. (2017). Interdisciplinary learning: A cognitive-epistemological foundation. In R. Frodeman, J. Thompson Klein, & R. C. Dos Santos Pacheco (Eds.), The Oxford handbook of interdisciplinarity (2nd ed.). Oxford University Press.

Mealy, B. J. (2018, March 25). Assessing student assessment in a flipped classroom [Paper presentation]. American Society for Engineering Education (ASEE) Zone IV Conference, Boulder, CO. 

Mealy, B. J. (2019, March 20). Enhancing student success using flexible assessment [Paper presentation]. American Society for Engineering Education (ASEE) PNW Conference. https://peer.asee.org/31877

Menezes, C. C. N., & Bortoli, R. D. (2016). Potential of gamification as assessment tool. Creative Education, 7(4), 561–566. https://doi.org/10.4236/ce.2016.74058

Mogey, N., Purcell, M., Paterson, J. S., & Burk, J. (2019). Handwriting or typing exams—Can we give students the choice? In F. Khandia (Ed.), 12th CAA International Computer Assisted Assessment Conference: Proceedings of the conference on 8th and 9th July 2008 at Loughborough University (pp. 221–238). Loughborough University. https://hdl.handle.net/2134/5555

Montenegro, E., & Jankowski, N. A. (2017, January). Equity and assessment: Moving towards culturally responsive assessment (Occasional Paper #29). National Institute for Learning Outcomes Assessment. https://files.eric.ed.gov/fulltext/ED574461.pdf

Montenegro, E., & Jankowski, N. A. (2020, January). A new decade for assessment: Embedding equity into assessment praxis (Occasional Paper #42). National Institute for Learning Outcomes Assessment. https://files.eric.ed.gov/fulltext/ED608774.pdf

Morris, C., Milton, E., & Goldstone, R. (2019). Case study: Suggesting choice: Inclusive assessment processes. Higher Education Pedagogies, 4(1), 435–447. https://doi.org/10.1080/23752696.2019.1669479

Mueller, J. (2005). The authentic assessment toolbox: Enhancing student learning through online faculty development. Journal of Online Learning and Teaching, 1(1), 1–7. https://jolt.merlot.org/documents/vol1_no1_mueller_001.pdf

Mueller, J. (2018). What is authentic assessment? Authentic Assessment Toolbox. http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm 

Noddings, N. (1984). Caring: A feminine approach to ethics and moral education. University of California Press.

O’Neill, G. (Ed.). (2011). A practitioner’s guide to choice of assessment methods within a module. University College Dublin Teaching and Learning. https://www.ucd.ie/teaching/t4media/choice_of_assessment.pdf

O’Neill, G. (2017). It’s not fair! Students and staff views on the equity of the procedures and outcomes of students’ choice of assessment methods. Irish Educational Studies, 36(2), 221–236. https://doi.org/10.1080/03323315.2017.1324805

O’Neill, G., & Padden, L. (2021). Diversifying assessment methods: Barriers, benefits and enablers. Innovations in Education and Teaching International, 1–12. https://doi.org/10.1080/14703297.2021.1880462

Orr, S. (2007). Assessment moderation: Constructing the marks and constructing the students. Assessment & Evaluation in Higher Education, 32(6), 645–656.

PeerWise Publications. (n.d.). List of publications relating to PeerWise. https://peerwise.cs.auckland.ac.nz/docs/publications/ 

Pitoyo, M. D. (2019). Gamification based assessment: A test anxiety reduction through game elements in Quizizz platform. Indonesian Journal of Educational Research, 4(1), 22–32. 

Preston, J. P., & Claypool, T. R. (2021). Analyzing assessment practices for Indigenous students. Frontiers in Education, 6. https://doi.org/10.3389/feduc.2021.679972  

Pretorius, L., van Mourik, G. P., & Barratt, C. (2017). Student choice and higher-order thinking: Using a novel flexible assessment regime combined with critical thinking activities to encourage the development of higher order thinking. International Journal of Teaching and Learning in Higher Education, 29(2), 389–401. https://files.eric.ed.gov/fulltext/EJ1146270.pdf 

Ringby, B., & Duus, L. (2017). Innovation camp as a method in health education: A study on interdisciplinarity, learning and participation. European Journal of Physiotherapy, 19(1), 22–24. https://doi.org/10.1080/21679169.2017.1381308

Rumsey, D. (1994). Assessment practical guide. Australian Government Publishing Service.

Ryerson University. (n.d.). Flexible learning resource. https://www.ryerson.ca/content/dam/mental-health-wellbeing/Documents/Flex-Learning-Res-Full.pdf

Sadler, D. (2009). Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education, 34(2), 159–179. https://doi.org/10.1080/02602930801956059

Scott, D., Cernasev, A., & Kiles, T. M. (2021). Reimagining pharmacy education through the lens of a choose your own adventure activity—a qualitative evaluation. Pharmacy, 9(3), 151. https://doi.org/10.3390/pharmacy9030151

Sedlacek, W. E. (1994). Issues in advancing diversity through assessment. Journal of Counseling and Development, 72, 549–553.

Sims, D. (2018, December 28). The branching horrors of Black Mirror’s ‘Bandersnatch’. The Atlantic. https://www.theatlantic.com/entertainment/archive/2018/12/black-mirror-chooses-your-own-adventure-bandersnatch/579118/

Singer-Freeman, K. E., & Robinson, C. (2020). Grand challenges for assessment in higher education. Research & Practice in Assessment, 15(2), 1–20. https://www.rpajournal.com/dev/wp-content/uploads/2020/11/Grand-Challenges-for-Assessment-in-Higher-Education.pdf

Smith, J. A. (2021, June 17). Pedagogy in a pandemic: Teaching without exams. YorkU. https://www.yorku.ca/professor/drsmith/2021/06/17/pedagogy-in-a-pandemic-teaching-without-exams/

Sparqs. (n.d.). Student partnership staircase. sparqs. https://www.sparqs.ac.uk/resource-item.php?item=254

Stachowiak, B. (2015a, May 5). Choose your own adventure learning. Teaching in Higher Ed. https://teachinginhighered.com/2015/05/05/choose-your-own-adventure-learning/

Stachowiak, B. (2015b, June 16). Choose your own adventure learning (Part 2). Teaching in Higher Ed. https://teachinginhighered.com/2015/06/16/choose-your-own-adventure-learning-part-2/ 

Strange, C. C., & Banning, J. H. (2015). Designing for learning: Creating campus environments for student success. John Wiley & Sons.

Struyven, K., Dochy, F., & Janssens, S. (2005). Students’ perceptions about evaluation and assessment in higher education: A review. Assessment & Evaluation in Higher Education, 30(4), 325–341.

Teesside University. (n.d.). Flexible assessment for a hybrid model. https://blogs.tees.ac.uk/lteonline/learning-and-teaching/flexible-hybrid-assessment/

Trotter, E. (2006). Student perceptions of continuous summative assessment. Assessment & Evaluation in Higher Education, 31(5), 505–521. https://doi.org/10.1080/02602930600679506

Tucker, C. (2020). Design a choose your path learning adventure. https://catlintucker.com/2020/10/choose-your-own-adventure/

UCD Teaching and Learning. (2010). A practitioner’s guide to choice of assessment methods within a module: Case studies from University College Dublin. https://www.ucd.ie/teaching/t4media/choice_of_assessment.pdf 

Universal Design for Learning. (n.d.). Choice of assessment: Multiple means of expression. https://alludl.ca/create/assessment/choice-of-assessment/ 

University of Tennessee, Knoxville. (n.d.). Experience learning: Teaching and innovation. https://experiencelearning.utk.edu/about/types/

Verwood, R., Mitchell, A., & Machado, J. (2011). Supporting Indigenous students through a culturally relevant assessment model based on the medicine wheel. Canadian Journal of Native Education, 34(1), 49–66. 

Villarroel, V., Bloxham, S., Bruna, D., Bruna, C., & Herrera-Seda, C. (2018). Authentic assessment: Creating a blueprint for course design. Assessment & Evaluation in Higher Education, 43, 840–854. https://doi.org/10.1080/02602938.2017.1412396

Wallis, P. (n.d.). Renewable assignments. In Structured renewable assignments. Pressbooks. https://uw.pressbooks.pub/structuredrenewableassignments/chapter/renewable-assignments/

Wilson-Stewart, K. (2017). Choose your own adventure in podiatry. Medical Education, 51(5), 539. https://doi.org/10.1111/medu.13311

Wlodkowski, R. J., & Ginsberg, M. B. (2017). Enhancing adult motivation to learn: A comprehensive guide for teaching all adults. John Wiley & Sons.

Wood, L. N., & Smith, G. H. (1999). Flexible assessment. In W. Spunde, R. Hubbard, & P. Cretchley (Eds.), The challenge of diversity: Proceedings of the [Delta] ‘99 symposium on undergraduate mathematics (pp. 154–158). University of Southern Queensland Press.

Eliana Elkhoury

Athabasca University

Dr. Eliana Elkhoury is an assistant professor of Human Experience in Open, Digital, and Distance Education at Athabasca University. In her role, she conducts research on alternative assessment practices in multiple disciplines, equity in assessment, and innovation in teaching and learning.
