The next step in our framework is to plan your learning assessments. In this chapter, we will discuss the alignment of assessments with course outcomes, different types of assessments, and strengths and weaknesses of assessments. As always, a formative evaluation will be conducted to help you thoroughly think through your design decisions.
Chapter Learning Outcome: I can develop an assessment plan that aligns with my learning outcomes.
Sub-section outcomes:
You will use the skills presented in this chapter to develop a digital assessment plan for your blended course.
Open the Digital Assessment Plan Template and save a copy. Label the copy Your Name Assessment Plan. (Example: Hyun Joo Assessment Plan.)
Each section in this chapter guides you through elements of your digital assessment plan and progress toward a completed plan for your course.
This chapter discusses the third part of the course design framework: Learning Assessments. We will introduce you to some of the big ideas around digital learning assessment and help you get started designing a digital assessment plan for your blended course. Use the resources listed at the end of each lesson to dive deeper into any of the assessment topics you are interested in.
In this section, you will begin to address alignment.
Learning Outcome: I can plan assessments that are aligned with my course learning outcomes.
Assessment: Blueprint Challenge Part 2 Alignment Table.
What to look for: Why alignment is important and one way to ensure alignment.
As you align assessments with your learning outcomes, consider Bloom's Taxonomy to help guide your choices. Keep in mind that higher-order outcomes require assessments that measure higher-order skills. Figure 4-1 shows some examples of coordinating Bloom's higher-order outcomes and different assessments. However, this does not show all possibilities, so use your own creativity and judgment as you build your assessment plan.
You can use an alignment table to articulate the relationship between your desired learning outcomes and the assessments you are choosing. Table 4-1 displays learning outcomes from various disciplines and examples of assessments aligned or misaligned with those outcomes.
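One way to make the alignment check concrete is to treat each outcome and each assessment as targeting a Bloom's level and flag any assessment that demands less than the outcome requires. The sketch below is purely illustrative (the `BLOOM` ordering follows the standard taxonomy; the `is_aligned` helper is a hypothetical name, not part of any framework):

```python
# Bloom's levels in ascending order of cognitive demand.
BLOOM = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

def is_aligned(outcome_level, assessment_level):
    """An assessment aligns when it demands at least the outcome's level."""
    return BLOOM.index(assessment_level) >= BLOOM.index(outcome_level)

# An art outcome at the Create level (cf. Table 4-1):
print(is_aligned("create", "create"))    # True: produce an original still life
print(is_aligned("create", "remember"))  # False: multiple-choice on art terms
```

The point of the ordering is the one made above: an assessment pitched below the outcome's level can never show that the outcome was met, no matter how well learners score on it.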
Table 4-1. Example of alignment table for possible assessments for various learning outcomes
Learning Outcome (with Bloom’s Taxonomy level) | Aligned Assessments | Misaligned Assessments
Art Learning Objective: Bloom’s Taxonomy—Create. |
|
(Neither of these assessments evaluates the students' ability to create and shade their own still life drawing.) |
History Learning Objective: Bloom’s Taxonomy—Analyze. |
|
(These assessments do not require the student to analyze or compare the causes.) |
Physical Health Learning Objective: Bloom’s Taxonomy—Evaluate, Create. |
|
(Knowing how to do a specific activity and being assigned to do it are not the same as creating a personalized plan to improve physical health.) |
Chemistry Learning Objective: Bloom’s Taxonomy—Apply. |
|
(Knowing the steps in a process and knowing chemistry terminology are not the same as being able to balance chemical equations.) |
Blueprint Challenge Part 2 Alignment Table.
Open your assessment plan document.
Read the directions for Part 2 Alignment Table. Review the example. Then fill in the table with your ideas.
In this section, you will learn about a variety of learning assessments you may use in your blended course. You will first review the distinctions between formative and summative assessments. Then you will see examples of authentic and renewable assessments, which are particularly relevant in a digital context. Finally, you will have a chance to explore what variety of assessments makes sense for your blended course.
Learning Outcome: I can identify a variety of assessments for use in my course.
Assessment: Blueprint Challenge Part 1 Assessment Brainstorming Table - First Column.
Learning assessments allow you to measure what a learner has learned from instruction and learning activities. They can be formative or summative.
As you consider potential assessments for your blended course, you may want to consider both formative and summative assessment options as described in Table 4-3.
Table 4-3. Comparing Summative and Formative Assessments
| Summative Assessment OF Learning | Formative Assessment FOR Learning
Purpose | To evaluate learner’s overall knowledge and performance. | To monitor learners' progress and identify areas for improvement. |
Focus | A finished product or completed unit of study, showing evidence of competency (at least 80% proficiency). | The learning process, showing evidence of thinking and understanding. |
Uses | Gives a grade, score, or ranking that represents learner's competency. Sometimes used to determine if an instructor needs to reteach a unit to the entire class, or if a learner needs additional tutoring in one content area. | Can guide the instructor's methods, processes, and pacing. Can inform learners' management of self-regulated learning. |
Timing | At the end of a unit or course. | Should happen throughout the learning process. |
Feedback | Feedback is often limited, since the primary purpose is a summary evaluation of learner performance. | Provides timely and specific feedback to learners that helps them make improvements to their learning. |
Examples | Unit tests, final exams, standardized tests. Projects, performances, and portfolios. | Class discussions, quizzes, homework assignments, parts of a larger project. |
NOTE: Assessments can occasionally overlap, as when a unit test serves as a summative assessment of the unit AND a formative assessment of the course.
Table 4-4. Examples of Various Types of Assessments
Type of Assessment | Description |
Quizzes and Exams |
|
Essays and Essay Tests |
|
Short Answer Questions |
|
Research Papers |
|
Presentations |
|
Demonstrations |
|
Projects |
|
Discussions |
|
Simulations |
|
Labs |
|
Case Studies |
|
Portfolios |
|
Surveys or Self-Assessments |
|
Peer Reviews |
|
Performance Tasks |
|
Concept Maps/Graphic Organizers |
|
Reflective Journals |
|
Playlists or Choice Boards |
|
Grading traditional assessments is straightforward: fill-in-the-blank, multiple-choice, and matching questions typically found in quizzes and exams can be graded automatically and objectively. For the alternative assessments listed above, rubrics make grading practical. Rubrics are tools for measuring learner performance on an assessment; a rubric identifies and outlines the specific criteria that matter.
The two most common types of rubrics are holistic and analytic. They give different levels of feedback at different points of the learning process. There are sometimes different names and labels used for these rubrics, and some may have hybrid elements, but most rubrics are derivatives of either holistic or analytic. Table 4-5 explains some of the characteristics of these types of rubrics.
Table 4-5. Characteristics of Holistic and Analytic Rubrics
Holistic | Analytic |
Focuses on the product. | Focuses on the process. |
Grades based on the overall quality. | Grades based on individual characteristics that are scored separately and then added together for the total. |
Gives general feedback. | Gives specific feedback. |
Let's imagine, for example, that you taught a unit on using rhetorical writing to persuade an audience. The learners were assigned to write a short opinion article on a topic related to the university. For a draft of the opinion article, you could use an analytic rubric, which gives feedback the learners can use to revise their writing. For the finished opinion article, you could use a holistic rubric. Table 4-6 shows an example of an analytic rubric, and Table 4-7 shows an example of a holistic rubric. Notice how they address the same characteristics, but in different ways.
Table 4-6. An example of an analytic rubric for a first draft of an opinion article
| Meets or exceeds requirements. Good work. (1 point each) | Below requirements. Needs some revision. I suggest you talk with me. (0.5 points each) | Far below requirements. Needs substantial revision. You must talk with me. (0 points each)
Topic | The topic is appropriate in scope for the assignment. | The topic's scope is too big or too small for the assignment. | The topic is unclear and/or confusing. |
Thesis Statement | The thesis statement clearly communicates the writer’s opinion/evaluation of the topic of the paper. | The thesis statement vaguely communicates the author’s opinion or evaluation of the topic. | There is no thesis statement. |
Topic Sentences | The topic sentences clearly state the main idea or purpose of each body paragraph. | The topic sentences somewhat communicate the main ideas of the body paragraphs. | There are no topic sentences. |
Organization | The paragraphs are organized in a sensible progression through the main argument. | The organization links some of the ideas together but lacks the progression of ideas. | There is no organization. |
Content and use of rhetoric - Ethos | There is one powerful or relevant real-world example of Ethos in the supporting details (with documentation). | There is one real-world example of Ethos, but it is weak or lacking documentation. | There is no example of Ethos. |
Content and use of rhetoric - Logos | There is one powerful or relevant real-world example of Logos in the supporting details (with documentation). | There is one real-world example of Logos, but it is weak or lacking documentation. | There is no example of Logos. |
Content and use of rhetoric - Pathos | There is one powerful or relevant real-world example of Pathos in the supporting details (with documentation). | There is one real-world example of Pathos, but it is weak or lacking documentation. | There is no example of Pathos. |
_____ points | _____ points | _____ + _____ = _____ out of 7 |
First, notice that this rubric does not include anything about grammar, spelling, punctuation, or vocabulary. This is a rubric for a first draft and focuses on higher-order concerns, such as topic, content, strength of argument, and organization. A rubric for a third or fourth draft might focus on lower-order concerns, such as grammar, punctuation, and vocabulary.
Second, notice that this rubric focuses on giving feedback to the learners, so they know what to revise and if they need to talk with the instructor for help.
Third, consider the strengths of this rubric: This rubric assesses if the learners used rhetoric in their writing. It provides specific, useful feedback so the learners can make improvements to their writing. The instructor receives insight about what the learners understand and what topics need to be reviewed in class.
The instructor can choose to either give a completion score in the gradebook for submitting a draft or give points for each criterion and add the points together for a total score. Both are acceptable ways of recording learner performance on the assessment.
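The "points per criterion, summed for a total" arithmetic of an analytic rubric can be sketched in a few lines of Python. The criterion names and point values below mirror Table 4-6; the `analytic_score` function itself is a hypothetical illustration, not part of any gradebook software:

```python
# Point value for each rating level, as in the analytic rubric above.
POINTS = {"meets": 1.0, "below": 0.5, "far_below": 0.0}

def analytic_score(ratings):
    """Each criterion is rated independently; the ratings are summed."""
    return sum(POINTS[level] for level in ratings.values())

draft_ratings = {
    "topic": "meets",
    "thesis_statement": "meets",
    "topic_sentences": "below",
    "organization": "meets",
    "ethos": "below",
    "logos": "meets",
    "pathos": "far_below",
}

total = analytic_score(draft_ratings)
print(f"{total} out of {len(draft_ratings)}")  # 5.0 out of 7
```

Because each criterion is scored separately, the per-criterion ratings double as the specific feedback the learner uses to revise; the total is just a byproduct.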
Table 4-7. An example of a holistic rubric for a finished opinion article
Grade | Description |
A |
|
B |
|
C |
|
F |
|
First, notice that this rubric gives feedback similar to the analytic rubric, but it is less helpful for making revisions. The product is evaluated as a whole, and the most accurate overall score is selected. A learner may have a great thesis statement but poor topic sentences and poor use of rhetoric, and would thus receive the intermediate score of a B (or whatever value the instructor chooses to give that level).
Second, notice that this instructor's rubric offers any learner with a score below an A one week to revise the paper and improve the grade if they choose.
Third, consider the strengths of this rubric: This rubric assesses how well the learners used rhetoric in their writing. It provides general feedback so each learner can understand their grade. The instructor can evaluate what the learner knows and award an appropriate grade along with guidance for potential improvement.
You need to decide what you want to communicate to the learner.
If the assessment is formative:
If the assessment is summative:
The answers to these questions will help you decide which type of rubric might be best, what criteria to use to evaluate the learner's work, and what to include in the rubric.
Within the categories of formative and summative rubrics, there are many ways to structure them. For more information about rubrics, see the resources section of this chapter.
Blueprint Challenge Part 1 Assessment Brainstorming Table - First Column.
Open your assessment plan document. Read the directions for Part 1 Assessment Brainstorming Table.
Review the example in the first column. Then fill in the first column with your assessment ideas.
In this section, you will learn how to evaluate the strengths and limitations of digital assessments. You will also be introduced to how different types of digital rubrics can help with the assessment strategy for your blended course.
Learning Outcome: I can evaluate the strengths and limitations of my digital assessments.
Assessment: Blueprint Challenge Part 1 Assessment Brainstorming Table - Second Column.
Technology makes it possible to enhance the assessment of student learning. Figure 4-2 below highlights some of the ways in which learning assessment is evolving in a digital world.
When deciding whether to use digital assessments or more traditional offline assessments, it is important to consider the strengths and limitations of your different options. Table 4-8 shares examples of how digital assessments can benefit student learning, access, flexibility, and efficiency.
Table 4-8. Some Benefits of Digital Assessments
Benefits of Digital Assessments (DA) | Examples |
Student Learning |
|
Access & Flexibility |
|
Efficiency |
|
Authentic assessment engages learners in real-world applications and problem-solving tasks, which require learners to apply their knowledge in real-world contexts. This deepens learner engagement with the content and fosters greater connections between theory and practice. It also increases motivation because learners perceive greater relevance to their own experience and transferability to their world outside of school. Video 4-3 below presents a brief introduction to authentic assessments.
Below are some questions adapted from Burton (2011, p. 25) that can be used to evaluate the authenticity of a learning assessment.
One assessment may not incorporate every principle listed above, but implementing any of them will increase its authenticity. Table 4-9 contains some examples of authentic assessments in different academic domains. Figure 4-3 shows an example from a psychology class.
Table 4-9. Examples of Authentic Assessments
Field | Example of Authentic Assessment |
Art | Create an original artwork inspired by a specific art movement or theme. |
Business | Develop a comprehensive business plan for a new startup venture. |
Education | Design and implement a lesson plan for teaching and assessing student learning. |
Engineering | Design and build a functional model or prototype of a complex machine or structure. |
Geology | Map and interpret geological formations in the field. |
History | Conduct archival research with primary sources and write a historical analysis paper. |
Psychology | Design and conduct a psychological experiment to test a hypothesis. |
Here is an example of what an authentic digital assessment might look like in a Psychology Research Methods class:
Learning Objectives:
Research Proposal Evaluation:
Instructions — To practice the skills you learned in this module and ready yourself for potential research in an area that interests you, complete the following tasks:
Research Ideas:
Renewable Assessments are a type of authentic assessment whose value extends beyond the scope of the classroom. Most classroom assessments are disposable: they have no lasting value once the course ends. Renewable assessments, by contrast, produce artifacts that remain useful beyond the classroom and the assessment itself. They are often associated with internships or projects that serve the local community.
By creating authentic artifacts with a sense of permanence, renewable assessments enable learners to demonstrate their understanding, critical thinking, and practical skills in a manner that can have lasting value and relevance in their respective fields. Table 4-10 has some examples of renewable and disposable assessments in various fields.
Table 4-10. Examples of Renewable and Disposable Assessments
Field | Renewable Assessment | Disposable Assessment
Art | Create an art portfolio that can be used to represent your skills to future employers. | Demonstrate your understanding of complementary color palettes and the color wheel. |
Business | Develop a comprehensive business plan for a client with an idea for a new small business. | Take an exam on essential elements of a comprehensive business plan. |
Education | Design a worksheet or learner assignment that can be used in a mentor instructor's class. | Design a lesson plan for the class that the learner never intends to use. |
Engineering | Work with a non-profit organization to create a solution to a real-world challenge. | Solve problems on an exam that demonstrate your understanding of an engineering principle. |
Geology | Work with local earth science school instructors to create a podcast answering learner questions about the earth. | Map and interpret geological formations in the field. |
History | Conduct archival research with primary sources and update a Wikipedia entry on the topic to share the knowledge with the community. | Do historical research using primary sources and write an essay to share with the class in a final presentation. |
Psychology | Work with a local school district to create posters for the schools with accurate information about mental health services in the community. | Conduct research and write a report on the mental health resources available in the local community. |
Digital assessments have some limitations when compared to traditional offline assessments. Below is a list of some of those limitations.
Blended courses allow for assessments in both online and offline modalities. It is important for you as an instructor to consider the strengths and limitations of the assessments that you design for your course. You might consider creating a table like the example in Table 4-11 as you consider whether to use digital or offline assessments for your blended course.
Table 4-11. Example of Comparing the Strengths of Digital and Offline Modalities for Different Types of Assessments
Type of Assessment | Online Assessment Strengths | Offline Assessment Strengths |
Quizzes and Exams |
|
|
Live Presentations and Physical Demonstrations |
|
|
Projects & Portfolios |
|
|
Papers |
|
|
Discussion Participation |
|
|
Blueprint Challenge Part 1 Assessment Brainstorming Table - Second Column.
Open your assessment plan document. Read the directions for Part 1 Assessment Brainstorming Table.
Review the example in the second column. Then fill in the second column with your assessment evaluation.
Academic Integrity is the ethical and honest pursuit of learning in an academic setting. Instructors have the responsibility to encourage learners to be honest in their academic work as well as to ensure practices that promote academic integrity. This section will address several course design strategies that can help reduce academic dishonesty. We will discuss the following:
Learning Outcome: I can develop an assessment plan that addresses academic integrity.
Assessment: Blueprint Challenge Part 1 Assessment Brainstorming Table — Third Column.
Many educational situations create or provide unintended incentives to cheat because the consequences of failure are so high. For example, a final exam worth 25% of the course grade could cause students to feel they must succeed at any cost because performing poorly on the exam would have a dramatic impact on their final course grade. High-stakes assessments usually occur at a single point in time, typically at the end of a learning unit.
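The stakes in the example above are easy to quantify: with a 25% exam weight, even strong coursework elsewhere cannot absorb a failing exam score. The weighting below is hypothetical, chosen only to illustrate the arithmetic:

```python
def course_grade(exam_pct, other_pct, exam_weight=0.25):
    """Weighted average of a final exam and all other coursework."""
    return exam_weight * exam_pct + (1 - exam_weight) * other_pct

# A learner with 90% on all other coursework who fails the exam at 40%:
print(course_grade(40, 90))  # 77.5 -- more than a full letter grade lost
```

A single hour of testing pulling a 90% record down to a C+ range is exactly the pressure that tempts learners to succeed at any cost.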
In a mastery-based approach, by contrast, assessments are often formative and diagnostic in nature. There is a mastery threshold that learners are expected to reach before they can move on. Learners may take a quiz or an exam and get feedback on gaps in their understanding. They can do additional activities to address those gaps and retake the assessment. Figure 4-4 depicts the difference between a traditional approach (time-based progression) and a mastery-based approach.
Digital assessments can play an important role in making a mastery-based approach possible. For example, Section 4.3 identified the following strengths that can make mastery-based assessments more practical.
Real-time Feedback/Feedback Automation — in a mastery approach, getting quick diagnostic feedback on assessments allows learners to seek remediation immediately while the problems are fresh in their minds.
Multiple Attempts — a mastery approach depends on learners being able to take assessments multiple times. This typically means having a bank of items that measure the same knowledge or skill. Digital tools enable each attempt to draw different questions from the question bank, ensuring that when learners are making another attempt, they aren't just learning the answers to the previous assessment items.
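The question-bank behavior described above amounts to a fresh random draw per attempt. A minimal sketch, assuming a flat pool of interchangeable items (real LMS tools add pools per objective, weighting, and attempt tracking; `build_attempt` and `item_bank` are hypothetical names):

```python
import random

# A bank of interchangeable items that measure the same knowledge or skill.
item_bank = [f"Q{i}" for i in range(1, 21)]

def build_attempt(bank, n_items, seed=None):
    """Draw a fresh random sample of items for one attempt."""
    rng = random.Random(seed)
    return rng.sample(bank, n_items)

attempt_1 = build_attempt(item_bank, 5, seed=1)
attempt_2 = build_attempt(item_bank, 5, seed=2)
# Different seeds model different attempts: a retake typically sees
# different items, so it measures the skill, not memory of the last form.
print(attempt_1)
print(attempt_2)
```

Because every item in the bank targets the same outcome, any sampled subset is a valid form of the assessment, which is what makes unlimited retakes defensible.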
Competency-Based Assessments are becoming more common in education. These assessments are like the mastery-based approaches described previously; however, a core idea behind many competency-based approaches is that learners can demonstrate their competency through authentic experiences that they have had outside of the classroom and sometimes before enrolling in the course. Below are a few examples:
Watch Video 4-5 to understand how competency-based education compares to traditional education.
What to look for: Three ways competency-based assessment is more equitable than traditional assessments.
Relying more heavily on authentic, competency-based assessments as part of your course grading is another way to strengthen learning and simultaneously deter cheating. While it is not impossible for learners to cheat in authentic assessments, the design and nature of these assessments make it significantly more difficult to do so. Below are some ways that authentic assessments promote academic honesty.
Contextual relevance: Authentic assessments are designed to reflect real-world scenarios and challenges. The questions and tasks presented in these assessments require learners to apply their knowledge and skills in practical ways, often in a unique real-world context. Cheating through memorization or using external resources becomes less effective because the questions often demand higher-order thinking and require the ability to analyze and synthesize information in specific contexts.
Complex problem-solving: Authentic assessments often involve complex problems that cannot be easily solved by simply looking up information. These assessments focus on assessing a learner's ability to think critically, make reasoned judgments, and develop creative solutions. Cheating becomes more difficult because the tasks typically require a deep understanding of the subject matter and the application of acquired knowledge to novel situations.
Performance-based tasks: Authentic assessments often involve performance-based tasks such as presentations, demonstrations, or projects. These assessments require learners to showcase their skills and abilities in a tangible and often personal way, making cheating more difficult.
Assessment process: Authentic assessments often involve a process rather than a single final product. For instance, a research project may require learners to develop a research question, gather and analyze data, and present their findings. The assessment process itself provides opportunities for instructors to observe and evaluate a learner's progress, making it harder for cheating to go unnoticed.
Assessment variety: Authentic assessments can take various forms, including group work, field experiences, internships, and simulations. These diverse assessment methods make it more challenging to cheat, as the evaluation is not limited to traditional test formats. The authentic nature of the assessments reduces the predictability of cheating methods and requires learners to demonstrate their skills in different contexts. Figure 4-5 shows examples of competency-based assessments.
This is an example of learners creating a series of commercials promoting a local business for a viewer contest, as part of a local TV station's self-promotion initiative.
Learners were enrolled in the Radio, Television, and Broadcast News diploma at the Southern Alberta Institute of Technology (SAIT).
Learners were tasked to find a local business or not-for-profit.
Working with this business, they needed to write a script that their client approved, and then direct the shoot, editing, and production of two commercials (30-second and 15-second versions).
The client could freely use the commercials wherever they wanted, and the learner could keep them for their demo reel (like a resume for the video industry).
The examples were made in 2016. These commercials aired on CHAT TV in Medicine Hat, Alberta, for about two months.
Primetime 30-second spot
Primetime 15-second spot
Exam proctoring is the process of monitoring learners as they take an exam. Exam proctoring can happen in person or remotely using digital technologies. Typically, the monitoring is done for high-stakes, summative assessments to discourage academic dishonesty and ensure that the exam conditions set up by the instructor are enforced.
In-person Proctoring. In the in-person classroom, proctoring is often done by the instructor who walks around the classroom observing learners to ensure that cheating does not happen and that learner questions during the exam are addressed (see Figure 4-6). Some institutions have a separate testing center where learners can take exams overseen by human proctors outside of class time.
BYU-Idaho Testing Center
Digital Proctoring. Digital Proctoring involves using technology to monitor learners and ensure desired test conditions are met while they take an exam (see Figure 4-5). Some common features of digital proctoring software include:
Plagiarism is when a person presents someone else's ideas or work as their own without adequate acknowledgement. Some examples of plagiarism include:
Plagiarism detectors are software programs that help identify instances of similarity in written content. The software generates a report identifying places in a submitted text that are similar to other work in the database. Figure 4-7 shows an example of plagiarism software.
Instructors and learners can then use the report to determine if plagiarism may have happened. In this example, 10% of this paper is similar to other sources. Clicking on the match in the report links to a side-by-side comparison of the texts.
It is important for instructors and administrators to know that it is unrealistic to expect a 0% similarity score in a report. The program can only detect how similar the paper is to other sources. For example:
The instructor must review each report for accuracy before accusing a learner of plagiarism.
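Similarity reports of this kind rest on overlap measures between texts. The toy sketch below uses a naive word-trigram overlap purely for illustration; commercial detectors use far more sophisticated matching against much larger databases:

```python
def trigrams(text):
    """Set of consecutive 3-word sequences, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def similarity(submission, source):
    """Fraction of the submission's trigrams that also appear in the source."""
    sub, src = trigrams(submission), trigrams(source)
    return len(sub & src) / len(sub) if sub else 0.0

original = "the quick brown fox jumps over the lazy dog near the river"
paper = "my essay notes that the quick brown fox jumps over the fence"
print(f"{similarity(paper, original):.0%}")  # 50%
```

Note that this toy score, like a real one, says nothing about whether the overlap is properly quoted and cited; that judgment still belongs to the instructor reviewing the report.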
Plagiarism detectors can be used to encourage academic honesty through plagiarism prevention and detection strategies.
Sometimes learners commit intentional plagiarism. They purposefully use work from other people and claim it as their own. Informing learners that their writing will be processed by a plagiarism checking program may be enough to deter them. It is a best practice to inform learners in advance that the assignments will be checked, show them the program that will be used, and explain the generated report. Once they are informed of the instructor's expectations, they may be more likely to act with academic integrity.
However, it is more common for learners to commit unintentional plagiarism because of carelessness or ignorance. Learners usually think they have made sufficient changes to the borrowed information, or they think they have given sufficient acknowledgement to the original author, when they have done neither of those things.
Allowing learners to upload drafts to a plagiarism detector and teaching them how to use the report to revise their writing can be a very beneficial learning experience. The learners can see where they have insufficient or inappropriate quoting, paraphrasing, or summarizing, and learn how to do it better in the future. The detectors do not scan for citations or references, so the instructor will need to teach the learners how to review their own writing to verify adequate in-text citations and a correct reference list.
Generative AI, such as ChatGPT, can be a helpful tool for brainstorming and drafting. Many institutions are still trying to develop policies around its use to help learners know when it is and is not appropriate.
Blueprint Challenge Part 1 Assessment Brainstorming Table - Third Column.
Open your assessment plan document. Read the directions for Part 1 Assessment Brainstorming Table.
Review the example in the third column. Then fill in the third column with your academic integrity plan.
Chapter 4 has helped you think through aligning your assessments with your learning outcomes and introduced you to types of assessment, strengths and limitations of the different types, and ways to address academic integrity. You have completed the foundational work and can now create your digital assessment plan.
Open your assessment plan document.
Read the directions for Part 3 Digital Assessment Plan.
Review the example. Then fill in the table with your ideas.
In Chapter 2, we started talking about your syllabus. Eventually information about assessments will be added to your syllabus. There will be a section in your syllabus about major course tests and projects in which you will include brief summaries of the major tests, such as midterm and final, and major projects, such as a choice board assignment. You will also include a course calendar with assessments listed.
And finally:
Step 4 Create a Digital Assessment Plan
Open your course checklist. Read the directions for Step 4 Create a Digital Assessment Plan. Then complete the assignment with your ideas. Check the completion box.
This content is provided to you freely by EdTech Books.
Access it online or download it at https://edtechbooks.org/he_blended/chapter_4_assessment_plan.