Personalizing Feedback and Developing Learner Agency Using Kami

DOI: 10.59668/1269.15644
This paper presents the schoolwide implementation and evaluation of an online social annotation (SA) tool, Kami, used to provide personalized feedback that deepens learning and develops learner agency. A mixed-methods study found that providing personalized video feedback using Kami could enhance learners' perceived individual learning, collaborative learning, learner agency, and deep learning. A strong correlation existed between learner agency and collaborative learning using Kami. Teachers' overall adoption rate and perceptions of Kami provided evidence for the success of the implementation and professional development strategies based on the diffusion of innovation theory. The paper offers a glimpse of effective pedagogy using SA tools in a blended learning environment, strategies for technology professional development for large-scale implementation, and an evidence-based approach to evaluating an implementation outcome.

Introduction

The COVID-19 pandemic intensified the need to strengthen online pedagogies and interactivity. The School of Science and Technology, Singapore (SST) actively implemented an online social annotation (SA) tool, Kami, and piloted various professional development (PD) and instructional initiatives. This paper presents the schoolwide implementation strategies and the outcome evaluation findings.

Social Annotation Tools

SA tools allow users to comment, highlight, collaborate online, and receive notifications. Novak et al. (2011) found that SA tools enhanced learning outcomes, critical thinking, metacognition, comprehension, communication, collaboration, and learner emotions. The authors recommended providing adequate training, instructional support, and small groupings for learning activities. Glover and colleagues (2007) generated a list of essential and desirable features for SA tools, such as commenting, private annotations, browser compatibility, drawing, and collaboration. Chen and colleagues (2012) highlighted interactivity and intuitive notetaking as key factors fostering student-content interaction and predicting sustained use of SA tools. Kami offers most of the features these studies identified, plus unique ones such as video commenting, which lets users provide personal, in-depth feedback with less effort. SST therefore chose Kami as its SA tool for online instructional innovation; this study contributes to the literature by evaluating the impact of SA tools in the K-12 context, which has rarely been examined.

Technology Implementation

The implementation of Kami within SST followed an ABCD framework that addresses key concepts of the diffusion of innovation theory (DTI; Rogers, 2003). According to Rogers, an innovation-decision process involves the knowledge (information about the innovation), persuasion (formation of an attitude toward the innovation), decision (adoption or rejection), implementation (innovation in practice), and confirmation (affirmation of the decision) stages. Opinion leaders and change agents play pivotal roles in influencing innovation decisions. The rate of adoption depends on perceived innovativeness, or how early one is ready to adopt new technology, and the innovation attributes of the technology (relative advantage, compatibility, ease of use, trialability, and observability). The ABCD framework stands for advocacy by change agents, backing with support, capacity building for teachers, and demonstration by champions. SST completed a four-year implementation and evaluation cycle for Kami, spanning schoolwide subscription, department-based trials, subject-based instructional modeling, level-wide integration for learners, one-to-one coaching, and sharing at the cluster and national levels. In 2023, the Physics team implemented a year-long instructional intervention in two 9th-grade Physics classes, using Kami to provide personalized video feedback on daily homework, and evaluated its impact on learning.

The purpose of the study was to understand the perceptions of students and teachers regarding the use of Kami and provide insights into effective instructional strategies using SA tools. The following research questions guided the study:

  1. What are students’ perceptions about the impact of learning using Kami in terms of individual learning, collaborative learning, learner agency, and deep learning?
  2. What are students’ perceptions of teachers’ feedback using Kami?
  3. What are teachers’ perceptions about using Kami in terms of their adoption decision stage, perceived innovativeness, and perceived innovation attributes of Kami?

Methodology

The evaluation study embraced a pragmatic lens, using multiple data sources (Mertens, 2018) and a matching comparison group design (Henry, 2010) to estimate the intervention's effect. This section summarizes the key elements of the research design and methodology.

Context and Research Design

The study occurred in an independent Singapore school that actively pioneers technology integration initiatives. We employed a sequential mixed-methods approach (Creswell & Plano Clark, 2011), using quantitative survey data to understand students’ and teachers’ perceptions of the impact of learning using Kami, qualitative feedback to understand students' experiences with teacher feedback, and student work samples to corroborate the results.

Sampling and Participants

We followed a purposive sampling strategy (Kerrigan, 2014), selecting two 9th-grade Physics classes (n = 47) for the instructional intervention and the remaining 141 students as the comparison group for the evaluation. All students used Kami in regular ways, such as digital annotation and commenting. The intervention group additionally received personalized video feedback through Kami on daily homework. Overall, 120 students and 17 Science teachers completed the evaluation surveys after the intervention.

Instrumentation

We adapted the student survey items from various validated scales. The items for individual and collaborative learning originated from Zhao et al. (2018). Individual learning refers to how learners interact with the content, whereas collaborative learning is how they interact with others and the content. Perceived learner agency was measured in terms of self-regulated learning (Greene, 2015; Jiang et al., 2023), defined as an “active, constructive process whereby learners set goals for their learning and then attempt to monitor, regulate, and control their cognition, motivation, and behavior” (Pintrich, 2000). Deep learning involves integrating prior and new knowledge to refine understanding or create more complex knowledge (Greene, 2015); we adapted Greene's items for self-regulation and deep learning strategies. We constructed the qualitative questions based on Lim and colleagues' (2021) study of perceptions of teacher feedback. The student survey consisted of 18 items on a Likert scale of 1 (strongly disagree) to 7 (strongly agree) and four open-ended questions, such as “What did you like/dislike about this method of providing feedback using Kami?” For teachers' perceptions, we adapted 18 items on a similar scale from Celik et al. (2014) to evaluate their adoption decision stage, perceived innovativeness, and perceived innovation attributes of Kami based on DTI.

Data Collection and Analysis

Participants received the online surveys as part of the school's surveys after the instructional intervention. For the qualitative data, we followed systematic steps (Braun & Clarke, 2006) to conduct a priori and thematic coding independently before merging the findings. Further, we examined student work samples for evidence confirming or contradicting our conclusions.

Trustworthiness

To ensure trustworthiness, we checked the reliability of the subscales, which achieved Cronbach's alpha values above 0.90 for all constructs. The credibility of the qualitative findings was supported by 95% congruence between the codes generated by the researchers. We also engaged several reviewers to critique the evaluation and findings.
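Both checks are straightforward to reproduce. The sketch below is a minimal Python illustration of the two reliability measures, subscale Cronbach's alpha and intercoder percent agreement; the data, function names, and sample values are hypothetical, not the study's.

```python
# Hypothetical data throughout; only the formulas mirror the checks above.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one subscale (rows = respondents, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def percent_agreement(codes_a: pd.Series, codes_b: pd.Series) -> float:
    """Share of excerpts assigned the same code by both researchers."""
    return (codes_a == codes_b).mean()

subscale = pd.DataFrame({"q1": [6, 5, 7, 4], "q2": [6, 4, 7, 5], "q3": [5, 5, 6, 4]})
print(f"alpha = {cronbach_alpha(subscale):.2f}")
```

Under this design, alpha values above 0.90 for every subscale and 95% agreement between coders would correspond to the figures reported above.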

Findings and Discussion

The following sections show the descriptive and inferential statistics from the data analyses and discuss the combined findings.

Students’ Perceptions of the Impact of Learning Using Kami

Table 1 shows an average score of 4.3 out of 7 for 9th-grade Physics students’ perceptions of the impact of learning using Kami. The intervention group had a significantly higher average score for each construct (t > 3, p < 0.01). Therefore, providing personalized video feedback using Kami potentially enhances students’ perceived individual learning, collaborative learning, learner agency, and experience of deep learning. Learner agency correlated strongly (R² = 0.69, p < 0.01) with collaborative learning using Kami, reaffirming the importance of incorporating group work to help students acquire self-regulation skills.

Table 1

Summary of Descriptive and Independent Samples T-test Statistics
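As a companion to Table 1, the sketch below shows, on simulated data, how the group comparison and the agency-collaboration association could be computed in Python with SciPy; the group sizes and score distributions are illustrative assumptions, not the study's data.

```python
# Simulated stand-in for per-student construct means (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
intervention = rng.normal(4.8, 0.9, size=47)   # intervention group scores
comparison = rng.normal(4.1, 0.9, size=73)     # comparison group scores

# Independent-samples t-test, as reported per construct in Table 1
t_stat, p_value = stats.ttest_ind(intervention, comparison)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Pearson correlation between learner agency and collaborative learning;
# the reported R^2 = 0.69 implies a correlation of roughly r = 0.83
agency = rng.normal(4.5, 1.0, size=120)
collaborative = agency + rng.normal(0.0, 0.6, size=120)
r, p = stats.pearsonr(agency, collaborative)
print(f"r = {r:.2f}, R^2 = {r**2:.2f}, p = {p:.4f}")
```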

Students’ Perceptions of Personalized Feedback Using Kami

The qualitative student feedback confirmed the positive impact of personalized feedback on student learning, particularly in facilitating detailed feedback, revision, and understanding of complex concepts. As one learner shared, “I think the videos are very useful in my understanding. Outside of going through the questions during class, this allows me to revise and look back at the mistakes I have made, especially after a long time of no revision.” Table 2 illustrates the distribution of qualitative codes by the number of participants who mentioned each code at least once. Overall, 87.5% of the students perceived a positive impact of teachers’ feedback using Kami in at least one aspect, 77.8% perceived deep learning, 56.9% suggested increased learner agency, and 50.0% found learning more efficient with Kami feedback. One student explained that he appreciated the opportunity to review targeted feedback in Kami videos without having to go through every question slowly. Collectively, the results affirmed the positive impact of learning using Kami and explained the difference in perceptions between the comparison and intervention groups. Nonetheless, one-third of the students mentioned drawbacks of Kami, primarily difficulties typing equations and drawing without a stylus.

Table 2

Distribution of Qualitative Codes by Participants
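The distribution in Table 2 follows from a simple count over the coded excerpts. A minimal Python sketch with a hypothetical long-format coding table (one row per participant-code mention) is shown below; the code labels and counts are invented for illustration.

```python
# Hypothetical long-format coding table: one row per (student, code) mention.
import pandas as pd

coded = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s3", "s3"],
    "code": ["positive_impact", "deep_learning",
             "positive_impact", "learner_agency", "positive_impact"],
})

n_students = coded["student"].nunique()
# Share of participants mentioning each code at least once (as in Table 2)
share = coded.groupby("code")["student"].nunique() / n_students * 100
print(share.round(1).sort_values(ascending=False))
```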

Teachers’ Perceptions About the Use of Kami

The teacher survey results provided further evidence of the implementation's success. Kami usage reports indicate that 59.3% of teachers schoolwide use Kami actively. Among the science teachers, 94.1% had at least reached the decision stage of adopting Kami, and 76.5% had implemented it. For perceived innovativeness, 53.0% of the science teachers identified as innovators, early adopters, or early majority in adopting Kami, suggesting they are relatively early in embracing new technology. Table 3 shows the average innovation attribute scores by teachers’ decision stages, indicating higher average scores for teachers in the implementation or confirmation stage. Observability had the highest average score, reflecting the effectiveness of the PD strategies. The findings confirmed the applicability of Rogers’ (2003) DTI in guiding the implementation and evaluation of new technology, reiterating the criticality of sustained support throughout the implementation.

Table 3

Summary of Average Innovation Attribute Scores by Adoption Decision Stage
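Table 3's layout amounts to a group-wise mean over the attribute subscales. The sketch below illustrates that computation in Python on a hypothetical teacher survey frame; the stage labels, attribute columns, and scores are illustrative only, not the study's data.

```python
# Hypothetical teacher survey frame: decision stage plus attribute subscale means.
import pandas as pd

teachers = pd.DataFrame({
    "stage": ["decision", "implementation", "confirmation", "implementation"],
    "relative_advantage": [4.8, 5.6, 6.1, 5.4],
    "observability": [5.2, 6.0, 6.4, 5.9],
})

# Average innovation-attribute score for each adoption decision stage (as in Table 3)
table3 = teachers.groupby("stage").mean(numeric_only=True).round(2)
print(table3)
```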

Conclusion

The present study suggests that providing personalized video feedback potentially enhances perceived learning, learner agency, deep learning, and efficiency for learners. The correlation between learner agency and collaborative learning reinforces the importance of group activities in developing self-regulation. The high adoption rates of Kami provide evidence for the success of the DTI-based implementation strategies in a K-12 setting. We used the multiple strategies described above to ensure the trustworthiness of the findings, but the study has limitations, including the small sample of teacher participants, the lack of qualitative insights from teachers, and the use of self-reports rather than observations of actual classroom practices. Future studies with larger teacher samples from different subject teams and qualitative feedback will help validate the impact of learning using Kami and illuminate the challenges teachers face during implementation.

References

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Celik, I., Sahin, I., & Aydin, M. (2014). Reliability and validity study of the mobile learning adoption scale developed based on the diffusion of innovations theory. International Journal of Education in Mathematics, Science and Technology, 2, 300–316. https://files.eric.ed.gov/fulltext/EJ1059016.pdf

Chen, Y. C., Hwang, R. H., & Wang, C. Y. (2012). Development and evaluation of a Web 2.0 annotation system as a learning tool in an e-learning environment. Computers and Education, 58, 1094–1105. https://doi.org/10.1016/j.compedu.2011.12.017

Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research. SAGE.

Glover, I., Xu, Z., & Hardaker, G. (2007). Online annotation - Research and practices. Computers and Education, 49, 1308–1320. https://doi.org/10.1016/j.compedu.2006.02.006

Greene, B. (2015). Measuring cognitive engagement with self-report scales: Reflections from over 20 years of research. Educational Psychologist, 50, 14–30. https://doi.org/10.1080/00461520.2014.989230

Henry, G. T. (2010). Comparison group designs. In J. Wholey, H. Hatry, & K. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed., pp. 125–143). Jossey-Bass.

Jiang, D., Dahl, B., Chen, J., & Du, X. (2023). Engineering students’ perception of learner agency development in an intercultural PBL (problem- and project-based) team setting. IEEE Transactions on Education, 1–11. https://ieeexplore.ieee.org/abstract/document/10124721

Kerrigan, M. R. (2014). A framework for understanding community colleges’ organizational capacity for data use: A convergent parallel mixed methods study. Journal of Mixed Methods Research, 8, 341–362. https://doi.org/10.1177/1558689814523518

Lim, L. A., Dawson, S., Gašević, D., Joksimović, S., Pardo, A., Fudge, A., & Gentili, S. (2021). Students’ perceptions of, and emotional responses to, personalised learning analytics-based feedback: An exploratory study of four courses. Assessment & Evaluation in Higher Education, 46, 339–359. https://doi.org/10.1080/02602938.2020.1782831

Mertens, D. M. (2018). Mixed methods design in evaluation. SAGE.

Novak, E., Razzouk, R., & Johnson, T. E. (2011). The educational use of social annotation tools in higher education: A literature review. Internet and Higher Education, 15, 39–49. https://doi.org/10.1016/j.iheduc.2011.09.002

Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 451–502). Academic Press.

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.

Zhao, N., Gao, F., & Yang, D. (2018). Examining student learning and perceptions in social annotation-based translation activities. Interactive Learning Environments, 26, 958–969. https://doi.org/10.1080/10494820.2017.1421564
