The present chapter offers an in-depth exploration of students' experiences in an online learning environment developed in response to the COVID-19 pandemic. Specifically, the study entailed the creation of a comprehensive online English Writing course tailored for 20 undergraduate students at a Turkish state university that transitioned to fully or partially online education formats in March 2020. The design process encompassed both micro-level and macro-level perspectives, with meticulous attention to crucial factors such as academic knowledge, technology skills, cultural values and socioeconomic status. Through a carefully sequenced progression of eight learning steps, the study explored various dimensions including learning needs, autonomy, content knowledge, learning awareness, and self-assessment and self-reflection skills. Drawing upon data gathered from surveys and student writings, the findings demonstrated that an emergent learning design process effectively facilitated and enhanced students' writing skills through scaffolding implemented at each relevant step. Furthermore, the findings underscored the significance of adopting a responsive design approach and incorporating diverse forms of interaction in online learning environments. The chapter concludes with an illustrative diagram outlining a responsive approach comprising four key stages: pre-design, implementation, evaluation, and redesign considerations. Additionally, the chapter imparts pedagogical implications derived from activities that foster interaction among students, their instructor, and the learning context.
What I learned from teaching during the COVID-19 pandemic, particularly between March and June 2020 when teaching abruptly moved online, later helped me think about and develop the course design for the English writing course discussed in this chapter. Based on my research experiences in Turkey and the United Kingdom (UK), as well as what I learned from international research and colleagues through scientific publications and social network websites, I came to believe that learning should be designed as a response to a learner's needs with the aim of reaching learning outcomes. In this chapter, I reflect on the design of an English writing course and explain how my background in learning design and the context in which I taught inspired my pedagogical practices.
During my doctoral study at a UK university, I worked with Laurillard’s (2012) Conversational Framework to understand how learners interacted with resources that had been designed to provide scaffolding for autonomous learning (Meri-Yilan, 2017). This framework, with its focus on interaction, has underpinned many of the design choices in this case and shaped how I approach course design. In later years, when I returned to teach in Turkey, I faced several issues relating to the lack of suitable, free and accessible resources.
The institution in which I was teaching was relatively young and located in the eastern part of Turkey where students mostly came from families with low socioeconomic status and education. That said, many students were very keen on improving their English using all available resources. To assist them, I initiated a language lab in a self-access learning centre equipped with comfortable desks, tables and desktop computers in a quiet setting which might not be otherwise available to many students in their homes. Having ensured that they could access suitable study rooms with the necessary equipment, I made available free resources and programmes which could facilitate their language learning.
Responsive learning design and interaction
In this chapter, I borrow the concept of “responsive design” from web development to describe an emergent, student-centred learning design process. Responsive design in web development is related to creating “websites capable of adapting layout, content, and appearance to optimise user experience across devices of varying sizes and capabilities including smartphones, tablets, and widescreen computers” (Gardner, 2011, p. 13). Marcotte (2011) notes that responsive design should not be confused with simply designing for mobiles or desktops: “rather, it is about adopting a more flexible, device-agnostic approach to designing for the web” (p. 1). Despite the wide spectrum of technology available for different purposes, predicting user contexts remains challenging for designers (Gardner, 2011), because users bring their own cultural and social paradigms to online environments (Sparke, 2009).
The integration of the notion of responsive design into learning design can centre students and their characteristics in a learning setting where they can interact with fellow students and others in their learning context. In web development, interaction is commonly understood in relation to human-to-human and human-to-computer engagements (Hornbæk & Oulasvirta, 2017), and the devices used in online learning environments vary from desktop computers to tablets, laptops and mobile phones. Additionally, Nebeling and Norrie (2013) remind us of the diversification of interaction types: learner-learner, learner-instructor and learner-content (Moore, 1996). Each interaction type carries conceptual benefits and practical implications “when determining which media [or a digital device] to employ” (Sharp & Huett, 2006, pp. 3-4). Learner-learner interaction is associated with the sharing of information between peers and the provision of mutual feedback. While learner-instructor interaction involves both teachers and students, the instructor plays a crucial role in increasing motivation and interest, planning student learning and encouraging each student. Learner-content interaction relates to improving learner knowledge through engagement with the subject or content.
Several studies have examined the importance of interaction and interaction types in online learning design (Demir Kaymak & Horzum, 2013; Farrah & Jabari, 2020; Hirumi, 2002; Liu & Kaye, 2016; Sharp & Huett, 2006; Wright, 2014). Warfvinge et al. (2021) indicate that a learning environment characterised by different teaching strategies can improve student learning. They add that “[i]f we want to achieve strong readiness to tackle the challenges of teaching and learning in an uncertain future, we should be strategic about choosing, designing, and deploying tools that can do more than one thing” (Warfvinge et al., 2021, p. 15). Hirumi (2002) suggests designing and organising e-learning interactions through a three-level framework for deciding which interactions to use in the design of an online learning setting. Level 1 includes metacognitive and cognitive practices to facilitate learners' self-regulation. Level 2 consists of all three types of interaction, namely learner-learner, learner-instructor and learner-content. Level 3 contains learner-instruction interactions, in which instructors or programme designers offer an approach to design Level 2 and stimulate Level 1. Furthermore, Sharp and Huett (2006) note that there is no research establishing which interaction type is most crucial in online settings, but that online learning can clearly be enhanced through learner-learner interaction. Demir Kaymak and Horzum (2013) state that readiness for online learning is crucial for learner-learning environment interaction. Moreover, Wright (2014) focuses on the importance of learner-teacher interaction “to establish and maintain social presence in online learning” (p. 1). This kind of presence includes awareness and perceptions of others through interaction (Short et al., 1976; Walther, 1992). However, the design and affordances of online settings affect this presence and interaction (Akcaoglu & Lee, 2016).
A study by Farrah and Jabari (2020) explored student interaction in online courses during the COVID-19 outbreak. The study indicated that the major reasons behind the lack of interaction lay in the fact that students did not see online learning as a serious and “real” mode of learning. Additionally, technical difficulties such as poor internet stability negatively impacted concentration, and teachers’ pedagogic choices influenced students’ motivation, participation and interaction. For these reasons, Farrah and Jabari (2020) suggest that instructors employ a range of engagement strategies, including games for entertainment, quizzes, audience response questions and discussions for formative assessment, PowerPoint for visual aids and bonus marks for extrinsic motivation.
This chapter aims to investigate students’ online learning during the COVID-19 pandemic by describing the implementation of a learning design based on a responsive design approach and utilising different interaction types. The emergent learning design process took place in Turkey, where in March 2020 all educational activities moved abruptly online. Even though teachers and students had little experience with online environments, they were expected to handle this new mode of education. However, there is limited research on how students can manage their learning in such a disrupted learning environment.
Application of responsive design
Lippman’s (2010) definition of a responsive designer underpins the emergent process that I detail in the following section.
- To describe, examine and analyse learners’ needs: I used my background knowledge, attended conferences and followed Twitter conversations about interaction in teaching and designing online lessons during the pandemic. For example, the pedagogy of care was one of the issues I realised I had not considered in designing my courses. Pedagogy of care is defined as “a teaching practice based on reciprocity” (Obuaku-Igwe, p. 88) through which “teachers care [sic] their students’ learning concerning their emotions and morals” (Meri-Yilan, 2022, p. 180). I was fortunate to be able to discuss this topic with a colleague working at a UK university, which helped broaden my insights into learners’ needs. The most insightful part of this interaction was learning that colleagues in different countries were applying the pedagogy of care in their teaching and that it was working well for both students and teachers.
- To consider the possibilities of what may occur in online learning: I listed issues I had not come across before the pandemic but faced in my teaching in the period of March-June 2020. For example, I noted that easy access to software through different devices was challenging both for my students and me. Additionally, I developed Figure 1 below to capture factors affecting the online learning environments I was engaged in.
- To be aware of, evaluate and compare the recent research on teaching during the pandemic with my own design: I presented one aspect of my design at an international symposium (Meri-Yilan, 2020) where I received other scholars’ views on it. Based on this feedback, I integrated rubrics for each writing assignment as well as an automated feedback tool.
- To promote a design that was aimed at being congruent with the students’ needs: In this instance, I created an online classroom (not officially required by my institution) which I thought would create an informal atmosphere where students and I could freely express our thoughts. On my side, I posted the aims of each designed activity and shared resources and programmes; on their side, students commented and posted their views, challenges and benefits.
- To recognise that every resource or programme is context specific: Each resource was tied to the intended learning outcomes, and each learning step was designed for the context (for details on the steps, see Table 1). The steps were framed based on what I gained from the above-stated aspects of responsive design: considering what may occur in online learning, researching previous studies in this regard and promoting a design based on the students’ learning needs. While doing so, I referred to the work of Mor and Abdu (2018) concerning the importance of reflection on a teacher’s perception of design principles and pedagogical practices. I also discussed each step with more experienced colleagues who have been teaching in the field for many years.
Figure 1 presents a visualisation of the factors that were considered in the design of the writing course presented in the virtual classroom: academic knowledge, technology skills, cultural values and socioeconomic status.
Context of the design: Turkish students in the liminal space
This learning design process focused on undergraduate students taking an intensive English writing course before beginning their bachelor’s degrees. This preparatory course followed a textbook published by Cambridge University Press, which also provided a learning management system (LMS) for online activities that could be accessed with a digital device and an internet connection. All students had the same lecturer.
At the beginning of the 2020 academic year, at the point when the learning design was being reconceptualised to respond to the pandemic, I asked students if they wanted to co-design or co-create their learning experience with me. All of them agreed. Subsequently, 10 students stated that they had commitments such as working and looking after their elders and were unsure whether they would be able to participate in all the activities. They were therefore excluded from the study. In the end, 20 of the 30 students (10 females and 10 males), aged 18-26, took part in the learning design process.
The 20 students had similar academic knowledge: an A2 level of English, based on an exam prepared according to the Common European Framework of Reference for Languages administered at the beginning of the academic year. They also had a common goal – to pass the preparation class exams and begin their first class in the Department of Translation and Interpretation. This made them keen to carry out the required activities rather than skipping them. However, they differed in technology skills, socioeconomic status and cultural values; this diversity shaped the course design. The virtual classroom designed through this approach constructed a liminal space in which the design aimed to move the students’ understanding of their writing from novice to expert level (Wood, 2012).
Learning design from the individual (micro-level) perspective
Although all students had similar English levels, multiple factors, presented in Figure 1, were crucial to the design. For example, some students were taking evening preparatory classes. In Turkey, most tertiary-level students prefer evening courses so that they can take up employment during the day. When the students were asked about their reasons for taking evening courses, half of them said they worked during the day; the other half stated that they had not been able to enter daytime courses because of their exam results. When asked whether they had ever thought of working while studying during the academic year, everyone said that they would most probably be doing so if the pandemic had not required them to study from home. Additionally, one student reported that she had skipped one university course because of financial issues. Being able to study and work simultaneously is important for students to be able to fund their studies; however, during the pandemic, work opportunities were limited.
Studies have shown that socioeconomic status may affect technology use which in turn can result in a digital divide (Harris et al., 2017). For example, five students indicated that they did not have a computer, laptop or tablet, or had shared access to such devices. Thus they could not attend all virtual classes or submit assignments because they had to share the device with siblings who also had to attend online classes. Despite their low socioeconomic status, half the group indicated that they were familiar with using digital devices and had been doing so for more than five years. This did not, however, yield reliable insight into their technological proficiency since the device they were referring to could have been a smartphone. None of the students had prior experience with online learning.
The students came from different parts of Turkey, which meant they had different cultural values. For example, female attendance is impacted by religious, financial and gender-related factors (Tunç, 2009), with Selim and Balyaner (2019) suggesting that the exclusion of females from physical classrooms is a reason to include different online resources. Online learning could play a vital role in enabling educational access for female students in some contexts. However, students did not make direct reference to the impact of cultural values on their education.
Learning design from institutional and global (macro-level) perspectives
There were two reasons for incorporating macro-level institutional and global perspectives in the design. From an institutional perspective, there is a distinct gap in educational equality between institutions located in the east and west of Turkey. The emergent learning design process occurred in a state university located in the east that had not used an LMS before the pandemic. Subsequently, this institution adopted an LMS called Uzaktan Eğitim Platformu (UZEP), which had been developed by a state university located in the relatively better-resourced western part of Turkey. This meant that the university had limited capacity to create an online environment for its students and could not adapt the LMS to cater to students’ requirements. UZEP’s primary functions include the delivery of online teaching, the sharing of uploaded materials and the examination of students. It does not enable the creation of an effective learning environment in which students can collaborate, receive different types of feedback and upload assignments.
From a global perspective, the pivot to emergency remote teaching arising from the outbreak of the COVID-19 pandemic resulted in some students experiencing significant challenges with online learning. These challenges ranged from a lack of access to online classes and shared resources, owing to shortages of internet access and digital devices, to a failure on the part of educators and institutions to implement a pedagogy-driven approach. The latter had not been a priority for Turkish educational institutions, many of which had not implemented online or blended learning before the pandemic.
Steps of the learning design
The steps of the learning design were developed in consultation with the students and with due consideration of their learning needs. Each step was informed by a particular aim. Table 1 summarises each of the aims and the corresponding steps in the learning design process as well as the sequencing, frequency and duration of each step.
Table 1: Implemented steps with aims, duration and timing

1. Participant survey. Aim: to understand students’ learning needs and their backgrounds.
2. Email exchange. Aim: to foster learner-instructor interaction and build trust. Frequency: every 4 weeks.
3. Assignment writing. Aim: to improve student writing. Frequency: throughout the year, implemented from the beginning to the end of the year.
4. Creating rubrics. Aim: to facilitate self-assessment skills. Frequency: throughout the year, implemented from the beginning to the end of the year.
5. Personal blogging. Aim: to foster individual online writing based on user choices and interests, and the practice of autonomy. Frequency: every 4 weeks.
6. Integrating a writing tool for automatic grades and feedback. Aim: to make students aware of their language levels and provide feedback from the tool. Frequency: at least once, starting from week 18.
7. Creating short YouTube videos. Aim: to remind students of key points in writing an essay. Implemented in weeks 24 and 28.
8. Self-report writing. Aim: to empower students with self-reflection skills.
Objectivity can be challenging when evaluating writing assignments. However, it is crucial, as students need both unbiased and varied perspectives and feedback on their writing. To this end, I discussed my design with two experts.
The following describes the process through which the students experienced the writing course as a liminal space. The notion of liminal space is invoked here to underscore the importance of recognising and embracing transitional states, from classroom learning to entirely online learning, as valuable opportunities for development, change, autonomy and self-discovery.
Step 1: Understanding the characteristics of online learners and learning
Although basic preparation in designing a learning space typically starts with technical considerations regarding systems and services, the characteristics of learners and their learning needs should also be considered (Bennett, 2007). In this emergent learning design process, a survey was first implemented through a Google form to obtain basic information about the students such as their gender, age, learning aims, technology use and language level. The survey showed that all students had the same academic knowledge and learning aim – to improve their English for academic writing purposes, but differed in technology use, socioeconomic status and cultural values. Responses to the question: “What is your cultural view of learning online?” displayed considerable variation. Data from the survey indicated that the learning design approach should accommodate the use of a range of devices including smartphones, laptops, computers and tablets. Therefore, I decided to design and test the course for multiple devices.
Step 2: Building trust between learners and the instructor online
The survey provided insight into students’ characteristics and learning aims. However, I needed more information about their learning interests and had to establish the trust required for students to see this as a trustworthy learning space. Trust was an unavoidable concern, as it is “a requisite component of interaction between learner and instructor for the learner to maximise their learning” (Wang, 2014, p. 346). The learner may otherwise come to think that they are wasting their time and money and drop the class (Wang, 2014). To help the students overcome this kind of mistrust, I asked them to send me an email introducing themselves with details about anything relating to their personal lives (e.g. their hometown, language level, hobbies). For them to get to know me, I responded to their emails according to the topic they raised. Students nominated the topic by deciding what to write about, but I, the lecturer, had the role of sustaining their interest in it. For example, one student narrated her trip to Italy; in response, I shared details of my visit there and asked which cities she liked the most. In this way, I exchanged at least four emails with her. The frequency of the email exchange typically depended on the student’s reply. Only five students (three males and two females) sent emails back and forth just twice; the others did so at least four times. This led me to consider including other individual and collaborative writing assignments in the course.
Step 3: Integrating writing tasks for individual development and collaboration
To improve their writing skills, students prepared a writing task at the end of each of the 16 units in the coursebook. Each unit prepared students for the writing task by providing reading and writing samples, different writing techniques and types, and information on grammar and vocabulary. Students submitted their writing tasks within a one-week deadline through Google Classroom, as the UZEP LMS did not have this functionality. Another major reason for choosing Google Classroom was that students could submit tasks as Google Docs, which enabled the rapid giving and receiving of feedback. After the one-week deadline, I gave feedback online during the class; if a participant could not attend, they could watch the recording afterwards. Since some students might be shy, I regularly asked whether anyone did not want me to show their writing on the screen. Although some students said yes to this question at the beginning of the year, over time they grew accustomed to this feedback mode and consented to their writing being shown online. To understand this process and their feelings, I used the polling system embedded in UZEP that shows who says yes or no. I also asked them via a Google form whether they found this feedback useful in developing their writing skills, and they all replied positively. As Tangpermpoon (2008) suggested in previous research, this approach facilitated the improvement of their writing and their knowledge of the topics covered in each unit.
In addition to the 16 individual writing tasks, I gave the students a writing assignment on one of the unit topics every eight weeks. They chose a partner with whom to complete the assignment and were asked to give their opinions on peer collaboration via a Google form. Based on their answers, only two groups agreed that peer collaboration allowed them to see their mistakes and improve their writing, a finding that runs counter to Bhowmik et al.’s (2018) claim that peer collaboration can help students gain more awareness and understanding of their writing. The other groups stated that it was challenging to communicate with their partners in an online setting, which is in line with a study by Andrew (2019). It follows that each student should have the chance to evaluate their own work when no peer is available online; this propelled me to integrate self-assessment into the design.
Step 4: Designing rubrics for self-assessment
When designing the fourth step, I researched how to improve students’ self-assessment skills. According to the literature (Ghaffar et al., 2020; Panadero & Romero, 2014; Ragupathi & Lee, 2020; Wang, 2017), rubrics can set out the scores and criteria for a task, enable students to see their progress and help them calculate their scores. Therefore, I created a rubric for each writing task. As “scoring rubrics focus on the product” (Ragupathi & Lee, 2020, p. 75), I did not include scores but instead used checklists with yes/no questions to motivate students to focus on the process. Some of the questions were: “Did you write an introduction?” and “Did you use nouns and adjectives for the environment (e.g. ecosystem or harmful)?” These checklists were delivered during the online class when the writing tasks were assigned. As the class was recorded, students could watch it again to review the lists. At the end of the year, they were asked through a Google form whether they found the rubrics useful, and all answered “yes”. Afterwards, one participant stated in an email that the checklists had helped him improve his assignments in other courses. This indicates that rubrics can potentially act as self-assessment tools and promote students’ self-assessment skills. Although previous research (Esfandiari & Myford, 2013; Movahedi & Aghajanzadeh Kiasi, 2021) has found that a teacher assessor is preferred and effective in increasing writing scores, this step aligns with Andrade et al. (2010), who believe that differences in individual-based scores can be “meaningful in practice” (p. 208). Responding to the students’ need to assess their learning on their own, this step led me to develop their autonomous learning further by integrating blogging into the design.
Step 5: Devising personal blogging for the promotion of autonomous learning
Following the week six lecture on how to write a paragraph, extensive writing through a blogging approach was implemented since this method allows students to “write on a wide range of interesting topics of interest to the authors, at their own pace, for various audiences and free from teachers’ corrections and judgement” (Sun, 2010, p. 328).
Consequently, the students were asked to write eight tasks in four weeks through a private blogging platform. They were free to choose any topic and to write at any time and for any audience. The only requirements were to share the blog posts with me via Google Classroom by the deadline and to write the posts using the different paragraph types taught in class. In grading their blog posts, I scored their use of grammar, vocabulary, punctuation, spelling, fluency and organisation over time. I did not ask them whether they found their scores appropriate. A few students expressed positive beliefs about the accuracy of the scores, but this could not be considered reliable information, as I was their lecturer at the time. However, I asked three evaluation questions through a Google form. In response to the first question, all the students stated that they enjoyed blogging. In answer to the second, 19 students indicated that blogging promoted their autonomous learning (i.e. self-study, self-assessment, finding resources on their own, etc.). In response to the final question, some of them elaborated on how blogging aided their autonomous learning. Five stated that blogging helped them to pace their learning on their own, while six indicated that it motivated them to write more. Furthermore, eight students said that it enhanced their critical thinking and research skills, probably because they had to choose topics on their own and find related vocabulary and arguments by themselves. One participant disagreed with the others, as he was very ill during the one-month blogging period and could not write the entries on time. Although I offered him an extended deadline, he did not take it up because he had other homework. As shown, extensive writing through the blogging approach worked for most of the students.
Teachers should nevertheless be prepared for unexpected issues such as this one, as well as for other factors relating to the class context.
Moreover, as blog entries were longer than emails, students appeared to need fresh eyes to check their writing for grammatical accuracy. Considering this need, I searched for an online tool that could help students in this regard. Fortunately, I attended an online session run by the University of Cambridge in which a tool called Write & Improve was introduced; I explain this tool in more detail in the next step.
Step 6: Integrating a writing tool for automatic grades and feedback
Having helped the students become accustomed to self-assessment, I introduced them to Cambridge Write & Improve, a free writing tool that provides automated feedback and grades. This tool was chosen because it gives instant feedback and scores on students’ writing (Tursina et al., 2021). To avoid its main limitation, namely distracting students from writing without the tool, they were told to use it at least once but not multiple times (Grimes & Warschauer, 2010). In a survey administered via a Google form, they all stated that they favoured this tool and found it effective for seeing their mistakes and improving their writing. Despite the immediate feedback from the tool, students might still need human assistance. This step helped me find a response to this need: scaffolding each student, as described in the next step.
Step 7: Creating short reminder videos
Towards the end of the year, I realised that some students still struggled with writing a proper essay. To address this, I prepared a survey asking what problems they encountered when watching the recorded lectures. They indicated that, lacking a personal space and device, they had to share their devices and study space and so missed crucial points in the lecture. Although the problems ranged from device sharing to the distraction of studying from home, the underlying issue was the length of the lectures, which according to university regulations had to be at least 30 minutes. As it is hard to maintain motivation and attention for a long time (Bolliger et al., 2010), videos shorter than ten minutes were created and shared with the students. They responded through a Google form that they were more satisfied with the short videos because of easy access and less distraction, as was the case in a study by Hsin and Cigas (2013). When asked whether it would be more advantageous if the videos were hosted on a platform they could easily access, they answered yes, and it was agreed that YouTube would work best. Short videos were thus created to function as easily accessible reminders.
Step 8: Assigning a self-report writing task
Referring to Zimmerman’s (2008) self-regulated learning (SRL) strategies, I focused on developing students’ planning, monitoring and reflection skills step by step. Echoing Ericsson’s (2002) point about the importance of being a performer in a learning environment through SRL strategies, Lam (2018) defines reflection as being about:
active monitoring and reviewing of the entire portfolio development process, diagnosing the strengths and weaknesses of different aspects of drafts of writing and displaying outstanding work to represent the best performance, selected from a host of artefacts kept in students’ portfolios. (p. 221)
According to Lam’s definition, reflection covers everything from the beginning to the end of the learning process. In my intervention, I narrowed the focus to a one-paragraph assignment but asked students to reflect on their gains in a broader sense. The assignment was a self-report writing task named My Choice, My Voice, submitted as a single paper. The task entailed:
- Finding the first post they sent to me in the email exchange.
- Revising the post and rewriting the revised version.
- Expressing their beliefs about their progress in writing through the implemented learning design.
- Stating their views on how they had made that progress.
Students focused on revising one post in line with the knowledge they had developed throughout the year. Based on their answers to each aspect, I was able to understand how much progress the students had made and what they thought of the learning design and their self-learning. These insights are presented in the following section.
Design insights from the students’ perspectives
Students’ progress: Differences between first and last posts
The comparison between the first and last posts showed that students had made positive progress throughout the year. While their initial posts consisted of three to five sentences, their last posts were longer. In terms of linguistic features, the latter posts contained fewer grammatical and mechanical errors and more complex structures. While a few students wrote only a corrected version of their first post, others added new information about themselves, which may suggest that they became more open to sharing and trusted their teacher more. Additionally, most of them used more formal, academic English in the last post. For example, in the initial email, students were more colloquial, using “Hello teacher” as the salutation and “Love” as the closing, whereas “Dear Mrs Meri-Yilan” and “Yours sincerely” were used in the revised post. Their vocabulary also changed from “I want” to “I would like”. This indicates that they could write more formally by the end of the course.
Students’ views on the design and their self-learning
In the last part of the task, the students were asked to write about their progress in writing. All students stated that they had made positive progress in writing, attributing this to writing practice, feedback from the teacher, the coursebook, recorded videos, rubrics and peer collaboration. Regarding practice, some indicated that the variety of writing topics and tasks enabled them to write “even about a new topic” “confidently” and “without being too nervous”. Ten students articulated the benefit of feedback in this design. One student wrote: “The more one writes and makes mistakes, the more they can develop”. Another participant stated: “We're all afraid of doing wrong when we start a new hobby or endeavour, but I've learned that making mistakes is the most important belief that will lift us up, not humiliate us.” The coursebook and recorded videos were regarded as opportunities to revisit any topic they had not understood or had missed in an online class. The checklists in the rubrics were also seen as a “facilitator” for reviewing and evaluating their writing. Additionally, two students mentioned that peer collaboration activities helped them understand their mistakes.
As for self-learning, the students described their self-learning process as a “motivated” and “encouraged” learning path. Fifteen of the students experienced this path as learning the basics of writing and then transferring these basic skills from writing a paragraph to writing an essay. Five of them also mentioned using additional resources that they found through the research skills gained in the writing class.
Design insights from the learning designer’s perspective
This emergent learning design process has taught me valuable lessons that align with previous studies. It confirms the significance of implementing a responsive design approach and incorporating different interaction types in online learning. What I find particularly interesting is the newfound understanding I have gained through this process. It has become clear to me that establishing a trusting environment between the teacher and students is crucial, especially in online settings where students may never have met their instructor. Additionally, I have discovered that students greatly appreciate having diverse ways of accessing information, which may even be essential in certain cases. They also demonstrate a strong preference for engaging in a variety of learning activities that go beyond individual writing. Activities such as blogging, collaboration, research skills, self-assessment and self-regulation have proven to be highly valued by students. I have come to appreciate that these activities serve as effective scaffolds, aiding students in identifying and understanding their mistakes. By engaging in such activities, students can interact with their teachers, peers and the learning context, thus promoting their progress and growth in academic writing, which was the primary focus of this course. Undoubtedly, these positive aspects have significantly contributed to the success of the designed course.
Overall, this chapter demonstrates that despite the challenges posed by online learning during the pandemic, most students appreciated the activities designed to enhance their English writing skills. This was achieved through the design of the writing course, which considered three key factors: (1) Lippman's (2010) suggestions on responsive design; (2) perspectives on learning design from micro and macro levels and from the standpoint of the designer and/or teacher; and (3) designing steps to develop a response in a liminal space. Figure 2 depicts how these three factors are interconnected and shows four design stages: pre-design, implementation, evaluation and redesign considerations.
During the pre-design stage, I incorporated Lippman's (2010) recommendations on responsive design, enabling me to address the unique needs of learners from micro-level perspectives, and institutional and global needs from macro-level perspectives. This holistic approach was instrumental in creating a comprehensive learning experience. I was able to refine my expertise in online learning design through academic collaboration.
During the implementation phase, I meticulously integrated a wide array of activities into the course structure, empowering students to become active, independent and collaborative learners. The provision of timely feedback, automated grading, and clear assessment guidelines and rubrics was instrumental in fostering their growth and improvement. Additionally, utilising short videos proved to be an effective means of facilitating easy and efficient access to key topics.
In the evaluation stage, I conducted a comprehensive analysis of students' progress by comparing their initial and final writing while also considering their self-reports on their learning journey and self-directed learning. This thorough assessment provided me with invaluable insights, enabling me to identify areas for potential redesign and improvement, which are taken up in the next stage.
Finally, in the redesign stage, I recognised some limiting factors and offered recommendations. As the designer, researcher and lecturer, I recognise that my own biases and beliefs may have inadvertently influenced the design and analysis. Despite seeking external input from scholars and colleagues, it is crucial for me to continually strive for objectivity and impartiality. Another gap is the extent to which I could count on my students to contribute to the co-design process. Although the surveys let them express their voices, I was not able to hear their voices individually in certain cases because of the heavy workload during the pandemic and the number of students in the course. The integration of the students into the learning design was also affected by their culture: Turkish students are usually compliant and want to pass their classes, so they might avoid expressing their “real” feelings, which raises the question of how much their voices were “heard”. This issue stems partly from the dominance of Turkish culture in education (Kosker & Ozgen, 2018). However, this was beyond the scope of this chapter and could be an area for further research.
Based on my design experience and the limiting factors identified, I recommend that in future studies, a teacher and/or course designer should create a reliable and flexible online learning environment that students can follow in their learning process. It is also critical to consider differences in students' social, cultural and academic backgrounds when developing the learning design. However, this emergent learning design has its limitations, as its context was restricted to an English academic writing course at a Turkish higher education institution, and the sample size was small. To determine whether there is a difference in students' outcomes, future studies can include more participants and construct control and treatment groups.
References

Akcaoglu, M., & Lee, E. (2016). Increasing social presence in online learning through small group discussions. International Review of Research in Open and Distributed Learning, 17(3), 1–17. https://doi.org/10.19173/irrodl.v17i3.2293
Andrade, H. L., Du, Y., & Mycek, K. (2010). Rubric‐referenced self‐assessment and middle school students’ writing. Assessment in Education: Principles, Policy & Practice, 17(2), 199–214. https://doi.org/10.1080/09695941003696172
Andrew, M. (2019). Collaborating online with four different Google apps: Benefits to learning and usefulness for future work. Journal of Asia TEFL, 16(4), 1268–1288. http://dx.doi.org/10.18823/asiatefl.2019.16.4.13.1268
Bennett, S. (2007). First questions for designing higher education learning spaces. Journal of Academic Librarianship, 33(1), 14–26. https://doi.org/10.1016/j.acalib.2006.08.015
Bhowmik, S. K., Hilman, B., & Roy, S. (2018). Peer collaborative writing in the EAP classroom: Insights from a Canadian postsecondary context. TESOL Journal, 10(2), 1–16. https://doi.org/10.1002/tesj.393
Bolliger, D. U., Supanakorn, S., & Boggs, C. (2010). Impact of podcasting on student motivation in the online learning environment. Computers & Education, 55(2), 714–722. https://doi.org/10.1016/j.compedu.2010.03.004
Demir Kaymak, Z., & Horzum, M. B. (2013). Relationship between online learning readiness and structure and interaction of online learning students. Educational Sciences: Theory and Practice, 13(3), 1792–1797. https://doi.org/10.12738/estp.2013.3.1580
Ericsson, K. A. (2002). Attaining excellence through deliberate practice: Insights from the study of expert performance. In M. Ferrari (Ed.), The pursuit of excellence in education (pp. 21–55). Lawrence Erlbaum Associates Publishers.
Esfandiari, R., & Myford, C. M. (2013). Severity differences among self-assessors, peer-assessors, and teacher assessors rating EFL essays. Assessing Writing, 18(2), 111–131. https://doi.org/10.1016/j.asw.2012.12.002
Farrah, M. A., & Jabari, S. M. G. (2020). Interaction in online learning environment during Covid-19: Factors behind lack of interaction and ideas for promoting it. Bulletin of Advanced English Studies, 5(2), 47–56. https://doi.org/10.31559/BAES2020.5.2.3
Gardner, B. S. (2011). Responsive web design: Enriching the user experience. Sigma Journal: Inside the Digital Ecosystem, 11(1), 13–19. https://studylib.net/doc/8225889/responsive-web-design--enriching-the-user-experience
Ghaffar, M. A., Khairallah, M., & Salloum, S. (2020). Co-constructed rubrics and assessment for learning: The impact on middle school students’ attitudes and writing skills. Assessing Writing, 45, 1–15. https://doi.org/10.1016/j.asw.2020.100468
Grimes, D., & Warschauer, M. (2010). Utility in a fallible tool: A multi-site case study of automated writing evaluation. Journal of Technology, Learning and Assessment, 8(6), 1–44. http://escholarship.bc.edu/jtla/
Harris, C., Straker, L., & Pollock, C. (2017). A socioeconomic related digital divide exists in how, not if, young people use computers. PLOS ONE, 12(3), 1–13. https://doi.org/10.1371/journal.pone.0175011
Hirumi, A. (2002). A framework for analyzing, designing, and sequencing planned eLearning interactions. Quarterly Review of Distance Education, 3(2), 141–160. https://eric.ed.gov/?id=EJ654229
Hornbæk, K., & Oulasvirta, A. (2017, May 02). What is interaction? [Conference session]. 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, United States. https://doi.org/10.1145/3025453.3025765
Hsin, W. J., & Cigas, J. (2013). Short videos improve student learning in online education. Journal of Computing Sciences in Colleges, 28(5), 253–259. https://www.researchgate.net/publication/262329183_Short_videos_improve_student_learning_in_online_education
Kosker, N., & Ozgen, N. (2018). Multiculturality concept and its reflections on education: The case of Turkey. Review of International Geographical Education Online, 8(3), 571–600. http://www.rigeo.org/vol8no3/Number3winter/RIGEO-V8-N3-9.pdf
Lam, R. (2018). Promoting self-reflection in writing: A showcase portfolio approach. In A. Burns & J. Siegel (Eds.), International perspectives on teaching the four skills in ELT (pp. 219–231). Palgrave Macmillan.
Laurillard, D. (2012). Teaching as a design science: Building pedagogical patterns for learning and technology. Routledge.
Lippman, P. C. (2010). Evidence-based design of elementary and secondary schools: A responsive approach to creating learning environments. John Wiley & Sons.
Liu, J. C., & Kaye, E. R. (2016). Preparing online learning readiness with learner-content interaction: Design for scaffolding self-regulated learning. In Information Resources Management Association (Ed.), Blended learning: Concepts, methodologies, tools, and applications (pp. 586–614). IGI Global.
Marcotte, E. (2011, March 24). Toffee-nosed. https://ethanmarcotte.com/wrote/toffee-nosed/
Meri-Yilan, S. (2017). Take your time to find yourself!: An exploration of scaffolded autonomous elearning environments amongst international students in a UK university [Doctoral dissertation, University of Southampton]. https://eprints.soton.ac.uk/414201/
Meri-Yilan, S. (2020, October 15). Writing online dialogue journals and Penzu for student-teacher interaction and autonomous learning: A research plan [Symposium presentation]. Learning in Online Learning Settings [Online].
Meri-Yilan, S. (2022). (Re)considering motivational scaffolding: A mixed-method study on Turkish students’ perspectives on online learning before and during the pandemic. Open Praxis, 14(3), 179–189. https://doi.org/10.55982/openpraxis.14.3.466
Mor, Y., & Abdu, R. (2018). Responsive learning design: Epistemic fluency and generative pedagogical practices. British Journal of Educational Technology, 49(6), 1162–1173. https://doi.org/10.1111/bjet.12704
Movahedi, N., & Aghajanzadeh Kiasi, G. (2021). The effect of teacher vs. learner-assessment activities on the Iranian intermediate EFL learners’ writing ability. International Journal of Research in English Education, 6(1), 49–63. http://ijreeonline.com/article-1-467-en.html
Nebeling, M., & Norrie, M. C. (2013). Responsive design and development: Methods, technologies and current issues. In F. Daniel, P. Dolog & Q. Li (Eds.), Web engineering: ICWE 2013: Lecture notes in computer science (pp. 510–513). Springer. https://doi.org/10.1007/978-3-642-39200-9_47
Obuaku-Igwe, C. (2021). Teaching and re-imagining the role of medical sociology in South Africa during COVID-19: A reflection. In J. D. Herron (Ed.), Strategies for student support during a global crisis (pp. 71–88). IGI Global.
Panadero, E., & Romero, M. (2014). To rubric or not to rubric? The effects of self-assessment on self-regulation, performance and self-efficacy. Assessment in Education: Principles, Policy & Practice, 21(2), 133–148. https://doi.org/10.1080/0969594X.2013.877872
Ragupathi, K., & Lee, A. (2020). Beyond fairness and consistency in grading: The role of rubrics in higher education. In C. Sanger & N. W. Gleason (Eds.), Diversity and inclusion in global higher education (pp. 73–95). Palgrave Macmillan.
Selim, S., & Balyaner, İ. (2019). Investigation of the factors determining the number of information technology products owned by households: A count data model. Mehmet Akif Ersoy University Journal of Social Sciences Institute, 9(22), 428–454. https://doi.org/10.20875/makusobed.296800
Sharp, J. H., & Huett, J. B. (2006). Importance of learner-learner interaction in distance education. Information Systems Education Journal, 4(46), 1–10. http://isedj.org/4/46/
Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. John Wiley & Sons.
Sparke, P. (2009). The genius of design. Quadrille.
Sun, Y. C. (2010). Extensive writing in foreign‐language classrooms: A blogging approach. Innovations in Education and Teaching International, 47(3), 327–339. https://doi.org/10.1080/14703297.2010.498184
Tangpermpoon, T. (2008). Integrated approaches to improve students' writing skills for English major students. ABAC Journal, 28(2), 1–9. http://www.assumptionjournal.au.edu/index.php/abacjournal/article/view/539
Tunç, A. İ. (2009). Reasons why girls do not go to school: Van sample. Yüzüncü Yıl University Journal of Education, 6(1), 237–269.
Tursina, P., Susanty, H., & Efendi, Z. (2021). Teacher corrective feedback vs Cambridge English write and improve (CEWI) in improving EFL students’ writing performance. English Language Study and Teaching, 2(1), 30–44. https://doi.org/10.32672/elaste.v2i1.3337
Walther, J. B. (1992). Interpersonal effects in computer-mediated interaction: A relational perspective. Communication Research, 19(1), 52–90. http://dx.doi.org/10.1177/009365092019001003
Wang, W. (2017). Using rubrics in student self-assessment: Student perceptions in the English as a foreign language writing context. Assessment & Evaluation in Higher Education, 42(8), 1280–1292. https://doi.org/10.1080/02602938.2016.1261993
Wang, Y. D. (2014). Building student trust in online learning environments. Distance Education, 35(3), 345–359. https://doi.org/10.1080/01587919.2015.955267
Warfvinge, P., Löfgreen, J., Andersson, K., Roxå, T., & Åkerman, C. (2021). The rapid transition from campus to online teaching – how are students’ perception of learning experiences affected? European Journal of Engineering Education, 47(2), 1–19. https://doi.org/10.1080/03043797.2021.1942794
Wood, P. (2012). Blogs as liminal space: Student teachers at the threshold. Technology, Pedagogy and Education, 21(1), 85–99. https://doi.org/10.1080/1475939X.2012.659885
Wright, R. D. (Ed.). (2014). Student-teacher interaction in online learning environments. IGI Global.
Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166–183. https://doi.org/10.3102/0002831207312909