LXD Webinar Series - Think-Aloud Methods: Just-in-Time & Systematic Methods to Improve Course Design
Andrea Gregg, Ronda Reid, Tugce Aldemir, Amy Garbrick, & Jennifer Gray
In this webinar, our guest speakers discussed methods and strategies to maximize the user experience within course designs. The panelists shared their experiences and offered recommendations for instructional designers.
[0:00:00] Matthew Schmidt: Okay, so we have started the recording, and it is my pleasure today to introduce the second in a series of webinars related to learning experience design for the Design and Development Division. My name is Matthew Schmidt and I'm the current president of the division. We, as a team of editors, released a book on learner and user experience research, an introduction for the field of learning and instructional design and technology, in 2020. Dr. Gregg and her colleagues included a chapter in that book, and today they're going to be giving a webinar that goes over that chapter. It's entitled "Think-Aloud Methods for Learning Experience Design: Just-in-Time and Systematic Methods to Improve Course Designs." So on that note, I'm going to pass the mic to our presenter, who can introduce her author team. It's all you.

[0:01:04] Andrea Gregg: Okay, thanks, Matt. Great to be here, and great to see so much interest in this topic we have been involved in for many years. Just quickly, to talk about who's here: myself, currently a director of online pedagogy; Ronda Reid, currently a project manager; Tugce Aldemir, currently a postdoc at UConn; Amy Garbrick, director of learning design; and Jennifer Gray, an instructional designer. The important thing, I think, especially given the division and AECT, is that we all either are or were instructional designers, so a lot of our passion and focus comes from always improving the learning and learner experience.
[0:02:00] Andrea Gregg: This is in the book, and you've probably seen it if this is an area you're interested in, but all of our work goes to looking at any tension between designer intention and actual user experience. That framing should hold for everything, whether we're talking about systematic or just-in-time approaches. Just a quick overview of what we're going to do: I'm going to talk really briefly about what we mean by systematic versus just-in-time, and briefly about how we got into this area and what prompted it. Then we'll define think-aloud observations; that is an area with a lot of theoretical and empirical research, and we are barely scratching the surface, but we want to give enough so you understand what informed our methods. Then we'll give two case examples, and their length in number of slides should definitely indicate which is systematic and which is just-in-time: the systematic one is about 40 slides and the just-in-time one is about four. And then we're going to end with some resources based on everything we're presenting today, if you want to do your own. So, really briefly, this is a continuum, and you will likely fall somewhere along it. We'll start with systematic. It's good when you have a high-stakes, large-scale project, when you have funding, time, and leadership involvement, and when you need to convince people with data. If you're working across your university or your college and people are disagreeing about how something should be done on a website or in a course design...
[0:03:54] Andrea Gregg: ...going this route gives you evidence that's more objective and grounded in your learners' perspectives rather than your own preferences, plus a desire to publish, present, and share broadly. Luckily, when we started this, we thought we should go ahead and get IRB approval and really do this right in a systematic sense, so that we could do more with it. It can require a large team with diverse skills; we were lucky to have research expertise, obviously instructional design expertise, and process and project management expertise, and all of that really helped. It also took multiple months to collect and analyze data. On the other end of the continuum is just-in-time, which is just as important. I really want to emphasize that we are not advocating systematic over just-in-time; it is all about what you're trying to do and what you have available to accomplish it. Whether you're on the left or the right side of this continuum, we're strong advocates of finding out what even a single person's actual experience of your website or course design is, to inform your adjustments. So when you have a single course or design to improve, limited to no time or funding, no dean to convince (your only job is to make it better), and no need or desire to publish or present on what you do, all it requires is you and someone else to navigate your course, and around 30 minutes: five minutes to find someone, ten minutes to watch them navigate while thinking aloud, and ten minutes to identify improvements. Of course, that number can go up.
[0:05:38] Andrea Gregg: But you can really tell the difference in each of these areas. Again, they're equally important, and it's likely you'll fall somewhere along the continuum depending on what you have and the project you're dealing with. So I'm going to hand it over to Ronda Reid, who I describe as the instigator of all of this, to talk about her passion and background and why she encouraged all of us to get into this more.

[0:06:16] Ronda Reid: Well, thank you very much, Andrea, and thank you to Matt and everybody here today. We're very excited to talk about a subject that is near and dear to my heart. At this point I want to talk a little more about what learner experience design is, why we think it's important, and how we got started with it. A little about my personal background: before I became an instructional designer, I was a web designer and developer, and for those of you who have any familiarity with that world, you know that user experience testing is not only pretty common but also considered best practice. So when I transitioned from a web design background to learning design, I started asking: why aren't we testing our online courses, since they're happening in the same online environment as website design? And I have to be honest with you, when I first started raising that question, people really balked at it. A lot of that, I think, could be summarized by the notion that in education we are educators, we're not trying to get people to purchase a widget, so there's a difference.
[0:07:24] Ronda Reid: And I definitely acknowledge that we are not coming from a commercial, consumer-based standpoint; we're coming at this for very noble and altruistic reasons. But I really did think that response was missing the point of what learner experience design testing can do, because at its heart, this kind of testing is looking at our learning environment and trying to make improvements to it so that the learning outcomes we intend can happen. When we think about it, it's actually not too different from what we do in a residential or physical environment. We carry out the same kind of practice there, but it's a little more subconscious, and we're not doing formalized testing or even just-in-time testing. So think about the classroom: if I walk into a classroom as an educator, I'm going to survey my environment. Are the windows open, and would that distract my students as I'm teaching? What's the chalkboard situation, what presentation tools are available to me, what's my desk situation like? And as I'm teaching, do I need to change this environment to better support the learning outcomes that I want? For example, if I do have windows and it's a nice day, and I see students are going to look outside and maybe be distracted, I'm going to pull down the shades, because that would be a detractor from the learning objectives I want to achieve.
[0:08:54] Ronda Reid: The same goes for reconfiguring the desks for a certain assignment, to help facilitate a group discussion or a group project. As educators, we're making those design decisions on the fly, and that's really the same basis of what we're talking about with learner experience design testing: we go through a testing process in order to build a better-designed environment for our students to learn in. So that was the thinking behind why this kind of testing might be necessary, and I had an opportunity to push the topic when Penn State as a university was switching over to a new LMS. If there are any learning designers on the call who have gone through this process before, you have some familiarity with it: switching to a new LMS is no small undertaking. As learning designers at the time, what it really meant for us was that we decided to really look at what we were doing with our classes, to see how we could take advantage of this new learning environment and what tools were available to us, and to approach it with: let's design these courses from scratch. As we were doing that, I raised the question again: what if we start doing this kind of testing? We decided to try it in the college that Amy Garbrick and I were in, and we started to glean some really important information from it. As others around the university heard about it, they said, hey, we want to start doing this too, and that movement started to grow to include...
[0:10:27] Ronda Reid: ...many of the people on the call today, and even more who are part of our research group. So that's a little about how we came to this. Obviously, that was a while ago, but I will say that since that time my belief that we need to do this, and my sense of its importance, has only grown. Some of that has come from changes in my own life. One of those is that I'm no longer a learning designer; I moved back to IT project management. I can tell you right now, I work at Penn State central IT, where a lot of our conversations deal with this notion of customer experience. I know when I put the word "customer" out there to educators, it carries a negative, knee-jerk connotation, as in, we're not serving customers. But when we use that word in IT, we're talking about our faculty, students, and staff, and about how we can best serve those people to support what the university wants to do and to serve the university's educational mission. So seeing this in a technology environment, and seeing the importance of things like user experience testing and customer service experience there, has only fortified my thinking on this. Another change that's happened since 2015 is that I've been very fortunate to have the opportunity to be an online and residential instructor. Having classes of my own has been an incredible experience for me, but it's also been an opportunity to see how students are interacting with...
[0:12:05] Ronda Reid: ...some of these environments, which connects to an important one of our research goals: we wanted to reduce cognitive load that could impede student learning. Seeing that firsthand as an instructor really drove home the importance of taking our learning design environment into consideration to maximize the learning potential. And I will say, it's a terrible use case, but an important one for us to remember: with the abrupt shift to online learning during COVID, our students were dealing with so much at that time. It was a global pandemic; our whole lives were changing. When they turned to conduct their learning online, we wanted to reduce the cognitive load of that learning environment. We don't want them to strain themselves wondering, where's the syllabus, how do I accomplish X? We want to lessen that load. And even though, thankfully, we're coming out of the COVID environment, we don't know the full effect it will have on online learning; will more students gravitate toward online learning? Still, the principles are the same: we want to reduce cognitive load for our students and create better learning environments for them, so that they can achieve what we want them to achieve as educators. All right, just to sum up a little here, with this notion of user experience and learning experience...
[0:13:34] Ronda Reid: ...design, there are some similarities and some differences. As I mentioned before, in user experience testing, a lot of times the focal point will be: I am a designer, I'm going to develop a business website, and the ultimate goal of that website is to have the consumer buy a widget. If I don't design the site properly, if there's a misalignment between what I think is important as a designer and how the user actually uses the site, that's going to impact the bottom line, sales and finances. In an education setting we don't have those concerns, but we do have a bottom-line impact: as an instructional designer, if I don't design a course that really meets student needs, that could impact the learning outcomes, increase student frustration, and lead to less effective pedagogy, and those are all things we don't want to happen. So next I'm going to turn it over to Tugce, who's going to talk about think-aloud observations.

[0:14:40] Tugce Aldemir: Thanks so much, Ronda. Thank you all so much for coming; I'm very happy to be here as well. I will touch only very briefly on the think-aloud technique we implemented, because we want to focus on what we did and how we did it. In order to better understand whether the course design was intuitive...
[0:15:01] Tugce Aldemir: ...whether the guidance, navigation, and directions were clear, to identify points of confusion, and to examine common navigation behaviors, we implemented the think-aloud technique. In the very basic sense, the think-aloud technique consists of asking users to perform a specific task and to think aloud as they try to complete it. The process of verbalization reveals the assumptions, inferences, misconceptions, and problems that users face while solving problems or performing a certain task. It is a very common method, especially in the human-computer interaction field; for those of you who are familiar with that field, I'm sure you encounter this method quite often. It has even been hailed by some as the single most valuable usability method, because, as they further argue, it can help detect cognitive activities that may not otherwise be visible at all, and it enriches the basis for evaluating when and why users experience a problem. I see some great interest in the chat; you'd like to learn more about how it is done and what we did, and that's exactly what we're going to cover in this part of the session: what we did, how we did it, and why we did it. We will start with the systematic approach.

Thank you.

[0:16:29] Jennifer Gray: Hi, Amy and I are going to step you through the systematic process, which is the longest part of our presentation, and so we are going to go through it...
[0:16:38] Jennifer Gray: ...as quickly as we can. In this section of our presentation we're going to go through the whole thoughtful, systematic process that we found served us well. Once you decide you want to conduct a learning experience design study, we hope the steps we've captured will present you with a helpful guide to step you through your own. We broke down our process into three main categories: determine, pilot, and do. We will look at each category in more detail on the upcoming slides. We're going to be throwing a lot of information at you very quickly, but don't worry, we have resources already compiled for you that we're going to provide at the end, so don't feel like you have to take a lot of notes; we'll provide all the information later. First we want to look at the determine steps. When beginning any new endeavor, the best course of action is making some determinations at the onset, which will serve as your blueprint for the rest of the project. We strongly suggest front-loading the work: if you do a thorough job on the preparation, the actual conducting of the think-alouds will be much smoother and a more productive experience for everyone. So let's start with the determine section, which outlines the things to do beforehand. First, we want to determine what should be tested: determine the foundational elements that you want to test.
[0:18:04] Jennifer Gray: What is it that you want to test? Identify what your research focus is going to be, then identify the specific goals of the testing, which could be as simple as testing the instructions for an assignment or as complicated as an entire course, like ours. We were testing a full course because we were moving to a new LMS, but you can certainly just test sections of your course. In our case, the university was adopting a new LMS, and people from different colleges across the university came together to discuss how we could best present the content and set up navigation in the new system for our students. We placed real course content in a test space within the LMS to test the courses. Although we did a large project, the steps and the think-aloud method we used can easily be adapted for a smaller-scale project. Next, we want to think about what questions we're going to ask. We want our students to be thinking about the right things; we want them to focus on the content rather than on finding the content. So once you've identified your goals, write questions and task scenarios that will allow you to evaluate the areas of concern, such as your course navigation, your introduction, your directions, that type of thing. Determine what problems need to be solved and then write the questions. As an example on this slide, we wanted to see how students would connect with team members, so we asked the participants to demonstrate for us how they would locate them. When writing the questions, decide where in the course the participants should be located...
[0:19:51] Jennifer Gray: ...when they start the task. Do you want the students to be on the homepage, or at a different location within the course? Deciding that is going to be super helpful. We provided our participants with a printout of the questions, or the tasks, to guide them; here's an example of one. There was room for the students to write their answers, and we asked them to also speak aloud as they were working while we took notes. We also gave students the opportunity to rank the difficulty of each task when they were done: we gave them the task, they would write their answer, and then they would also rate it. Next we're going to look at who to test. Typically it's recommended to select five participants, and you want them to mimic the gender, age, and other demographics of your student population. It is preferred to use students enrolled in your courses when possible. In our case, the different college units differed slightly in who they tested, because they all recruited independently; as an example, the College of IST recruited all online students, distributed to match the student demographics by gender and age. The next step is to determine how your think-aloud information will be captured. At this step, you want to consider how you're going to capture the information, in particular what technology needs you will have.
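The per-task difficulty ratings described above are easy to tally once the sessions are done. The sketch below is illustrative only: the rating scale (1 = easy, 5 = hard), the task names, the scores, and the review threshold are hypothetical values, not data from the webinar.

```python
from statistics import mean

# Hypothetical ratings: for each task, the difficulty each of the
# five participants reported after attempting it (1 = easy, 5 = hard).
ratings = {
    "Locate the syllabus": [1, 2, 1, 1, 2],
    "Find a team member's contact info": [4, 5, 3, 4, 4],
    "Submit the first assignment": [2, 2, 3, 2, 1],
}

# A mean difficulty at or above this (assumed) threshold flags the
# task for a closer look at the recordings.
THRESHOLD = 3.0

for task, scores in ratings.items():
    avg = mean(scores)
    flag = "REVIEW" if avg >= THRESHOLD else "ok"
    print(f"{task}: mean difficulty {avg:.1f} ({flag})")
```

A flagged task only says where to look; the think-aloud comments in the recordings, not the numbers, explain why the task was hard.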
[0:21:25] Jennifer Gray: We recommend recording the session, because it's hard to collect all the information otherwise. Note-taking is another great tool, as is recording the audio, the video, and a screen capture if you can. We used Camtasia for much of our testing to capture the audio and video recordings, and we also used external cameras for backup. The next step is when and where. Now you need to decide when and where you're going to test. What will your schedule be? We used a one-week or a one-month window in which to conduct the think-alouds. The technology you decided on in the last step will also affect the location decision; make sure there's adequate space. For most of our testing we used a dedicated room, which allowed us to set up the equipment and leave it there between participants. We also determined the timeframes we wanted the testing to take place in. We did run into space issues in a smaller office; a larger space would have been better. Also, consider the noise level of the location: make sure there aren't classes letting out nearby, so that you're not distracting your participants and you can get usable audio recordings. Next, you want to think about necessary forms and funding. If you want to use your findings for research, you're also going to need to consider IRB approvals, permission forms, and funding.
[0:22:59] Jennifer Gray: We chose to go the IRB route, which meant taking additional steps, such as consent forms. On the next slide you'll see the consent form we used. We also used a financial signature sheet, because we provided a $50 gift card to each participant, and the financial department needed us to collect that form. And now Amy is going to step us through pilot and do.

[0:23:30] Amy Garbrick: Hi everyone. The next step in the process is called pilot: after we made our determinations, we piloted. Piloting is a very important part of the process. It provides the opportunity to rehearse the entire study and make sure it runs smoothly before you have real participants. Piloting also tests the tasks to ensure none of them are misleading or confusing, gives you more realistic estimates of time, and validates the data collection and the wording of the tasks; it is basically practicing. While this might seem like a duplication of the effort we put in before, we found it invaluable. In our case, everyone on our team reported that they found it very helpful to practice in the location, with the equipment, and to do a complete run-through, sometimes multiple times, of every step we would later take with an actual participant. We would strongly recommend that you do this to ensure more useful data and a better participant experience. We've ordered these pilot steps in a linear way, but they don't necessarily have to happen in that order.
[0:24:35] Amy Garbrick: The steps are: pilot your test questions, pilot the equipment and room setup, pilot getting participants to think aloud, and pilot the entire process. The first one is piloting the test questions. Work with someone who is perhaps not involved in the project to trial the questions: try them out, confirm the instructions are clear, and ensure the wording does not cause confusion for your students or participants. Don't assume that just because something makes sense to you, it will make sense to your participants. Here's an example: we had a question asking participants to find a particular assignment, but the name of the assignment was confusing because there was another assignment with a similar name. So make sure that the confusion the participant encounters (and there will be some confusion, as that's what you're measuring) is with the task you're testing, and not a problem with the wording of the question or the naming conventions in the scenario. One of the things we did was ask our student interns to run through the test questions and practice conducting the testing with them first. This allowed us to get comfortable with the process and identify any sticking points in the questions before the actual tests were conducted. The next step in pilot is to pilot the equipment and the room setup: we set up the hardware and software in a room and did a complete run-through to detect any problems.
[0:26:06] Amy Garbrick: And we had a backup plan. Some examples: start the test setup before the participant is actually in the room, leave enough time so that issues can be fixed, and have a backup plan with an alternate location, alternate software and hardware, and alternate people to help, so you're covering your bases. Next we show our equipment: a picture of what we had set up in our testing room. We had backup cameras on tripods (actually two of them, one front-facing and one back-facing), screen-capture software on the computer recording what the participants were doing, and a task sheet for the participants to work through. So there you can see our setup. The next step in pilot is getting participants to think aloud. This is a very important step in the process: your purpose is to capture preference and performance data simultaneously. You're going to model the think-aloud for the participants by physically and verbally doing exactly what someone would do as a participant, actually walking through it. Participants might feel awkward; it is awkward to talk aloud as you do things, since it's not something we do a lot, so practicing it yourself and demonstrating it to them is very important. We used the think-aloud tasks themselves for the demonstration. You don't want to influence the participants in what they do; you want to practice prompting people to keep talking without telling them what to do. They're going to have some struggles and some awkwardness, and that's part of the process.
0:27:45.870,0:27:51.300 Amy Garbrick: The next step in pilot is piloting the entire process, so from start to finish you want to practice. 0:27:51.870,0:28:01.320 Amy Garbrick: This will reveal areas where you need to improve the process and make changes based on what is revealed, and that will help the testing go more smoothly, more comfortably, 0:28:01.740,0:28:07.380 Amy Garbrick: and better for the participants. That practice also boosts the confidence level for you. 0:28:07.800,0:28:17.370 Amy Garbrick: As educators, we don't conduct UX studies on a regular basis, so practice helps us build confidence, providing a more relaxed environment for our participants. 0:28:17.970,0:28:23.760 Amy Garbrick: We did use things like checklists, so the next screen shows you an example of our testing checklist. 0:28:24.120,0:28:36.750 Amy Garbrick: We had it to make sure that we didn't miss any steps in setting up and going through the entire process. We also had a moderator script that we used with participants, so that would be exactly what we intended to say. 0:28:37.770,0:28:49.290 Amy Garbrick: So the next part is Do. Once you have determined and then you've piloted, the next step is Do, and so you actually do it with real participants and you conduct the testing. 0:28:50.460,0:28:57.210 Amy Garbrick: There may be some repetition between this part and the third part of the process, 0:28:58.230,0:29:08.160 Amy Garbrick: but that's okay; it comes from piloting and then conducting the actual testing. While we've ordered the steps in a linear way, 0:29:08.760,0:29:15.390 Amy Garbrick: some of them can be done at the same time, or you can shift the order slightly. So the steps are: recruit the participants and schedule,
0:29:15.780,0:29:28.650 Amy Garbrick: conduct the testing, capture the observations, analyze the data, and make improvements to your design. The first one is to recruit participants, so you actually want to recruit real users if possible. 0:29:29.880,0:29:39.810 Amy Garbrick: In our case, it was online students who would be using the new learning management system. And you want to think about scheduling, so perhaps schedule gaps in between 0:29:40.320,0:29:45.810 Amy Garbrick: your testing sessions. We sent an email to instructors of online classes to recruit our students. 0:29:46.710,0:29:57.600 Amy Garbrick: If online students, or the actual users, are not available in your case, you can use colleagues or other people to go through the testing, but ideally you want the real users going through your testing. 0:29:58.110,0:30:04.980 Amy Garbrick: In the recruitment email, we also directed potential participants to fill out a survey so we could collect additional data, 0:30:05.850,0:30:19.800 Amy Garbrick: and consider what types of data you want to use. The next slide shows our recruiting email, so this is how we explained what the study was. We also offered a $50 gift certificate to Amazon for their participation. 0:30:20.460,0:30:30.120 Amy Garbrick: That was part of the funding that we had to run this. So we explain things, and this gives you kind of an example or a template of what you want to use to recruit participants. 0:30:31.500,0:30:34.050 Amy Garbrick: The next step is to actually conduct 0:30:34.530,0:30:46.380 Amy Garbrick: the think-aloud observation. So you actually want to do a few things: you want to greet your participants, you want to make sure they feel welcome, you want to provide instructions for how to do the think-aloud
0:30:47.040,0:30:57.120 Amy Garbrick: and how long it will take, and then you want to model an example of how you do it. In our case, we had two team members in the room, one asking questions and one taking notes, 0:30:57.570,0:31:10.350 Amy Garbrick: and so that way we could make sure that we were capturing things. You want to make sure to follow your scripts and checklists, and be sure that they signed a consent form. I apologize for my dog barking. 0:31:11.910,0:31:15.150 Amy Garbrick: If the participant is confused, try to go over those steps again, 0:31:15.660,0:31:27.570 Amy Garbrick: and the modeling of the think-aloud is very important, because they're going to follow what you're doing. So if you put all your effort into planning the scenarios and tasks, then this should work smoothly. 0:31:29.010,0:31:31.440 Amy Garbrick: As far as the actual conducting goes, 0:31:32.580,0:31:40.740 Amy Garbrick: the next step is to capture observations during and after. So during that actual testing, you want to 0:31:41.310,0:31:49.830 Amy Garbrick: capture as much as you can: audio, video, live note-taking. And then immediately after, write in your notes so you don't forget 0:31:50.310,0:31:59.430 Amy Garbrick: what has happened, and you capture all that. Have a backup camera in case you've forgotten to hit the record button on something. 0:32:00.090,0:32:05.550 Amy Garbrick: Have backups to the recording in case of data loss, so you want to have all those kinds of backups, capturing notes in the moment. 0:32:07.380,0:32:18.600 Amy Garbrick: Next is an example of one of the tasks that we had the participants do, and underneath are pseudonyms of our participants and what they said.
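The capture step the presenters describe (live note-taking per task, written up immediately after) can be sketched as a tiny script. This is not part of the presenters' actual toolkit, just a minimal illustration, assuming a hypothetical `SessionLog` class and made-up task names:

```python
import time

class SessionLog:
    """Minimal timestamped note log for one think-aloud session.
    Hypothetical sketch, not the presenters' tooling."""

    def __init__(self, participant):
        self.participant = participant
        self.entries = []  # list of (elapsed_seconds, task, note)
        self._start = time.monotonic()

    def note(self, task, text):
        # Record a facilitator observation with elapsed session time.
        elapsed = time.monotonic() - self._start
        self.entries.append((round(elapsed, 1), task, text))

    def by_task(self, task):
        # Pull all notes for one task when writing up afterwards.
        return [n for (_, t, n) in self.entries if t == task]

log = SessionLog("P1")
log.note("find-assignment", "scrolled past Modules, hesitated at similar name")
log.note("find-assignment", "clicked the wrong assignment first")
print(len(log.by_task("find-assignment")))  # 2 notes for this task
```

Pairing something like this with the video recording and the participant's own task sheet gives the multiple data sources the presenters describe backing each other up.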
0:32:19.110,0:32:28.350 Amy Garbrick: So capturing what they said and being able to combine that and analyze it; these are examples of how we captured that information. 0:32:29.760,0:32:36.840 Amy Garbrick: We'll talk about analyzing the data next. So the next step is to analyze the data. You can do an informal analysis 0:32:37.650,0:32:46.500 Amy Garbrick: that doesn't take a lot of time, watching the recordings and looking over the notes, or you can do a formal analysis with quantitative and qualitative data analysis. 0:32:46.860,0:32:50.130 Amy Garbrick: We started with an informal one and went to formal. 0:32:50.490,0:33:00.630 Amy Garbrick: So we had multiple people in a small group on a research team who watched the recordings together, who read over the notes together, and then we talked about what the things were that we found. 0:33:01.020,0:33:09.090 Amy Garbrick: We also did a formal analysis, where we went through and did a deeper dive on all the data that we'd collected. 0:33:10.380,0:33:23.610 Amy Garbrick: So the next screen shows you some examples of the data that we collected. We had the videos that we could actually watch, and we had transcripts of the videos that we created, so we had that data as well. 0:33:25.530,0:33:32.370 Amy Garbrick: Next we have participant notes. This is the task sheet that the participants had; as they went through, it would give them a task, 0:33:32.760,0:33:43.800 Amy Garbrick: they would read it, and they put in notes. We have that data, and we also had all the facilitator notes during and after the actual sessions, so we had a lot of data that we could dig into. 0:33:45.840,0:33:57.300 Amy Garbrick: The next step is to actually make improvements to your design based on what you found. Improvements might be to navigation structures, naming conventions, terminology,
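Even the informal analysis step above can be made slightly more systematic with a quick tally of where participants struggled. A minimal sketch, assuming hypothetical participant names, task labels, and outcome codes (none of these come from the actual study):

```python
from collections import Counter

# Hypothetical observation records: (participant, task, outcome)
observations = [
    ("Pat", "find-assignment", "confused"),
    ("Sam", "find-assignment", "confused"),
    ("Lee", "find-assignment", "success"),
    ("Pat", "find-syllabus", "success"),
    ("Sam", "find-syllabus", "success"),
]

# Count how many participants were confused on each task.
confusion = Counter(task for _, task, outcome in observations
                    if outcome == "confused")

# The task with the most confusion is the first design-improvement candidate.
worst_task, count = confusion.most_common(1)[0]
print(worst_task, count)  # find-assignment 2
```

A tally like this points to which task (here, the confusingly named assignment) should drive the "make improvements" step; the recordings and notes then explain why.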
0:33:58.500,0:34:09.690 Amy Garbrick: or you may even just think of what small changes, you can make to improve your design so for us it was a standard homepage standard course navigation standard naming conventions in canvas. 0:34:10.800,0:34:23.970 Amy Garbrick: That we wanted to use and we wanted to minimize multiple interfaces for our students and have consistent design across kit courses and even across colleges or units that were offering those courses. 0:34:25.950,0:34:33.570 Amy Garbrick: The next slide shows an example of a task that was asking about a particular assignment. 0:34:33.960,0:34:41.610 Amy Garbrick: The students were confused participants were confused because that name funding scenario assignment was similar to another name. 0:34:41.880,0:34:47.880 Amy Garbrick: For a totally different assignment, in the course so finding those gotchas of We really need to name. 0:34:48.210,0:35:02.940 Amy Garbrick: Each assignment uniquely so students are not confused about the name was important for us to see so that's what this example is showing it could be so, this could be improved the design could be improved if there are neek distinct names for each assignment. 0:35:04.890,0:35:22.350 Amy Garbrick: And the last slide in this section on do talks about some of the improvements that we need for our design, so we found that students oriented to courses differently than we did as educators and designers they relied heavily on modules and canvas. 0:35:23.430,0:35:32.460 Amy Garbrick: But they also had kind of their own way to get there, but lodgepoles was a key that we found and they work with us by terminology discrepancies. 0:35:32.790,0:35:40.260 Amy Garbrick: Multiple interfaces they were confused by inconsistent content organization with of course they liked consistency and predictability. 0:35:40.620,0:35:45.960 Amy Garbrick: And they were confused by different designs across courses that took more of that cognitive load. 
0:35:46.230,0:35:59.760 Amy Garbrick: that Ronda mentioned earlier, to figure out where's the syllabus and what's this called when they were working on a course and looking through the course. So now the next section is the just-in-time process, and it's back to Andrea. 0:36:02.070,0:36:07.770 Andrea Gregg: Okay, great, thank you, and don't worry, we're going to have time for questions and discussion. 0:36:08.880,0:36:10.800 Andrea Gregg: I just wanted to very quickly 0:36:11.880,0:36:22.590 Andrea Gregg: talk about a project I'm doing right now, a very just-in-time version of everything that Jen and Amy just talked through. 0:36:24.750,0:36:30.720 Andrea Gregg: The bare minimum, if you wanted to get some data from a think-aloud, is: identify a course or website, 0:36:31.590,0:36:47.160 Andrea Gregg: determine at least three basic tasks a user or learner would complete, and find someone to review. I don't know if you guys are familiar with Steve Krug, Don't Make Me Think and Rocket Surgery Made Easy; he's a huge web usability guy, and he has 0:36:48.180,0:36:57.240 Andrea Gregg: a quote in his book that it is better to grab a random person off the street, sit them in a chair, and watch them try to do something on your website 0:36:57.630,0:37:10.920 Andrea Gregg: than to design a perfect study that never ends up getting conducted. So it's always better to have somebody try to use your site or course than to not do it because they don't match parameters, etc. 0:37:11.550,0:37:25.530 Andrea Gregg: Have the meeting, immediately follow up, capture observations, identify improvements, make improvements. So what I'm working on right now in my position: I created for the faculty in my department a 0:37:26.040,0:37:38.070 Andrea Gregg: site meant to support teaching and learning, and I wanted to get feedback. They are the users, and also the learners, of what's being put out there, so identifying the site was easy. 0:37:39.390,0:37:40.140 Andrea Gregg: I
0:37:41.340,0:37:54.450 Andrea Gregg: identified three tasks that they are likely to perform. So: new faculty members will be told the site has the resources for the department; open the site and talk through how you would use it. 0:37:56.100,0:38:05.490 Andrea Gregg: We have a lot of faculty that may have used a different LMS before, so: you know that Penn State uses Canvas; how would you find out more about it? 0:38:06.000,0:38:15.990 Andrea Gregg: And: you want to see past workshops that MEDLEY, which is the initiative I lead, has offered; how do you find them? So, very pragmatic. And 0:38:16.620,0:38:24.540 Andrea Gregg: find your people. Again, I was really lucky because I just emailed faculty directly, basically my friends, and asked would they do this for me. 0:38:24.870,0:38:34.290 Andrea Gregg: So they sort of felt like they couldn't say no, which was great, and I got four people. I've finished three; I have one or two to go, and I think I'll put another call out. But if I couldn't find any, 0:38:35.010,0:38:40.590 Andrea Gregg: I would just get somebody who could understand enough to give me some insight; I would ask 0:38:40.920,0:38:54.810 Andrea Gregg: a friend who wasn't in the department. It really is more important that someone tried to do something on your course or site than that they're the sort of perfect person. I scheduled a meeting for 45 minutes and told them not to prepare anything or do anything. 0:38:56.460,0:39:10.290 Andrea Gregg: And then in the meeting, we met in Zoom, I briefly explained think-aloud and why you do it, and demoed the think-aloud process: I picked a random website, I just went to ABC News, and 0:39:11.430,0:39:16.140 Andrea Gregg: imagined I was trying to find international news, and I actually thought aloud doing it.
0:39:17.220,0:39:36.000 Andrea Gregg: I turned on the Zoom recording, but I also took notes. I had them screen share so I could watch them move around, and then I had them do each task. What you'll find here, the same as you'll find in the systematic version, is you have to keep encouraging them to say what they're thinking. 0:39:37.110,0:39:45.060 Andrea Gregg: They don't want to, because they don't want to hurt your feelings, it's unnatural, etc. So you're just constantly saying, okay, tell me more. 0:39:45.900,0:39:57.480 Andrea Gregg: What are you looking for? What are you thinking when you see that? What do you think? Why did you pick there? Tell me why. So you're constantly doing that. That's probably the hardest part of the whole process, but the most important. 0:39:58.890,0:40:05.250 Andrea Gregg: And then immediately after, so I'm really going to emphasize two things; I think you've probably heard this a million times from all of us. 0:40:06.060,0:40:14.040 Andrea Gregg: Do it immediately after. I had three; the first one I didn't do immediately after, because I ran to another meeting, and the notes I got from her were less. 0:40:14.550,0:40:24.330 Andrea Gregg: For the other two, I scheduled time, typed up the notes immediately after, and have a lot more data from those. Now, I did record them, but I haven't watched the recordings yet. 0:40:26.460,0:40:38.910 Andrea Gregg: I may or may not, depending on time and my schedule. So that "immediately after" is crucial, and don't assume you'll remember it, because while you're watching you're thinking, oh, she clicked there, why didn't she click here? 0:40:40.290,0:40:43.710 Andrea Gregg: A week later, you may not remember that you had that observation. 0:40:45.750,0:40:51.570 Andrea Gregg: And then figure out what you're going to change. So I'm not done going through the data, but 0:40:52.770,0:40:59.520 Andrea Gregg: I did find some very interesting, easy things. In general, multiple people mentioned they liked the logo.
0:41:00.360,0:41:08.580 Andrea Gregg: When people landed, they just immediately started scrolling, and I had spent a lot of time learning how to do these links that go in-page. 0:41:09.330,0:41:22.890 Andrea Gregg: Mixed feelings on the images: one participant said the images are annoying and they don't do anything related; another said, if the images weren't here, I would have left the site by now, because I don't like to read; I'm in a mechanical engineering department. 0:41:24.210,0:41:40.530 Andrea Gregg: This, no one read; I had to ask, because they just scrolled right past. So even with very simple 45-minute meetings, I got a lot of really good feedback that will go into the improvements. 0:41:42.240,0:41:56.850 Andrea Gregg: And then just quickly I want to give you guys the resources again. All of our stuff is on the site, sites.psu.edu/canvasux; maybe one of our team can put it in the chat. 0:41:57.870,0:42:20.190 Andrea Gregg: And then this QR code will also take you directly to our companion site, which goes through each of the steps, and in some cases it gives you actual examples of what we used. So with all that said, I would like to stop things now and just do questions, discussion, etc. 0:42:22.800,0:42:24.390 Andrea Gregg: Whoa, whoa. 0:42:34.860,0:42:47.640 Andrea Gregg: Okay, Martha, that's a great question, and I think that is something that I will do right now, and I'm doing it on the fly; I haven't prepared this. 0:42:48.060,0:43:00.120 Andrea Gregg: So why don't you guys put in chat a site you want me to go to and give me a task, and I will do an impromptu think-aloud while I navigate, so you can see it real, and you know it's not, 0:43:01.650,0:43:03.600 Andrea Gregg: not planned. So let's do that first. 0:43:05.640,0:43:11.370 Andrea Gregg: So for those who actually want to see a live demo, just put in the chat a website you want me to go to.
0:43:22.080,0:43:30.420 Andrea Gregg: Super, thanks, Ahmed. So I'll do EdTech Books; then give me a task, someone, for what I should be trying to do on EdTech Books. 0:43:32.910,0:43:37.110 Ahmed Lachheb: Try to find a chapter there that talks about project management in design. 0:43:37.800,0:43:39.510 Ahmed Lachheb: Okay, perfect. So, 0:43:39.570,0:43:41.580 Andrea Gregg: let me get there and then I'll share my screen. 0:43:47.220,0:43:54.840 Andrea Gregg: Actually, I won't even get there before I share my screen; I'm going to show you guys from scratch what we're doing. 0:43:59.820,0:44:06.210 Andrea Gregg: Okay, so I'm going to type EdTech Books and go directly, hopefully, to that website. 0:44:08.160,0:44:09.720 Andrea Gregg: Taking a little while to load. 0:44:10.740,0:44:16.890 Andrea Gregg: And I typed in the wrong URL, so let me actually copy it from the chat. Oh, .org. 0:44:18.360,0:44:19.650 Ahmed Lachheb: Yeah, .org. 0:44:21.180,0:44:21.680 Andrea Gregg: Okay. 0:44:22.890,0:44:26.970 Andrea Gregg: So I opened the website and I'm immediately drawn 0:44:26.970,0:44:39.960 Andrea Gregg: to "free textbooks and journals." The very first thing I tend to do when I'm on a website and I have something in mind is I go to the search. I see that there's a search bar here, which is nice; I also see search up here. 0:44:40.410,0:44:48.840 Andrea Gregg: Those are two things I always look for. I just now noticed this graphic, which is interesting and curious; I'm not 0:44:50.070,0:44:57.660 Andrea Gregg: sure how I feel about it, given what I'm trying to find. But I know my task is project management and 0:44:59.130,0:45:08.400 Andrea Gregg: instructional design, but I'm also a little curious about what the books are here, because this is my field, so I might scroll and just look at some of these topics. 0:45:09.450,0:45:16.260 Andrea Gregg: I think they could draw me in, so I'm going to go away from that for now. I will just type in "project management and
0:45:16.710,0:45:28.110 Andrea Gregg: instructional design." I notice there's no enter or search button here, which is not a big deal, because I would normally just hit enter, but it is something I noticed wasn't there. 0:45:29.580,0:45:37.920 Andrea Gregg: And no search results were found. Okay, so that's a little bit of a bummer, but maybe the search was too complex, so I'm going to get rid of "and instructional design." 0:45:38.490,0:45:43.980 Andrea Gregg: And then it automatically... so now I see how the search bar works: I don't have to click enter, it just brings it up. 0:45:44.460,0:46:00.150 Andrea Gregg: So then I see the book that's exactly the topic I'm looking for. I would click on the book. I still don't know if I'm looking at a whole book, or if I'm looking at something that's going to eventually lead me to Amazon. 0:46:01.200,0:46:12.930 Andrea Gregg: But okay, I see Pressbooks, which I've heard of, and I know they're often completely open source; okay, Creative Commons, so now I'm thinking I might have access to a whole book. So that's 0:46:14.010,0:46:18.690 Andrea Gregg: that's what I would do, and that's how I would demo it to a student or a participant. 0:46:19.800,0:46:35.700 Andrea Gregg: And that, I think... so those are my two main points. Definitely do that demo; practice doing the demo. It's super awkward if you haven't done it; I'm only comfortable doing it because I've done it like 50 times. Practice it, and practice it in front of someone. And then 0:46:37.620,0:46:44.160 Andrea Gregg: write up your notes right after the meeting. If you learn nothing else, those are the two key things. So what other questions do folks have? 0:46:49.650,0:46:53.520 Andrea Gregg: Oh, thanks, Matt. Yep, that's perfect, and actually I think 0:46:54.150,0:46:56.670 Andrea Gregg: Steve Krug, Rocket Surgery Made Easy, 0:46:57.060,0:46:58.860 Andrea Gregg: I think that he has
0:47:00.540,0:47:14.610 Andrea Gregg: even some links to videos on the website for that book, maybe YouTube videos where he demos it; there was something in that book that was also super helpful, 0:47:15.810,0:47:16.680 Andrea Gregg: on the website. 0:47:18.240,0:47:20.310 Andrea Gregg: Yep, so Matt just put a link in. 0:47:21.690,0:47:24.900 Andrea Gregg: There are good examples out there. 0:47:25.980,0:47:33.060 Andrea Gregg: What you're really trying to get at is all that stuff that you think, sometimes not even consciously, 0:47:34.110,0:47:41.280 Andrea Gregg: that the designer is never going to be aware of. They're never going to be aware that you might wonder why there's not an enter button there, 0:47:42.780,0:47:50.370 Andrea Gregg: or that it's going to take you half a second to realize that it searches a little differently than you were expecting. 0:47:50.790,0:47:56.700 Andrea Gregg: So some of those things aren't a big deal: when all your users figure something out in half a second, not a big deal. 0:47:57.510,0:48:04.260 Andrea Gregg: But if it's something where they aren't ever clicking where you thought they would click: big deal. 0:48:04.650,0:48:15.420 Andrea Gregg: Now, actually, one thing I would add from a systematic perspective that we didn't do at the time, but if we had the expertise and we were doing all this again and had the funding and the team and all of that, 0:48:15.870,0:48:32.160 Andrea Gregg: I would probably integrate eye tracking, because that takes it a step further than what we did, and you can really dig into where people are looking and where you want to front-load the most important things. Thanks, Tugce. 0:48:33.870,0:48:36.900 Andrea Gregg: So what other questions for the rest? Yep. 0:48:38.970,0:48:43.140 Ahmed Lachheb: Thank you so much for the presentation. I'm curious if you
0:48:44.550,0:48:57.690 Ahmed Lachheb: care to make the distinction between an approach to design, a method of design, and a design technique, and do you think this distinction is useful in framing when we, 0:48:58.650,0:49:08.460 Ahmed Lachheb: educators, designers, design practitioners, want to advocate or prescribe something that other learning designers should do? And in the interest of full transparency, there might be a follow-up 0:49:09.510,0:49:10.920 Ahmed Lachheb: or two based on your answer. 0:49:11.490,0:49:14.880 Andrea Gregg: Okay, so say the three things again: methods of design... 0:49:15.150,0:49:27.570 Ahmed Lachheb: An approach to design, a method of design, and a design technique. Because throughout the presentation, I mean, the title of the presentation is think-aloud method, or methods of think-aloud, and then at some point you said think-aloud technique. 0:49:28.830,0:49:42.390 Ahmed Lachheb: Yeah. So do you think we should make those distinctions, first of all, between an approach to design, a method of design, and a design technique? I personally do, and I think it's helpful, but I'm curious to learn from your wisdom: do you think that distinction... 0:49:42.390,0:49:53.070 Andrea Gregg: Yes, that's a great question. So if you think back to the continuum on the systematic slide, we probably were not all the way at the left, because 0:49:53.610,0:50:13.110 Andrea Gregg: we were not interested in researching the validity of think-aloud as a practice; we took that for granted. And so, as you can see, our language is sort of casual; none of those differences were intentional, they were just there, for us, from a pragmatic perspective. 0:50:14.460,0:50:18.750 Andrea Gregg: I didn't think differently about method versus technique versus approach. 0:50:19.140,0:50:23.790 Andrea Gregg: So if we were doing formal research on think-aloud, and like...
0:50:24.180,0:50:33.540 Andrea Gregg: Tugce has done a ton of just lit review and research in this area, so it's a lot more complicated if you really want to dig into it. 0:50:33.870,0:50:46.080 Andrea Gregg: But our main goal was: how are our students navigating our courses, so that designers can quit arguing their preferences, and let's actually watch students do it. So yeah, that's a good question. 0:50:47.820,0:50:52.080 Ahmed Lachheb: So the follow-up that I have here, so I'm curious to hear more. 0:50:53.160,0:51:06.810 Ahmed Lachheb: First, I appreciate what you said, and you're right; right at the beginning you said we're not advocating for systematic over just-in-time, or just-in-time over systematic, it really depends. And the underlying meaning 0:51:08.070,0:51:20.430 Ahmed Lachheb: is that you emphasize the need for strong design judgment and expertise by the designer to choose and pick: when is it a systematic approach, and when to use a just-in-time approach, or somewhere in between. And I appreciate that. 0:51:22.110,0:51:34.200 Ahmed Lachheb: Yet I heard something from Ronda, and I'm curious to hear more from Ronda about what you said: based on your rich professional experience, situated in your professional organization, 0:51:34.620,0:51:42.840 Ahmed Lachheb: you have evidence of instructional designers not conducting usability testing, though they do design. What is the nature of that evidence, where is it coming from? 0:51:46.680,0:51:47.180 Ronda Reid: Well, I 0:51:47.250,0:51:52.200 Ronda Reid: would say that just from the beginnings, and we're going back to maybe 2013 here, 0:51:52.590,0:52:02.850 Ronda Reid: just raising the question of do we test, do we get feedback on what we're designing, at least at that time, was kind of unheard of 0:52:03.780,0:52:09.450 Ronda Reid: for the circles that we were operating in. I didn't go out and do a lot of
0:52:09.990,0:52:18.720 Ronda Reid: I guess academic-type research, but I can tell you anecdotally that when we started this, and we did take a more academic approach to 0:52:19.530,0:52:35.910 Ronda Reid: writing papers and doing things like that, at that time there didn't seem to be a ton of additional research that had been conducted. So we did feel at the time that this was something that was maybe just starting to get a little bit more notice; it wasn't done a lot in practice 0:52:37.230,0:52:46.560 Ronda Reid: and, you know, wasn't even being considered or talked about: how do we get that kind of feedback, any kind of feedback, from our students on what we've designed? 0:52:47.550,0:53:00.960 Andrea Gregg: I mean, I can say I worked in a unit with 20-plus designers, and no one ever did this. At most, we would roll our chair over to someone else and say, look at this, which do you like better. At most. So 0:53:02.040,0:53:21.240 Andrea Gregg: I think that, unless things have changed dramatically since I was a hands-on designer, it's practical: you're busy, you're trying to get feedback, you're working with faculty, you're creating these courses; it's just not something in your practice. Um, okay, so 0:53:21.870,0:53:24.930 Andrea Gregg: I like the question of 0:53:27.330,0:53:30.960 Andrea Gregg: what mistakes did we make, because I love talking about our mistakes. 0:53:32.100,0:53:48.720 Andrea Gregg: So who wants to share mistakes, what we would do differently? I think a lot of this presentation was based, not a lot, but there were definitely a few things, on do what we say now, not what we did. Because I think we did some testing before piloting; I think 0:53:49.860,0:53:57.000 Andrea Gregg: we did a dry run, and it was after some tests had been done, and people kind of had an eye-opening moment of, oh, we should have done that.
0:53:58.290,0:54:11.460 Andrea Gregg: One thing I should mention is while we were doing this we would debrief and say what worked what didn't, how do we improve it for the next run, so it was iterative in that sense as well, other mistakes people want to mention for merrill's question. 0:54:14.460,0:54:16.140 Tugce Aldemir: um maybe I can try me. 0:54:16.170,0:54:35.490 Tugce Aldemir: i'm sure, so one thing might be sure to pay extra attention is to moderators guess how moderator specific sessions like impacts, the data that you collect from tickets, you know, data collection sessions, so it might be helpful in the future that. 0:54:37.020,0:54:43.950 Tugce Aldemir: To set certain norms for moderators like, for example, not interviewing any processes are not explaining anything to do. 0:54:43.980,0:54:47.040 Tugce Aldemir: You know participants by out there, doing certain tasks. 0:54:47.190,0:54:57.630 Tugce Aldemir: That might be helpful to make those decisions before you collect data and then ensure that all moderators behaving the same main the data collection so. 0:54:58.260,0:55:14.130 Tugce Aldemir: In that way, that will not be any kind of much Asians in terms of data that you collect because sometimes people are guided a little bit further than other people in some sessions, so this chain this create some kind of fluctuations in the data that you. 0:55:15.300,0:55:29.250 Andrea Gregg: Know oh that's a great point and today is the one who analyzed all all of the data so she knew in great detail sort of how different facilitators work, work to because we could see that in the data. 0:55:31.080,0:55:37.590 Andrea Gregg: yeah so ideally facilitators would be pretty identical in their practices or you'd have one facilitator. 0:55:39.540,0:55:42.420 Andrea Gregg: So Max what approach, do you use to create. 0:55:42.420,0:55:45.030 Andrea Gregg: Good interface navigation designs. 
0:55:47.250,0:55:58.620 Andrea Gregg: I think that... so the website I just created, it's my first time using Divi in WordPress, so I looked at a lot of Divi examples, things that resonated with me. 0:55:59.760,0:56:08.730 Andrea Gregg: But luckily, from doing my just-in-time testing, I found out some features are just not useful to people, so I'm going to get rid of those. 0:56:09.210,0:56:23.490 Andrea Gregg: I think in the learning management system, one of the big takeaways was design within the logic of the learning management system. Yes, designers can tweak things, or work behind the scenes, or figure out how to insert code, 0:56:24.900,0:56:40.560 Andrea Gregg: but at least within Canvas, it's set up to be used in a particular way, and it's not bad; it may not be the prettiest thing or the most exciting thing, but it's pretty easy to get around. 0:56:41.910,0:57:00.120 Andrea Gregg: So I think the basic model is: don't make them think about the stuff they don't need to think about, finding things, naming, etc. They need to think about the content, their assignments, their interactions, all of that. Okay, so we just got the wrap-up message from Matt. 0:57:01.980,0:57:17.490 Andrea Gregg: So, just Martha's question: to use for training courses? I think anything; you can use think-aloud on anything. It's used for product design; like I said, I'm in mechanical engineering now. It's a method used for anything, so it's always helpful. 0:57:18.720,0:57:23.820 Andrea Gregg: Okay, so Matt, I'll hand it back to you in terms of any wrap-up or closing. 0:57:24.780,0:57:33.120 Matthew Schmidt: Outstanding. I don't have very much to wrap up or close with; I just wanted to thank you and all the presenters for this excellent presentation. 0:57:33.480,0:57:44.010 Matthew Schmidt: There's been some outstanding conversation and questions as follow-ups, and many questions continue to feed into the chat. I'm hoping that
0:57:44.640,0:57:54.660 Matthew Schmidt: some of you will reach out to the team and continue these conversations, but we do need to wrap up for today, so thank you everyone for coming. 0:57:54.930,0:58:11.730 Matthew Schmidt: You can expect to see this webinar posted online presently, and there will be more in the learning experience design webinar series, so stay tuned to the usual channels; we'll be sending out information when it becomes available. All right, folks, thanks everyone. 0:58:13.170,0:58:14.520 Andrea Gregg: Okay, thank you guys. 0:58:16.710,0:58:17.210 Andrea Gregg: Were.
Suggested Citation
Gregg, A., Reid, R., Aldemir, T., Garbrick, A., & Gray, J. (2022). LXD Webinar Series - Think-Aloud Methods: Just-in-Time & Systematic Methods to Improve Course Design. Design and Development Chronicles. https://edtechbooks.org/dd_chronicles/lxd_tao
CC BY: This work is released under a CC BY license, which means that
you are free to do with it as you please as long as you properly
attribute it.
Andrea Gregg
Andrea Gregg, PhD works as the Director for Online Pedagogy and Credentialing and is a faculty member in the Department of Mechanical Engineering where she leads the instructional design of online and blended courses; provides guidance for innovative approaches to teaching and learning; and leads the micro-credentialing and badging initiatives within the department. Prior to this, in her role as Associate Director for Research, she provided strategic leadership for applied learning design, educational technology, and online learning research. Gregg’s primary research focus is in understanding learners’ experiences from their perspectives. She has over 15 years in the field of online education. In her previous role, she managed a design team made up of instructional designers and instructional production specialists responsible for roughly 100 unique courses offered online by Penn State. She has published and presented in a variety of educational technology, online and distance learning, and adult learning outlets. Prior to her work in online education, Andrea taught academic for-credit courses in Penn State’s Department of Communication Arts and Sciences.
Ronda Reid
Ronda Reid holds a Master’s in Education from Penn State University and is a certified Project Management Professional (PMP). Her years of work on various websites, where user experience is common practice, led her to believe the same user-focused principles should be applied to online education, where course design problems can inhibit the learning process. She spearheaded the Canvas UX research project at Penn State as the University was switching to that learning management system, served as principal investigator for one of the research efforts, and played an integral role throughout the various stages of the other research projects.
Tugce Aldemir
Tugce Aldemir is a fifth-year PhD candidate in the Learning, Design, and Technology program in the College of Education at the Pennsylvania State University. She is also a graduate research assistant in Teaching and Learning with Technology (TLT) at the Pennsylvania State University. She received her M.S. from the Computer Education and Instructional Technology Department at Middle East Technical University, Turkey. Her research interests are computer-supported collaborative learning, socio-emotional interaction and regulation in collaborative learning environments, socio-metacognition, human-computer interaction, emotionally responsive online learning environments, and cultural competence and collaborative discussions. Her PhD dissertation focuses on developing a new model of socio-emotional competence to inform metacognitive and socio-metacognitive expertise, or modifying the existing model of competence to broaden its scope to entail socio-emotional interactional patterns. Her previous work has centered on a diverse group of topics, including online learning, learner-based UX design, gamification, game theories, and learning experience design for empathy and social connectedness, and she has co-authored journal articles and conference proceedings on these topics.
Amy Garbrick
Amy Garbrick, PhD has over 25 years of experience in education and technology and is the Director of Learning Design in the College of Information Sciences and Technology (IST). Amy manages the design and development efforts of over 100 courses for five award-winning online programs: the Master of Professional Studies (MPS) in IST, MPS in Homeland Security, MPS in Enterprise Architecture, BS in IST, and BS in SRA (Security and Risk Analysis). Amy has taught online, resident, and hybrid undergraduate courses and was the only instructor at Penn State to teach in all of the LMS pilot offerings, using MoodleRooms, Pearson, Desire2Learn, Blackboard (2 separate pilots), Canvas (3 separate pilots), and ANGEL. She has guest taught in numerous other courses, both online and in Resident Instruction (RI). Amy completed her PhD in Learning Design and Technology with a dissertation on improving student engagement in online asynchronous discussion forums, measured by quantity, quality, survey, and Social Network Analysis (SNA).
Jennifer Gray
Jennifer Gray is an instructional designer in the College of Health and Human Development at The Pennsylvania State University. Jennifer earned her MEd in Business Education and has 25 years of experience in business, education, and technology fields. Her experience with learning and UX comes from various professional perspectives. Jennifer’s professional experience includes corporate experience managing people and projects, teaching and working as a technology coordinator in K-12 settings, training educators and collaborating with designers at an educational software company, and as a higher education instructional designer.