    Where Should Educational Technologists Publish Their Research?

    Updated Results From a Survey of Researchers and Professionals
    Ritzhaupt et al. (2012) asked, “Where should educational technologists publish their research?” The question remains relevant for today’s researchers. Most researchers argue that high-caliber, peer-reviewed works should be the benchmark for quality, but which journals are high caliber? In this chapter, we share the results of a survey of professionals and researchers, identifying which journals in the field are most visible and considered most prestigious.

    Over a decade ago, Ritzhaupt et al. (2012) asked, “Where should educational technologists publish their research?” The question remains relevant for today’s researchers, but what factors should inform the answer? Most researchers argue that high-caliber, peer-reviewed works should be the benchmark for quality (e.g., Schimanski & Alperin, 2018), but they also caution that individual metrics, such as the impact factor, can be misleading when used alone to judge a journal (see McKiernan et al., 2019 for more details). With the growing number of publication venues available to educational technology researchers (some lists include more than 100 journals; Ritzhaupt, n.d.), we need guidance beyond acceptance rates and impact factors to assist us in choosing where to share our research (West & Rich, 2012).

    Once research is complete and the manuscript is written, where does it go? How do you find the best publication outlet for your work? And why does the choice matter? Identifying the right outlet for a manuscript can challenge even a seasoned author, so the task is all the more daunting for a novice researcher, and it can lead to questionable journal choices. Frandsen (2019) reviewed the literature on the causes of poor journal selection and highlighted motivational factors and lack of awareness as the primary causes. Kurt (2018) found that pressure to publish, general unawareness of quality publication channels, and fears of social misperception or academic inadequacy lead novice researchers to publish in predatory journals. These types of journals “prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices” (Grudniewicz et al., 2019, p. 211). Al-Khatib (2016) suggested that publication in poor or predatory journals can leave “negative scars” on career advancement (e.g., tenure and promotion), meaning novice researchers may face long-term consequences for publishing in a predatory journal.

    Purpose of this Chapter

    Understanding the value of publication outlets, particularly peer-reviewed journals, can play a critical role in the careers of new scholars. As explained by Ritzhaupt et al. (2012) in their earlier examination of these factors, the academic career path requires justification of publication choices throughout. Yet, as noted earlier, there are arguably over 100 journals in the field of educational technology (Ritzhaupt, n.d.), covering topics from educational pedagogy to neuroscience. This can make selecting the “right” journal challenging at best. Novice scholars need to clearly understand where to publish, and faculty who are seeking promotion and tenure need data to explain why their work is valuable to their discipline. The earlier study by Ritzhaupt et al. (2012) filled this void but is now dated. To update their work, we answer the following questions in this chapter:

    1. Why do educational technologists choose particular publication venues?
    2. How prestigious and visible are different peer-reviewed educational technology journals?

    Research Methods


    As with Ritzhaupt et al. (2012), we designed a survey that collected background information from respondents and asked about their beliefs regarding the relative importance of different factors for selecting a publication venue, as well as their perceptions of the prestige and visibility of journals in the field. Demographic questions covered gender, years in the field, academic rank, service in the field of educational technology, ethnicity, highest degree earned, professional affiliations, and country of residence.

    Decision-making factors influencing publication choices were rated on a five-point Likert scale ranging from Unimportant to Maximal Importance. Responses were coded from zero (0) to four (4) so that a factor rated “unimportant” contributed no weight to the mean score. We included 11 decision factors to be consistent with Ritzhaupt et al. (2012).
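    As a concrete illustration, the coding scheme above can be sketched in a few lines of Python. Note that the chapter specifies only the endpoint labels (Unimportant, Maximal Importance); the intermediate label names below are our assumption for illustration.

```python
# Map five-point Likert labels to 0-4 so that "Unimportant" contributes
# no weight to a decision factor's mean score. Intermediate label names
# are assumed, not taken from the survey instrument.
SCALE = {
    "Unimportant": 0,
    "Minimal Importance": 1,
    "Moderate Importance": 2,
    "High Importance": 3,
    "Maximal Importance": 4,
}

def mean_importance(responses):
    """Mean coded score for one decision factor across respondents."""
    coded = [SCALE[r] for r in responses]
    return sum(coded) / len(coded)

# Four hypothetical respondents rating one factor:
ratings = ["Maximal Importance", "Moderate Importance",
           "Unimportant", "High Importance"]
print(mean_importance(ratings))  # (4 + 2 + 0 + 3) / 4 = 2.25
```

    A factor's mean score therefore falls between 0 and 4, matching the ranges reported in Table 4.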

    Perceived prestige data were collected both qualitatively and quantitatively. The qualitative data were collected via text entry: five questions asked participants to rank educational technology journals at five levels, from the “top” journal in the field down to the “fifth highest ranked” journal. It is important to note that answers were based on each respondent’s personal knowledge of journals in the field and their perception of those journals’ value and prestige. Respondents may not have known about other high-quality journals, and thus our study measures perceived visibility and prestige.

    We measured perceived prestige quantitatively by first curating a list of journals in the field consistent with the list in Ritzhaupt et al. (2012). We then asked respondents to score each journal on a six-point scale from Lacks Prestige to Exceptionally Prestigious, with an additional low-end option of Irrelevant/Unknown to Respondent (coded as zero). The full seven-point scale was analyzed to assess each journal’s perceived prestige, while the percentage of respondents who did not mark a journal as irrelevant/unknown was calculated to indicate its visibility. The journal prestige portion of the instrument had a Cronbach’s α = 0.974.

    Participant Recruitment and Survey Administration

    Participants were recruited via social media groups and listservs for members of the Association for Educational Communications and Technology (AECT) and the American Educational Research Association’s (AERA) Special Interest Groups (SIGs) on Instructional Technology and Technology as an Agent of Change in Teaching and Learning (TACTL). The survey was open from December 16, 2022, through January 18, 2023. A total of 101 participants completed the survey. Survey instances with missing responses to the questions on journal selection criteria or journal prestige were deleted, removing 29 participants and leaving 72 completed responses for analysis.

    Table 1 summarizes the roles of the respondents, with the majority self-identifying either as some type of professor (i.e., assistant, associate, or full professor) or as a graduate student, which also accounts for the majority holding a doctoral degree (Table 2). Participants’ mean length of activity in the field of educational technology was 15.94 years, with a median of 12.5 years. Table 3 and the histogram in Figure 1 provide a full descriptive summary.

    Table 1. Academic Position
    Position                n       %
    Full Professor          12      16.67
    Associate Professor     11      15.26
    Assistant Professor     16      22.22
    Non-Academic Role       7       9.72
    Graduate Student        12      16.67
    Table 2. Highest Earned Degree
    Table 3. Years in Educational Technology
    Statistic           Years in Field
    Valid N             72
    Std. Deviation      11.94
    Figure 1. Years in Educational Technology (histogram of the data in Table 3)

    Data Analysis

    From the seventy-two retained responses, the qualitative data on academic prestige were analyzed first. Fifty-nine (n = 59) distinct journals were identified as ranking in the “Top 5” journals for “academic prestige” by the survey participants. The remaining data (the factors leading to publication venue selection, perceived prestige, and visibility) were analyzed using basic descriptive statistics (e.g., M, SD, %).


    Results

    The two questions we addressed through this work were:

    1. Why do educational technologists choose particular publication venues?
    2. How prestigious and visible are different peer-reviewed educational technology journals?

    In the following sections, we first summarize and analyze the specific criteria academics use to support publication decisions. Then, we look at the perceived prestige and comparative visibility, as well as the major criteria used to assess journals.

    Decision Factors

    First, we wanted to understand which factors influenced practicing educational technology researchers’ decisions of where to publish. We included 11 potential factors that were rated using a five-point scale from zero (Unimportant) to four (Maximal Importance). Overall, these researchers felt it was most important to find a good “fit” with the manuscript, followed by journal ranking and research accessibility (see Table 4).

    Table 4. Importance of Factors in Selecting Publication Venue
    Decision Factor             Current M (SD)    Importance    Ritzhaupt et al. (2012) M (SD)    Shift in Rank
    Manuscript Fit              3.35 (0.74)       Maximal       4.66 (0.62)                       ---
    Journal Rank                2.67 (0.98)       Moderate      3.59 (1.26)                       ↑2
    Impact Factor               2.58 (1.05)       Moderate      3.46 (1.26)                       ↑2
    Journal Indexing            2.42 (1.14)       Moderate      3.54 (1.14)                       ---
    Acceptance Rate             2.29 (0.90)       Moderate      3.76 (1.06)                       ↓4
    Publishing Association      2.13 (0.98)       Moderate      3.25 (1.21)                       ---
    Frequency of Publication    1.97 (0.93)       Minimal       2.71 (1.11)                       ↑2
    Editorial Review Board      1.83 (0.87)       Minimal       2.85 (1.16)                       ↓1


    Journal Prestige

    The perceived prestige of publications in the domain of educational technology was evaluated using a pair of metrics. First, each participant was asked to list the Top 5 journals in the field from their perspective. Journals listed by name were counted across all five ranks to produce a “mentions” count. Each mention was then weighted by rank, with the “top” journal receiving a weight of five (5), the second highest ranked a four (4), and so on down to a weight of one (1). Table 5 presents the results ordered by weighted score.

    Table 5. Perceived Prestige by Personal Ranking
    Journal                                                                 Mentions    Weighted Score
    Educational Technology Research and Development (ETRD)                  50          199
    British Journal of Educational Technology (BJET)                        37          131
    Computers & Education (C&E)                                             31          131
    The Internet and Higher Education                                       12          33
    Journal of Research on Technology in Education (JRTE)                   10          26
    Journal of Applied Instructional Design (JAID)                          8           24
    Distance Education                                                      8           18
    Educational Technology & Society                                        7           18
    Online Learning                                                         5           18
    Computers in Human Behavior                                             6           16
    Journal of Computing in Higher Education                                6           13
    Journal of Technology and Teacher Education (JTATE)                     3           11
    The International Review of Research in Open and Distributed Learning   5           10
    International Journal of Educational Technology in Higher Education     4           7
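    The mentions-and-weights scheme described above can be sketched as follows. The journal names and respondent lists here are illustrative only, not the study’s raw data:

```python
from collections import defaultdict

def weighted_scores(rankings):
    """Tally mentions and rank-weighted scores from per-respondent
    "Top 5" lists, where index 0 is the top journal (weight 5) and
    index 4 the fifth-ranked journal (weight 1)."""
    mentions = defaultdict(int)
    weighted = defaultdict(int)
    for ranked_list in rankings:
        for position, journal in enumerate(ranked_list):
            mentions[journal] += 1
            weighted[journal] += 5 - position  # rank 1 -> 5, ..., rank 5 -> 1
    return mentions, weighted

# Two hypothetical respondents' (partial) Top 5 lists:
responses = [["ETRD", "BJET", "C&E"], ["BJET", "ETRD"]]
mentions, weighted = weighted_scores(responses)
print(mentions["ETRD"], weighted["ETRD"])  # 2 9  (weights 5 + 4)
```

    A journal mentioned often but only in lower slots can therefore trail a journal mentioned less often but consistently at the top, which is why Table 5 reports both counts.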

    The second prestige measure came from the pre-generated list of field-specific journals. Each journal’s prestige ratings were averaged, and the journals were then rank ordered by mean score. Table 6 presents the results. If a journal was ranked in the earlier article by Ritzhaupt et al. (2012), that ranking is also provided in the table. Differences between the two lists arise because Table 5 reflects what scholars recalled without prompting, while Table 6 reflects how they rated a pre-determined list of journals. Notably, the top three journals in perceived prestige remained the same.

    Table 6. Journals by Perceived Prestige
    Journal Title                                                           Mean    SD      2012 Ranking
    Educational Technology Research and Development (ETRD)                  4.50    1.93    1
    British Journal of Educational Technology (BJET)                        4.32    1.89    2
    Computers & Education (C&E)                                             3.97    2.13    3
    The Internet and Higher Education                                       3.18    2.25    ---
    Distance Education                                                      3.07    2.00    4
    Journal of Computing in Higher Education                                3.00    1.88    7
    IEEE Transactions on Learning Technologies                              2.81    2.11    ---
    American Journal of Distance Education                                  2.74    1.65    5
    The International Review of Research in Open and Distributed Learning   2.74    2.11    ---
    Computers in Human Behavior                                             2.68    2.28    ---
    Journal of Computer Assisted Learning                                   2.65    1.97    ---
    Journal of Research on Technology in Education (JRTE)                   2.65    2.27    6
    Australasian Journal of Educational Technology                          2.63    1.73    ---
    Educational Technology & Society                                        2.56    1.93    9
    International Journal of Educational Technology in Higher Education     2.54    2.23    ---
    Association of the Advancement of Computing in Education                2.40    1.73    ---
    International Journal of Computer-Supported Collaborative Learning      2.36    2.06    ---
    Computer Assisted Language Learning                                     2.28    1.96    ---
    Journal of Educational Computing Research                               2.21    2.03    ---

    Journal Visibility

    Journal visibility was computed as the percentage of respondents who rated a journal at all rather than marking it irrelevant/unknown. For example, each respondent who marks irrelevant/unknown for Computers & Education lowers that journal’s overall visibility score. Table 7 shows the upper quartile of the results. Again, for journals ranked in the 2012 article by Ritzhaupt et al., the 2012 ranking is provided for comparison. While some shifting occurred in rank, the most visible journals overall remained the same. Additional influences on these ranks are considered later in the discussion.

    Table 7. Journal Visibility
    Journal                                                                 Percentage  2012 Ranking
    British Journal of Educational Technology (BJET)                        90.28       2
    Educational Technology Research and Development (ETRD)                  88.89       1
    Computers & Education                                                   86.11       3
    American Journal of Distance Education                                  81.94       ---
    Australasian Journal of Educational Technology (AJET)                   80.56       6
    Distance Education                                                      79.17       8
    Journal of Computing in Higher Education                                79.17       5
    Association of the Advancement of Computing in Education Journal        76.39       9
    The Internet and Higher Education                                       73.61       ---
    Educational Technology & Society                                        72.22       ---
    Computers in Human Behavior                                             70.83       ---
    IEEE Transactions on Learning Technologies                              70.83       ---
    International Journal of Technology Enhanced Learning                   70.83       ---
    The International Review of Research in Open and Distributed Learning   70.83       ---
    Turkish Online Journal of Distance Education                            70.83       ---
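    The visibility computation described above amounts to the share of respondents who recognized and rated a journal. A minimal sketch, using illustrative ratings rather than study data:

```python
def visibility(ratings):
    """Percentage of respondents who did NOT mark the journal as
    Irrelevant/Unknown (coded 0); values 1-6 are prestige ratings."""
    known = sum(1 for r in ratings if r != 0)
    return 100 * known / len(ratings)

# 6 of 8 hypothetical respondents recognized the journal:
print(visibility([0, 4, 5, 0, 6, 3, 2, 1]))  # 75.0
```

    Because the same ratings drive both measures, a journal can be highly visible (rarely marked unknown) yet have a middling mean prestige score, which explains some of the ordering differences between Tables 6 and 7.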

    Impact Factor

    Given the importance of impact factor in our results, examining various impact metrics can help in determining both the prestige and visibility of a journal. We therefore collected the impact factor, five-year impact factor, and CiteScore for the journals participants identified as most prestigious, presented here in the order of prestige from Table 6. Each score was hand collected from the journal’s website in February 2022. Not all journals presented this information on their site; missing metrics are listed as not available (NA). Table 8 summarizes the information.

    Table 8. Top Journal Impact Metrics
    Journal Title                                                           Impact Factor (IF)  Five-Year IF    CiteScore
    Educational Technology Research and Development (ETRD)                  5.580               5.613           5.4
    British Journal of Educational Technology (BJET)                        2.588               NA              9.6
    Computers & Education (C&E)                                             13.71               NA              19.8
    The Internet and Higher Education                                       8.591               NA              NA
    Distance Education                                                      5.500               5.007           7.2
    Journal of Computing in Higher Education                                4.045               4.748           NA
    IEEE Transactions on Learning Technologies                              4.433               NA              7.4
    American Journal of Distance Education                                  NA                  NA              4.7
    The International Review of Research in Open and Distributed Learning   0.734               NA              NA
    Computers in Human Behavior                                             8.597               NA              14.9
    Journal of Computer Assisted Learning                                   3.761               NA              NA
    Journal of Research on Technology in Education (JRTE)                   3.281               3.473           4.6
    Australasian Journal of Educational Technology                          3.067               NA              5.5
    Educational Technology and Society                                      2.633               4.358           NA
    International Journal of Educational Technology in Higher Education     7.611               7.826           11.8
    Association of the Advancement of Computing in Education Journal        NA                  NA              NA
    International Journal of Computer-Supported Collaborative Learning      5.611               5.685           NA
    Computer Assisted Language Learning                                     5.964               5.937           8.4
    Journal of Educational Computing Research                               4.345               3.786           7.2


    Discussion

    Our first research question asked about the relative importance of factors leading to publication venue selection by educational technologists. Knight and Steinbach (2008) identified “five major considerations” when selecting a journal. The likelihood of manuscript selection (i.e., “fit”) should be the primary factor, according to Knight and Steinbach (2008, p. 62). The second most important factor is journal reputation or prestige, which is an amalgamation of age, circulation, acceptance rate, affiliation, citation generation, editorial staff, and impact (Knight & Steinbach, 2008). Relatedly, journal visibility and publication impact form the third factor. The last two considerations concern how long publication takes and the potential existence of issues regarding ethics or personal philosophies. Each of these factors was considered both here and in the 2012 article. Our method of data collection does not allow us to state definitively how much each decision-making factor influenced respondents’ choices. However, the perceived prestige rankings (see Table 5) derived from the cued-recall questions on personal journal rankings can provide some insight into the perceived value of the journals.

    Consistent with both this study and prior research (Ritzhaupt et al., 2012), participants highlighted journal accessibility as a key factor in determining where to seek publication. One participant highlighted the issue, stating:

    We are lacking a well-indexed and open-access prestigious journal. I cannot [sic] recommend journals that are not openly accessible across the globe. I realize there is an ethical debate regarding tenure, promotion, workload, and institutional funding. I feel that academics have a responsibility to help change the model and party line so that there are no barriers to publishing in open-access journals.

    Accessibility can take multiple forms, but “open” accessibility that keeps science moving freely appears to be a topic of serious, ongoing consideration across domains (Berkowitz & Delacour, 2020; Matthias et al., 2019; Willinsky, 2018).

    In terms of perceived prestige, our results show that the placement of the top journals remains consistent with the work of Ritzhaupt et al. (2012), namely: (a) Educational Technology Research and Development, (b) British Journal of Educational Technology, and (c) Computers & Education. While other academic journals in educational technology shifted in perceived prestige, these three journals held their positions. Interestingly, TechTrends, another official journal of the Association for Educational Communications and Technology (AECT), moved to the first position in perceived visibility according to our survey respondents. As mentioned in our methods section, this result may reflect sampling bias, as respondents were mostly recruited from AECT groups and listservs. Still, it indicates strong visibility for this journal.

    While understanding perceived prestige and visibility can help researchers focus on the best venues for publication, focused examination of research networks can also inform publication selection. Journal-level analyses using bibliometric and network analytic methodologies can illustrate a journal’s impact and provide insights for researchers deciding on outlets for publication (Mering, 2017); as such, these analyses are highly useful for understanding the nature of the field. Major educational technology journals reviewed in this way include the Australasian Journal of Educational Technology (AJET) (Bond & Buntins, 2018), Computers in Human Behavior (CHB) (Chen et al., 2021; Vošner et al., 2016), Computers & Education (C&E) (Chen et al., 2019), British Journal of Educational Technology (BJET) (Chen et al., 2020), and the Journal of Research on Technology in Education (JRTE) (Wilson, 2022). These bibliometric studies help illustrate the literature of educational technology in a manner that is useful when determining the impact and fit of journals.

    Selecting an appropriate journal for your research can be of paramount concern for emerging scholars in the field of educational technology. Journal prestige and visibility can influence tenure and promotion decisions, along with the overall assessment of a scholar’s academic performance and productivity (Frandsen, 2019; Hannafin, 1991; Kurt, 2018). While we do not claim the results from this small snapshot are conclusive for the domain of educational technology, they are consistent with prior research (Ritzhaupt et al., 2012). This examination of prestige and visibility continues the conversation among scholars in our community about these topics as they pertain to educational technology journals, and about prestige and visibility as relevant factors in choosing where to place our work. Considerations beyond mere impact factors should be used in judging the merits of a publication outlet (West & Rich, 2012).


    Limitations

    The results of this research should be considered in light of several limitations. While the survey was widely disseminated via social media, research associations, and networking, over 94% of respondents indicated membership in the Association for Educational Communications and Technology (AECT). Because journals with AECT affiliations (i.e., ETRD and TechTrends) were included in the study, this could have influenced the perceived prestige and visibility of those journals in the resulting rankings. Similarly, fewer than 14% of respondents resided outside the United States, which could have affected the perception of journals originating outside the U.S. on both scales.

    While the survey was intentionally designed to capture information on visibility, prestige, and the importance of selection factors, the results cannot definitively verify the nature of the responses. Clarity in the response process appeared to vary across participants: many respondents entered associations or “not sure” in the personal rankings of field journals. Nor can it be confirmed that all participants knew conclusively which journal they were rating. For example, the American Journal of Distance Education could easily be confused with Distance Education, and vice versa. The results may have been affected accordingly.

    Finally, human error is possible in any study. The journals selected for consideration were likely the most highly rated journals in the field, so some journals could have been left out of consideration. However, this limitation is mitigated by the open-ended self-report items within the survey.


    References

    Al-Khatib, A. (2016). Protecting authors from predatory journals and publishers. Publishing Research Quarterly, 32, 281–285. https://doi.org/10.1007/s12109-016-9474-3

    Berkowitz, H., & Delacour, H. (2020). Sustainable academia: Open, engaged, and slow science. M@n@gement, 23(1), 1–3. https://doi.org/10.37725/mgmt.v23.4474

    Billings, C., Nielsen, P. L., Snyder, A., Sorensen, A., & West, R. E. (2012). Journal of Research on Technology in Education, 2001–2010. Educational Technology, 37–41. https://www.jstor.org/stable/44430057

    Bodily, R., Leary, H., & West, R. E. (2019). Research trends in instructional design and technology journals. British Journal of Educational Technology, 50(1), 64–79. https://doi.org/10.1111/bjet.12712

    Bond, M., & Buntins, K. (2018). An analysis of the Australasian Journal of Educational Technology 2013–2017. Australasian Journal of Educational Technology, 34(4). https://doi.org/10.14742/ajet.4359

    Cabells. (2022). Journalytics. Retrieved December from https://www2.cabells.com/journalytics

    Chen, X., Yu, G., Cheng, G., & Hao, T. (2019). Research topics, author profiles, and collaboration networks in the top-ranked journal on educational technology over the past 40 years: A bibliometric analysis. Journal of Computers in Education, 6(4), 563–585. https://doi.org/10.1007/s40692-019-00149-1

    Chen, X., Zou, D., & Xie, H. (2020). Fifty years of British Journal of Educational Technology: A topic modeling based bibliometric perspective. British Journal of Educational Technology, 51(3), 692–708. https://doi.org/10.1111/bjet.12907

    Chen, X., Zou, D., Xie, H., & Cheng, G. (2021). A topic-based bibliometric review of computers in human behavior: Contributors, collaborations, and research topics. Sustainability, 13(9), 4859. https://doi.org/10.3390/su13094859

    Frandsen, T. F. (2019). Why do researchers decide to publish in questionable journals? A review of the literature. Learned Publishing, 32(1), 57–62. https://doi.org/10.1002/leap.1214

    Grudniewicz, A., Moher, D., Cobey, K. D., Bryson, G. L., Cukier, S., Allen, K., Ardern, C., Balcom, L., Barros, T., Berger, M., Ciro, J. B., Cugusi, L., Donaldson, M. R., Egger, M., Graham, I. D., Hodgkinson, M., Khan, K. M., Mabizela, M., Manca, A., ... Lalu, M. M. (2019). Predatory journals: No definition, no defence. Nature, 576(7786), 210–212.

    Hannafin, K. M. (1991). An analysis of the scholarly productivity of instructional technology faculty. Educational Technology Research and Development, 39(2), 39–42.

    Harris, J., Foulger, T. S., Huijser, H., & Phillips, M. (2019). Goldilocks and journal publication: Finding a fit that’s “just right”. Australasian Journal of Educational Technology, 35(4).

    Knight, L. V., & Steinbach, T. A. (2008). Selecting an appropriate publication outlet: A comprehensive model of journal selection criteria for researchers in a broad range of academic disciplines. International Journal of Doctoral Studies, 3, 59–79. https://doi.org/10.28945/51

    Kurt, S. (2018). Why do authors publish in predatory journals? Learned Publishing, 31(2), 141–147. https://doi.org/10.1002/leap.1150

    Matthias, L., Jahn, N., & Laakso, M. (2019). The two-way street of open access journal publishing: Flip it and reverse it. Publications, 7(2), 23. https://doi.org/10.3390/publications7020023

    McKiernan, E. C., Schimanski, L. A., Muñoz Nieves, C., Matthias, L., Niles, M. T., & Alperin, J. P. (2019). Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations. eLife, 8, e47338. https://doi.org/10.7554/eLife.47338

    Mering, M. (2017). Bibliometrics: Understanding author-, article- and journal-level metrics. Serials Review, 43(1), 41–45. https://doi.org/10.1080/00987913.2017.1282288

    Mertala, P., Moens, E., & Teräs, M. (2022). Highly cited educational technology journal articles: A descriptive and critical analysis. Learning, Media and Technology, 1–14. https://doi.org/10.1080/17439884.2022.2141253

    Önder, Ç., & Erdil, S. E. (2017). Opportunities and opportunism: Publication outlet selection under pressure to increase research productivity. Research Evaluation, 26(2), 66–77. https://doi.org/10.1093/reseval/rvx006

    Perkins, R. A., & Lowenthal, P. R. (2016). Open access journals in educational technology: Results of a survey of experienced users. Australasian Journal of Educational Technology, 32(3), 18–37. https://doi.org/10.14742/ajet.2578

    Ritzhaupt, A. D. (n. d.). Select list of educational technology journals. Available at: https://aritzhaupt.com/resources/ed-tech-journals/

    Ritzhaupt, A. D., Sessums, C. D., & Johnson, M. C. (2012). Where should educational technologists publish their research? An examination of peer-reviewed journals within the field of educational technology and factors influencing publication choice. Educational Technology, 52(6), 47. https://www.jstor.org/stable/44430204

    Schimanski, L. A., & Alperin, J. P. (2018). The evaluation of scholarship in academic promotion and tenure processes: Past, present, and future. F1000Research, 7. https://doi.org/10.12688/f1000research.16493.1

    Stoilescu, D., & McDougall, D. (2010). Starting to publish academic research as a doctoral student. International Journal of Doctoral Studies, 5, 79.

    Vošner, H. B., Kokol, P., Bobek, S., Železnik, D., & Završnik, J. (2016). A bibliometric retrospective of the Journal Computers in Human Behavior (1991–2015). Computers in Human Behavior, 65, 46–58. https://doi.org/10.1016/j.chb.2016.08.026

    West, R. E., & Rich, P. J. (2012). Rigor, impact and prestige: A proposed framework for evaluating scholarly publications. Innovative Higher Education, 37, 359–371. https://doi.org/10.1007/s10755-012-9214-3

    Wijewickrema, M., & Petras, V. (2017). Journal selection criteria in an open access environment: A comparison between the medicine and social sciences. Learned Publishing, 30(4), 289–300. https://doi.org/10.1002/leap.1113

    Willinsky, J. (2018). Scholarly associations and the economic viability of open access publishing. Health Sciences: An OJS Theme Demo, 1(2). https://demo.publicknowledgeproject.org/ojs3/demo/index.php/health-sciences/article/view/693

    Wilson, M. L. (2022). Topics, author profiles, and collaboration networks in the Journal of Research on Technology in Education: A bibliometric analysis of 20 years of research. Journal of Research on Technology in Education, 1–23. https://doi.org/10.1080/15391523.2022.2134236

    Matthew L. Wilson

    Kennesaw State University

    Dr. Matthew L. Wilson is an Assistant Professor of Instructional Technology in the School of Instructional Technology and Innovation in the Bagwell College of Education at Kennesaw State University. He graduated from the University of Florida with a Ph.D. in Curriculum and Instruction focusing on Educational Technology, with a minor in Research and Evaluation Methodology (REM). His areas of interest include digital citizenship, the digital divide, technology-related anxieties, and technology integration in P12 classrooms. He currently teaches teacher preparation courses at the undergraduate and graduate levels focusing on technology integration and research methodology. Dr. Wilson serves the field of educational technology as a leader in AECT’s Teacher Education Division and AERA’s SIG-TACTL, as well as editing for the Journal of Research on Technology in Education (JRTE).
    Albert D. Ritzhaupt

    University of Florida

    Dr. Albert D. Ritzhaupt is a Professor of Educational Technology and Computer Science Education, and the Associate Director for Graduate Studies in the School of Teaching and Learning at the University of Florida. Dr. Ritzhaupt formerly served as the Program Coordinator for the Educational Technology program. Dr. Ritzhaupt is an accomplished educational researcher and technologist. An award-winning researcher, Dr. Ritzhaupt has published more than 100 journal articles, book chapters, technical reports, and conference proceedings; and has presented his research at numerous state, national, and international conferences. His primary research areas focus on the design, development, utilization, and evaluation of theory-inspired, technology-enhanced learning environments; teaching practices and instructional strategies in computer and information sciences education; operationalizing and measuring technology integration in education, particularly focusing on the factors that facilitate and hinder technology use in formal educational settings; and the professional competencies of individuals in the field of educational technology. Dr. Ritzhaupt was identified as one of the world’s most prolific scholarly authors in the field of educational technology from 2007 to 2017 (Bodily, Leary, & West, 2019). Dr. Ritzhaupt was awarded the 2016 UF Research Foundation Professors distinction, a term-limited professorship awarded to tenured professors who have a distinguished current record of research. In 2019, the Florida Educational Research Association selected Dr. Ritzhaupt as the Educational Researcher of the Year.

    Dr. Ritzhaupt has been funded by the Florida Department of Education (FLDOE), National Institutes of Health (NIH), and National Science Foundation (NSF) to support his research endeavors. His publications have appeared in multiple leading venues, including the Journal of Research on Technology in Education, Educational Technology Research and Development, Journal of Computing in Higher Education, Computers & Education, Journal of Educational Computing Research, and Computers in Human Behavior. Dr. Ritzhaupt has been cited and interviewed in a wide range of news media outlets, such as Wired, National Public Radio, The Chronicle of Higher Education, and Education Week, for his research. Dr. Ritzhaupt is Editor of the Journal of Research on Technology in Education (JRTE), the flagship research journal of the International Society for Technology in Education (ISTE); the former Associate Editor of the Journal of Educational Computing Research, a social-science indexed Sage publication; a consulting editor for Educational Technology Research and Development; a member of the scientific board of Computers in Human Behavior; and a reviewer for several other journals. Dr. Ritzhaupt regularly attends and presents at the American Educational Research Association (AERA), ISTE, Florida Educational Research Association (FERA), and the Association for Educational Communications and Technology (AECT). In terms of professional service and leadership, Dr. Ritzhaupt has served as Chair of the Games and Simulations Special Interest Group (SIG) in ISTE, past-Chair of SIG Instructional Technology in AERA, past-President and Director of FERA, and past-President of the Design and Development Division within AECT. Dr. Ritzhaupt is an active member of the educational technology and educational research community both locally and nationally.

    Inspired by multiple perspectives, Dr. Ritzhaupt employs a variety of instructional methods in his teaching practice, and he consistently receives high teaching scores from students. Dr. Ritzhaupt teaches a wide range of courses for the University of Florida, including topics such as software design and development for educational purposes, multimedia design and development, educational games and simulations, traditional and agile project management, introductory and advanced instructional design, educational research design and applied statistical methods, and advanced seminar courses for doctoral students in the field of Educational Technology. Previously, Dr. Ritzhaupt taught computer science courses for both community colleges and universities in Florida, particularly in the areas of software design and development. Dr. Ritzhaupt is a Level 2 Google Certified Educator (GCE) and a Microsoft Certified Educator (MCE), and commonly uses emerging technologies in his teaching practice. Dr. Ritzhaupt considers his greatest scholarly contribution to be the more than 20 doctoral students he has mentored to completion in the Educational Technology program.

    This content is provided to you freely by EdTech Books.

    Access it online or download it at https://edtechbooks.org/becoming_an_lidt_pro/publishing_research.