Ritzhaupt et al. (2012) asked, “Where should educational technologists publish their research?” This question remains relevant for today’s researchers. Most researchers argue that high caliber, peer-reviewed works should be the benchmark for quality. But which journals are high caliber? In this chapter, we share results of a survey of professionals and researchers, identifying which journals in the field are most visible and considered prestigious.
Over a decade ago, Ritzhaupt et al. (2012) asked, “Where should educational technologists publish their research?” This question remains relevant for today’s researchers. However, what factors should inform the answer? Most researchers argue that high-caliber, peer-reviewed works should be the benchmark for quality (e.g., Schimanski & Alperin, 2018), but they also caution that individual metrics, such as the impact factor, can be problematic when used as the sole basis for judgment (see McKiernan et al., 2019, for more details). With the increasing number of publication venues available to educational technology researchers (e.g., some lists include more than 100 journals; Ritzhaupt, n.d.), we need guidance beyond acceptance rates and impact factors to assist us in choosing where to share our research (West & Rich, 2012).
Once research is complete and the manuscript is written, where does it go? How do you find the best publication outlet for your work? And why is this important? Understanding the right outlet for a manuscript can challenge even a seasoned author; for a novice researcher, the task of picking a venue is even more daunting, which can lead to questionable choices in selecting a journal for submission. Frandsen (2019) reviewed the literature on the causes of poor journal selection and highlighted motivational factors and a lack of awareness as the main causes. Kurt (2018) found that the pressure to publish, general unawareness of quality publication channels, and/or fears of social misperception or academic inadequacy lead novice researchers to publish in predatory journals. These types of journals “prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices” (Grudniewicz et al., 2019, p. 211). As such, Al-Khatib (2016) suggested that publication in poor or predatory journals can leave “negative scars” on career advancement (e.g., tenure and promotion), meaning novice researchers may face long-term consequences for publishing in a predatory journal.
Purpose of this Chapter
Understanding the value of publication outlets, particularly peer-reviewed journals, can play a critical role in the careers of new scholars. As Ritzhaupt et al. (2012) explained in their earlier examination of these factors, the academic career path requires justification of publication choices throughout. Yet, as noted earlier, there are arguably over 100 journals in the field of educational technology (Ritzhaupt, n.d.), covering topics from educational pedagogy to neuroscience, which can make selecting the “right” journal challenging at best. Novice scholars need to clearly understand where to publish, and faculty seeking promotion and tenure need data to explain why their work is valuable to their discipline. The earlier study by Ritzhaupt et al. (2012) filled this void but is now dated. To update their work, we answer the following questions in this chapter:
- Why do educational technologists choose particular publication venues?
- How prestigious and visible are different peer-reviewed educational technology journals?
As with Ritzhaupt et al. (2012), we designed a survey to collect background information from respondents and asked them their beliefs about the relative importance of different factors for selecting a publication venue, and their perceptions of the prestige and visibility of journals in the field. Respondent demographic questions included gender, years in the field, academic rank, service in the field of educational technology, ethnicity, highest degree earned, professional affiliations, and country of residence.
Decision-making factors influencing publication choices were rated on a five-point Likert scale ranging from Unimportant to Maximal Importance. Responses were coded for analysis from zero (0) to four (4) so that factors rated “unimportant” by a respondent contribute no weight to the mean score. We included 11 decision factors, consistent with Ritzhaupt et al. (2012).
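As a minimal sketch of this coding scheme, the mean score for one decision factor can be computed as follows. Only the endpoint labels (Unimportant, Maximal Importance) come from the survey; the intermediate labels and data below are illustrative assumptions.

```python
# Hypothetical sketch of the 0-4 coding described above.
SCALE = {
    "Unimportant": 0,           # contributes no weight to the mean
    "Of Little Importance": 1,  # assumed intermediate label
    "Moderately Important": 2,  # assumed intermediate label
    "Important": 3,             # assumed intermediate label
    "Maximal Importance": 4,
}

def mean_importance(responses):
    """Mean coded score for one decision factor across respondents."""
    codes = [SCALE[r] for r in responses]
    return sum(codes) / len(codes)

# Illustrative responses from four respondents for a single factor:
ratings = ["Important", "Unimportant", "Maximal Importance", "Moderately Important"]
print(mean_importance(ratings))  # (3 + 0 + 4 + 2) / 4 = 2.25
```

Because “Unimportant” maps to zero, a factor that many respondents dismiss is pulled toward the bottom of the 0–4 range, as intended by the coding decision described above.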
Perceived prestige data was collected both qualitatively and quantitatively. The qualitative data was collected via text entry. To collect this data, five questions asked the participant to enter, from their perspective, the ranking of educational technology journals at five levels, starting with the “top” journal in the field down to the “fifth highest ranked” journal. It is important to note that answers were based on the respondent’s personal knowledge of journals in the field and the perception of their value and prestige. In other words, they may not have known about other high-quality journals, and thus our study is about perceived visibility/prestige.
We quantitatively measured perceived prestige by first curating a list of journals in the field consistent with the list in Ritzhaupt et al. (2012). We then asked respondents to score each journal on a six-point scale from Lacks Prestige to Exceptionally Prestigious, with an additional low-end option of Irrelevant/Unknown to Respondent (coded as zero). The full seven-point coding (0–6) was analyzed to assess each journal’s perceived prestige, while the complement of the percentage of irrelevant/unknown ratings was calculated to indicate a journal’s visibility. The journal prestige portion of the instrument had a Cronbach’s α = 0.974.
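A sketch of how the prestige mean and the reliability estimate can be computed, assuming ratings are coded 0 (Irrelevant/Unknown) through 6 (Exceptionally Prestigious); the data below are illustrative, not the study's data:

```python
def mean_prestige(ratings):
    """Mean over the full 0-6 coding, including 0 (Irrelevant/Unknown)."""
    return sum(ratings) / len(ratings)

def cronbach_alpha(items):
    """Cronbach's alpha for a list of items (one rating list per journal,
    aligned by respondent), using population variances."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    n = len(items[0])
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Illustrative data: three respondents rating two journals.
journal_a = [6, 5, 4]
journal_b = [5, 4, 3]
print(mean_prestige(journal_a))                 # 5.0
print(cronbach_alpha([journal_a, journal_b]))   # perfectly covarying items -> 1.0
```

A high alpha (such as the 0.974 reported above) indicates that respondents who rated one journal highly tended to rate the other journals in a consistent pattern.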
Participant Recruitment and Survey Administration
Participants were recruited via social media groups and listservs for members of the Association for Educational Communications and Technology (AECT) and two of the American Educational Research Association’s (AERA) Special Interest Groups (SIGs): Instructional Technology and TACTL (Technology as an Agent of Change in Teaching and Learning). The survey was open from December 16, 2022, through January 18, 2023. A total of 101 participants completed the survey. Responses with missing answers to the questions on journal selection criteria or journal prestige were deleted, removing 29 participants and leaving 72 completed responses for analysis.
Table 1 summarizes the roles of the respondents: the majority self-identified either as a professor of some rank (i.e., assistant, associate, or full), which also explains why most held a doctoral degree (Table 2), or as a graduate student. Participants’ mean length of activity in the field of educational technology was 15.94 years, with a median of 12.5 years. Table 3 and the histogram (Figure 1) provide a full descriptive summary.
Table 1. Academic Position
Table 2. Highest Earned Degree
Table 3. Years in Educational Technology
| Statistic | Years in Field |
|---|---|
| Mean | 15.94 |
| Median | 12.50 |
The qualitative data on academic prestige were analyzed from the pool of 72 responses. Fifty-nine (n = 59) journals were identified by survey participants as ranking among the “Top 5” journals in terms of academic prestige. The remaining data, covering the factors leading to publication venue selection, perceived prestige, and visibility, were analyzed using basic descriptive statistics (e.g., M, SD, %).
The two questions we addressed through this work were:
- Why do educational technologists choose particular publication venues?
- How prestigious and visible are different peer-reviewed educational technology journals?
In the following sections, we first summarize and analyze the specific criteria academics use to support publication decisions. Then, we look at the perceived prestige and comparative visibility, as well as the major criteria used to assess journals.
First, we wanted to understand which factors influenced practicing educational technology researchers’ decisions of where to publish. We included 11 potential factors that were rated using a five-point scale from zero (Unimportant) to four (Maximal Importance). Overall, these researchers felt it was most important to find a good “fit” with the manuscript, followed by journal ranking and research accessibility (see Table 4).
Table 4. Importance of Factors in Selecting Publication Venue
| Decision Factor | M (Current) | SD (Current) | Importance (Current) | M (2012) | SD (2012) | Shift in Rank |
|---|---|---|---|---|---|---|
| Frequency of Publication | 1.97 | 0.93 | Minimal | 2.71 | 1.11 | ↑2 |
| Editorial Review Board | 1.83 | 0.87 | Minimal | 2.85 | 1.16 | ↓1 |
The perceived prestige of publications in the domain of educational technology was evaluated using a pair of metrics. First, each participant was asked to list the Top 5 journals in the field from their perspective. Journals listed by name were first counted across all five ranks to get a “mentions” count. The journals listed then had each of their mentions weighted by rank with the “top” journal receiving a weight of five (5), the second highest ranked a four (4), and so on down to a weight of one. Table 5 presents the results ordered by the weighted score.
Table 5. Perceived Prestige by Personal Ranking
| Journal | Mentions | Weighted Score |
|---|---|---|
| Educational Technology Research and Development (ETRD) | 50 | 199 |
| British Journal of Educational Technology (BJET) | 37 | 131 |
| Computers & Education (C&E) | 31 | 131 |
| The Internet and Higher Education | 12 | 33 |
| Journal of Research on Technology in Education (JRTE) | 10 | 26 |
| Journal of Applied Instructional Design (JAID) | 8 | 24 |
| Educational Technology & Society | 7 | 18 |
| Computers in Human Behavior | 6 | 16 |
| Journal of Computing in Higher Education | 6 | 13 |
| Journal of Technology and Teacher Education (JTATE) | 3 | 11 |
| The International Review of Research in Open and Distributed Learning | 5 | 10 |
| International Journal of Educational Technology in Higher Education | 4 | 7 |
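The mentions count and rank-weighted score described above can be sketched as follows; the tally structure and journal abbreviations in the example are our assumptions about how the free-text rankings might be processed.

```python
from collections import Counter

def tally_top5(rankings):
    """rankings: one list per respondent, index 0 = their 'top' journal.
    Returns {journal: (mentions, weighted_score)}, weighting rank 1 at
    five points down to rank 5 at one point, as described above."""
    mentions, weighted = Counter(), Counter()
    for ranks in rankings:
        for position, journal in enumerate(ranks[:5]):
            mentions[journal] += 1            # counted across all five ranks
            weighted[journal] += 5 - position  # rank 1 -> 5 ... rank 5 -> 1
    return {j: (mentions[j], weighted[j]) for j in mentions}

# Illustrative data: two respondents' (truncated) Top 5 lists.
results = tally_top5([["ETRD", "BJET", "C&E"], ["ETRD", "C&E"]])
print(results["ETRD"])  # (2, 10): two mentions, both at rank 1
```

Sorting the resulting dictionary by weighted score (descending) reproduces the ordering used in Table 5.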
The second prestige measure came from the pre-generated list of field-specific journals. Each journal’s ratings on the prestige scale described earlier were averaged, and journals were then rank ordered by mean score. Table 6 presents the results. Where a journal was also ranked in Ritzhaupt et al. (2012), the 2012 ranking is provided in the table. Differences between the two lists arise because Table 5 reflects what scholars recalled without prompting, whereas Table 6 reflects ratings of a pre-determined list of journals. Notably, however, the top three journals in perceived prestige remained the same.
Table 6. Journals by Perceived Prestige
| Journal Title | Mean | SD | 2012 Ranking |
|---|---|---|---|
| Educational Technology Research and Development (ETRD) | 4.50 | 1.93 | 1 |
| British Journal of Educational Technology (BJET) | 4.32 | 1.89 | 2 |
| Computers & Education (C&E) | 3.97 | 2.13 | 3 |
| The Internet and Higher Education | 3.18 | 2.25 | --- |
| Journal of Computing in Higher Education | 3.00 | 1.88 | 7 |
| IEEE Transactions on Learning Technologies | 2.81 | 2.11 | --- |
| American Journal of Distance Education | 2.74 | 1.65 | 5 |
| The International Review of Research in Open and Distributed Learning | 2.74 | 2.11 | --- |
| Computers in Human Behavior | 2.68 | 2.28 | --- |
| Journal of Computer Assisted Learning | 2.65 | 1.97 | --- |
| Journal of Research on Technology in Education (JRTE) | 2.65 | 2.27 | 6 |
| Australasian Journal of Educational Technology | 2.63 | 1.73 | --- |
| Educational Technology & Society | 2.56 | 1.93 | 9 |
| International Journal of Educational Technology in Higher Education | 2.54 | 2.23 | --- |
| Association of the Advancement of Computing in Education | 2.40 | 1.73 | --- |
| International Journal of Computer-Supported Collaborative Learning | 2.36 | 2.06 | --- |
| Computer Assisted Language Learning | 2.28 | 1.96 | --- |
| Journal of Educational Computing Research | 2.21 | 2.03 | --- |
Journal visibility was computed as a reverse percentage: the share of respondents who did not rate a given journal as irrelevant/unknown. For example, each respondent who marks irrelevant/unknown for Computers & Education lowers that journal’s overall visibility score. Table 7 shows the upper quartile of results. Again, for journals ranked in the 2012 article by Ritzhaupt et al., the 2012 ranking is provided for comparison. While some shifting occurred in rank, the most visible journals overall remained the same. Additional influences on these ranks are considered later in the discussion.
Table 7. Journal Visibility
| Journal | Visibility (%) | 2012 Ranking |
|---|---|---|
| British Journal of Educational Technology (BJET) | 90.28 | 2 |
| Educational Technology Research and Development (ETRD) | 88.89 | 1 |
| Computers & Education | 86.11 | 3 |
| American Journal of Distance Education | 81.94 | --- |
| Australasian Journal of Educational Technology (AJET) | 80.56 | 6 |
| Journal of Computing in Higher Education | 79.17 | 5 |
| Association of the Advancement of Computing in Education Journal | 76.39 | 9 |
| The Internet and Higher Education | 73.61 | --- |
| Educational Technology & Society | 72.22 | --- |
| Computers in Human Behavior | 70.83 | --- |
| IEEE Transactions on Learning Technologies | 70.83 | --- |
| International Journal of Technology Enhanced Learning | 70.83 | --- |
| The International Review of Research in Open and Distributed Learning | 70.83 | --- |
| Turkish Online Journal of Distance Education | 70.83 | --- |
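The reverse-percentage computation behind the visibility scores can be sketched as follows; the example uses the study’s 72-respondent sample and assumes (illustratively) that 65 of them rated a given journal.

```python
def visibility_pct(ratings, unknown=0):
    """Percentage of respondents who rated the journal on the prestige
    scale, i.e., 100 minus the share who chose Irrelevant/Unknown (0)."""
    rated = sum(1 for r in ratings if r != unknown)
    return 100 * rated / len(ratings)

# Illustrative: 7 of 72 respondents marked a journal Irrelevant/Unknown,
# the remaining 65 gave some prestige rating (here all coded 4 for brevity).
ratings = [0] * 7 + [4] * 65
print(round(visibility_pct(ratings), 2))  # 90.28
```

Under these assumed counts, the result matches the 90.28% reported for the most visible journal in Table 7, i.e., a journal unknown to only seven of the 72 respondents.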
Given the importance of impact factor in the results of this study, examining various impact metrics can be useful when determining both the prestige and visibility of a journal. As such, the impact factor, five-year impact factor, and CiteScore were collected for the journals that participants identified as most prestigious. We present the list in the order of prestige identified in Table 6. Each score was collected by hand from the journal’s website in February 2022. Not all journals presented this information on their site; missing metrics are listed as not available (NA). Table 8 summarizes the information.
Table 8. Top Journal Impact Metrics
| Journal Title | Impact Factor (IF) | Five-Year IF | CiteScore |
|---|---|---|---|
| Educational Technology Research and Development (ETRD) | 5.580 | 5.613 | 5.4 |
| British Journal of Educational Technology (BJET) | 2.588 | NA | 9.6 |
| Computers & Education (C&E) | 13.71 | NA | 19.8 |
| The Internet and Higher Education | 8.591 | NA | NA |
| Journal of Computing in Higher Education | 4.045 | 4.748 | NA |
| IEEE Transactions on Learning Technologies | 4.433 | NA | 7.4 |
| American Journal of Distance Education | NA | NA | 4.7 |
| The International Review of Research in Open and Distributed Learning | 0.734 | NA | NA |
| Computers in Human Behavior | 8.597 | NA | 14.9 |
| Journal of Computer Assisted Learning | 3.761 | NA | NA |
| Journal of Research on Technology in Education (JRTE) | 3.281 | 3.473 | 4.6 |
| Australasian Journal of Educational Technology | 3.067 | NA | 5.5 |
| Educational Technology and Society | 2.633 | 4.358 | NA |
| International Journal of Educational Technology in Higher Education | 7.611 | 7.826 | 11.8 |
| Association of the Advancement of Computing in Education Journal | NA | NA | NA |
| International Journal of Computer-Supported Collaborative Learning | 5.611 | 5.685 | NA |
| Computer Assisted Language Learning | 5.964 | 5.937 | 8.4 |
| Journal of Educational Computing Research | 4.345 | 3.786 | 7.2 |
Our first research question asked why educational technologists choose particular publication venues, that is, the relative importance of the factors leading to venue selection. Knight and Steinbach (2008) identified “five major considerations” when selecting a journal. The likelihood of manuscript acceptance (i.e., “fit”) should be the primary factor (Knight & Steinbach, 2008, p. 62). The second most important factor is journal reputation or prestige, an amalgamation of age, circulation, acceptance rate, affiliation, citation generation, editorial staff, and impact (Knight & Steinbach, 2008). Relatedly, journal visibility and publication impact form the third factor. The last two considerations concern how long publication takes and potential issues regarding ethics or personal philosophies. Each of these factors was considered both here and in the 2012 article. Our method of data collection does not allow us to state definitively how strongly each of these factors figured in respondents’ decision-making. However, the perceived prestige rankings derived from the cued-recall questions on personal journal rankings (see Table 5) may provide some insight into the value of the journals.
This aligns with findings from both the present study and previous research (Ritzhaupt et al., 2012): participants highlighted journal accessibility as a key factor in determining where to seek publication. One participant stated the issue plainly:
We are lacking a well-indexed and open-access prestigious journal. I cannot [sic] recommend journals that are not openly accessible across the globe. I realize there is an ethical debate regarding tenure, promotion, workload, and institutional funding. I feel that academics have a responsibility to help change the model and party line so that there are no barriers to publishing in open-access journals.
Accessibility can take multiple forms, but “open” accessibility that keeps science moving freely appears to be a topic of serious, ongoing consideration across domains (Berkowitz & Delacour, 2020; Matthias et al., 2019; Willinsky, 2018).
In terms of perceived prestige, our results show that the placement of the top journals remains consistent with the work of Ritzhaupt et al. (2012), namely: (a) Educational Technology Research and Development, (b) British Journal of Educational Technology, and (c) Computers & Education. While other academic journals in educational technology shifted in perceived prestige, these three held their positions. Interestingly, TechTrends, another official journal of the Association for Educational Communications and Technology (AECT), moved to the first position in perceived visibility according to our survey respondents. As mentioned in the methods section, this result may reflect sampling bias, since respondents were mostly recruited from AECT groups and listservs; still, it indicates strong visibility for this journal.
While understanding perceived prestige and visibility can help researchers focus on the best venues for publication, focused examination of research networks can also inform publication selection. Journal-level analyses using bibliometric and network-analytic methodologies can illustrate a journal’s impact and provide insights when researchers are deciding on outlets for publication (Mering, 2017). Examples of major educational technology journals reviewed in this way include the Australasian Journal of Educational Technology (AJET; Bond & Buntins, 2018), Computers in Human Behavior (CHB; Chen et al., 2021; Vošner et al., 2016), Computers & Education (C&E; Chen et al., 2019), British Journal of Educational Technology (BJET; Chen et al., 2020), and the Journal of Research on Technology in Education (JRTE; Wilson, 2022). These bibliometric studies help map the literature of educational technology in a manner that is useful when determining the impact and fit of journals.
Selecting an appropriate journal in which to publish can be of paramount concern for emerging scholars in the field of educational technology. Journal prestige and visibility can influence tenure and promotion decisions, along with the overall assessment of a scholar’s academic performance and productivity (Frandsen, 2019; Hannafin, 1991; Kurt, 2018). While we do not claim the results of this small snapshot are conclusive for the domain of educational technology, they are consistent with prior research (Ritzhaupt et al., 2012). This examination continues the conversation among scholars in our community about prestige and visibility as relevant factors in choosing where to place our work. Considerations beyond mere impact factors should be used in judging the merits of a publication outlet (West & Rich, 2012).
The results of this research should be considered in light of several limitations. While the survey was widely disseminated via social media, research associations, and networking, over 94% of respondents indicated membership in the Association for Educational Communications and Technology (AECT). As the study included AECT-affiliated journals (i.e., ETRD and TechTrends), this could have inflated the perceived prestige and visibility of those journals in the resulting rankings. Similarly, fewer than 14% of respondents resided outside the United States, which could have affected the perception of journals originating outside the U.S. on both scales.
While the survey was intentionally designed to capture information on visibility, prestige, and the importance of selection factors, the results cannot definitively verify the nature of the responses. Clarity in the response process appeared to vary across participants: many entered associations or “not sure” into the personal rankings of field journals. Moreover, we cannot confirm that all participants knew conclusively which journal they were rating. For example, the American Journal of Distance Education could easily be confused with Distance Education and vice versa. The results may have been affected accordingly.
Finally, human error is possible in any study. The journals selected for consideration were likely the most highly rated in the field, so some journals could have been left out of consideration; however, this is mitigated by the open-ended, self-report items within the survey.
Al-Khatib, A. (2016). Protecting authors from predatory journals and publishers. Publishing Research Quarterly, 32, 281–285. https://doi.org/10.1007/s12109-016-9474-3
Berkowitz, H., & Delacour, H. (2020). Sustainable academia: Open, engaged, and slow science. M@n@gement, 23(1), 1–3. https://doi.org/10.37725/mgmt.v23.4474
Billings, C., Nielsen, P. L., Snyder, A., Sorensen, A., & West, R. E. (2012). Journal of Research on Technology in Education, 2001–2010. Educational Technology, 37–41. https://www.jstor.org/stable/44430057
Bodily, R., Leary, H., & West, R. E. (2019). Research trends in instructional design and technology journals. British Journal of Educational Technology, 50(1), 64–79. https://doi.org/10.1111/bjet.12712
Bond, M., & Buntins, K. (2018). An analysis of the Australasian Journal of Educational Technology 2013–2017. Australasian Journal of Educational Technology, 34(4). https://doi.org/10.14742/ajet.4359
Cabells. (2022). Journalytics. Retrieved December 2022, from https://www2.cabells.com/journalytics
Chen, X., Yu, G., Cheng, G., & Hao, T. (2019). Research topics, author profiles, and collaboration networks in the top-ranked journal on educational technology over the past 40 years: A bibliometric analysis. Journal of Computers in Education, 6(4), 563–585. https://doi.org/10.1007/s40692-019-00149-1
Chen, X., Zou, D., & Xie, H. (2020). Fifty years of British Journal of Educational Technology: A topic modeling based bibliometric perspective. British Journal of Educational Technology, 51(3), 692–708. https://doi.org/10.1111/bjet.12907
Chen, X., Zou, D., Xie, H., & Cheng, G. (2021). A topic-based bibliometric review of computers in human behavior: Contributors, collaborations, and research topics. Sustainability, 13(9), 4859. https://doi.org/10.3390/su13094859
Frandsen, T. F. (2019). Why do researchers decide to publish in questionable journals? A review of the literature. Learned Publishing, 32(1), 57–62. https://doi.org/10.1002/leap.1214
Grudniewicz, A., Moher, D., Cobey, K. D., Bryson, G. L., Cukier, S., Allen, K., Ardern, C., Balcom, L., Barros, T., Berger, M., Ciro, J.B., Cugusi, L., Donaldson, M.R., Egger, M., Graham, I.D., Hodgkinson, M., Khan, K.M., Mabizela, M., Manca, A.,... Lalu, M. M. (2019). Predatory journals: no definition, no defence. Nature, 576(7786), 210–212.
Hannafin, K. M. (1991). An analysis of the scholarly productivity of instructional technology faculty. Educational Technology Research and Development, 39(2), 39–42.
Harris, J., Foulger, T. S., Huijser, H., & Phillips, M. (2019). Goldilocks and journal publication: Finding a fit that’s “just right”. Australasian Journal of Educational Technology, 35(4).
Knight, L. V., & Steinbach, T. A. (2008). Selecting an appropriate publication outlet: A comprehensive model of journal selection criteria for researchers in a broad range of academic disciplines. International Journal of Doctoral Studies, 3, 59–79. https://doi.org/10.28945/51
Kurt, S. (2018). Why do authors publish in predatory journals? Learned Publishing, 31(2), 141–147. https://doi.org/10.1002/leap.1150
Matthias, L., Jahn, N., & Laakso, M. (2019). The two-way street of open access journal publishing: Flip it and reverse it. Publications, 7(2), 23. https://doi.org/10.3390/publications7020023
McKiernan, E. C., Schimanski, L. A., Muñoz Nieves, C., Matthias, L., Niles, M. T., & Alperin, J. P. (2019). Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations. eLife, 8, e47338. https://doi.org/10.7554/eLife.47338
Mering, M. (2017). Bibliometrics: Understanding author-, article- and journal-level metrics. Serials Review, 43(1), 41–45. https://doi.org/10.1080/00987913.2017.1282288
Mertala, P., Moens, E., & Teräs, M. (2022). Highly cited educational technology journal articles: A descriptive and critical analysis. Learning, Media and Technology, 1–14. https://doi.org/10.1080/17439884.2022.2141253
Önder, Ç., & Erdil, S. E. (2017). Opportunities and opportunism: Publication outlet selection under pressure to increase research productivity. Research Evaluation, 26(2), 66–77. https://doi.org/10.1093/reseval/rvx006
Perkins, R. A., & Lowenthal, P. R. (2016). Open access journals in educational technology: Results of a survey of experienced users. Australasian Journal of Educational Technology, 32(3), 18–37. https://doi.org/10.14742/ajet.2578
Ritzhaupt, A. D. (n.d.). Select list of educational technology journals. https://aritzhaupt.com/resources/ed-tech-journals/
Ritzhaupt, A. D., Sessums, C. D., & Johnson, M. C. (2012). Where should educational technologists publish their research? An examination of peer-reviewed journals within the field of educational technology and factors influencing publication choice. Educational Technology, 52(6), 47. https://www.jstor.org/stable/44430204
Schimanski, L. A., & Alperin, J. P. (2018). The evaluation of scholarship in academic promotion and tenure processes: Past, present, and future. F1000Research, 7. https://doi.org/10.12688/f1000research.16493.1
Stoilescu, D., & McDougall, D. (2010). Starting to publish academic research as a doctoral student. International Journal of Doctoral Studies, 5, 79.
Vošner, H. B., Kokol, P., Bobek, S., Železnik, D., & Završnik, J. (2016). A bibliometric retrospective of the Journal Computers in Human Behavior (1991–2015). Computers in Human Behavior, 65, 46–58. https://doi.org/10.1016/j.chb.2016.08.026
West, R. E., & Rich, P. J. (2012). Rigor, impact and prestige: A proposed framework for evaluating scholarly publications. Innovative Higher Education, 37, 359–371. https://doi.org/10.1007/s10755-012-9214-3
Wijewickrema, M., & Petras, V. (2017). Journal selection criteria in an open access environment: A comparison between the medicine and social sciences. Learned Publishing, 30(4), 289–300. https://doi.org/10.1002/leap.1113
Willinsky, J. (2018). Scholarly associations and the economic viability of open access publishing. Health Sciences: An OJS Theme Demo, 1(2). https://demo.publicknowledgeproject.org/ojs3/demo/index.php/health-sciences/article/view/693
Wilson, M. L. (2022). Topics, author profiles, and collaboration networks in the Journal of Research on Technology in Education: A bibliometric analysis of 20 years of research. Journal of Research on Technology in Education, 1–23. https://doi.org/10.1080/15391523.2022.2134236