Seduced by Open Access capitalism? When the cure is worse than the disease


Ulrik Wagner
Department of Nutrition, Exercise and Sports (NEXS), University of Copenhagen


Dr. Ulrik Wagner is an Associate Professor at the Department of Nutrition, Exercise and Sports (NEXS) at the University of Copenhagen. He is a member of the research group SoRSE – Sociological Research in Sport and Exercise.

Publishing our research in scientific, peer-reviewed journals is a core task for people engaged in academia. It is the activity that distinguishes universities from other higher education institutions where teaching is the primary mission. Accordingly, in an era characterized by the publish-or-perish paradigm, debating the conditions under which we produce, distribute, and evaluate the quality of one of our core outcomes is of utmost importance (Wagner, 2016).

It is hardly a surprise that our working conditions have changed and differ radically from the ideals once formulated by Humboldt. Without romanticizing the male-dominated past, recent decades have been characterized by struggles to keep and maintain some level of autonomy and research integrity in situations where, to varying degrees, market forces and political interests seek to control and direct us. A central practice that is supposed to ensure this autonomy is the peer review system. As with all systems, flaws and errors are easily detected in peer reviewing processes. Nonetheless, ideally this system allows the research community to establish relevant and reasonable criteria of research quality and to assess, evaluate, and decide what counts as proper research. The ideal is to let those in charge of conducting research administer the criteria for adherence to a certain research standard. This can be seen as testimony to a field’s level of professionalization (Abbott, 1991): professional associations (or communities of scholars) of architects or physicians, for example, define what is required in order to be entitled to call oneself an architect or a physician. Imagining the opposite situation might illustrate why this research autonomy is worth fighting for: It could have serious consequences if politicians were the ones to decide what counts as research and what does not. For example, election campaigns could opportunistically stigmatize certain kinds of research. Or imagine how regulation by market forces would lead to a dull and uniform (main)stream of research attracting the widest audience (consumers), squeezing out the critical niches that are irrelevant as investment objects.


The ecology of journal publishing

Having said this, the current quality assessment system of journal publishing is not without some serious issues (Beverungen, Böhm & Land, 2012). If we turn back time ten to fifteen years, the field could be characterized as an oligopoly. A few for-profit publishing companies such as Springer, Elsevier, Blackwell, Taylor & Francis, and SAGE dominated the market, and many university publishers either struggled to keep momentum or were already shutting down (Harvie, Lightfoot, Lilley & Weir, 2012). This paved the way for a lucrative and very profitable business model, framed and fueled by the omnipresent publish-or-perish paradigm haunting the academic research workplace combined with a precarious and very competitive labour market. Without wishing to sound (too much) like an orthodox 1970s Marxist scholar, the circular system operates like this: Scholars – a labour force often financed by public funds at universities – deliver the raw material (manuscripts) based on their research for the journals (for a detailed introduction see Knoche, 2020). The same scholars voluntarily handle quality assessment and refinement of the raw material through a peer review process (processing). Finally, the same scholars can only access ‘their’ product if their own universities (typically their libraries) are willing to pay the publishers a fee. As such, universities pay twice: funding of labour costs and a fee for access to the end product. The academic labour force is structurally dependent on this access to survive the next steps in the on-going global publish-or-perish battle. This creates the perfect situation for an oligopoly: Publishers administer a product with low or insignificant costs for wages and processing, but simultaneously with a high demand for the end product from the very creators of the raw material.
However, it is worth remembering that some journals are owned by professional academic societies, enabling a share of the surplus to be channeled back to researchers through agreements with the publishers. For instance, a significant source of income for the International Sociology of Sport Association (ISSA) comes from SAGE for the International Review for the Sociology of Sport. Taylor & Francis allocates funds to the European Association for Sport Management (EASM) for publishing the European Sport Management Quarterly (ESMQ). Moreover, publishers can provide marketing services that academic societies might find it hard to deliver themselves.

At least, this was the situation until recently. The widespread call for Open Access journals has been one avenue for counteracting the dominance of the oligopoly. The internet is the technology that enabled open access (Laakso et al., 2011). Some claim that open access would, ideally, bring research closer to those producing it by making publishing houses less influential, or perhaps even redundant. And even better: open access has an immense democratic potential, as it makes research papers available to everyone rather than only to those sections of academia with budgets that can pay for journal access. Open access allows poorer universities, funders, lay people, practitioners, and anyone with an interest in research and new knowledge to access research papers. It also enables us to meet the policy recommendations stipulating open access. What’s not to like?

What we in the field of sport and exercise studies have witnessed over the last five years is a dramatic and aggressive increase in a variant of so-called open access publications from platforms such as Frontiers and the MDPI (Multidisciplinary Digital Publishing Institute). The concern and argument raised in this essay is that rather than representing a substantial and democratic alternative to the oligopoly, these publishing channels represent a cure that is even worse than the disease. This essay aims to encourage critical reflection and foster a debate around our publishing patterns. This critique focusses on MDPI but the Frontiers journals very probably suffer from similar flaws.

MDPI – Money Develops Problematic Incentives

The MDPI is registered in Switzerland and publishes nearly 400 journals[1]. Since its beginnings in 1996, the number of journal articles published by MDPI has grown rapidly. Over the years, MDPI’s practices and standards have been heavily debated and considered controversial. However, detailed discussion of this is beyond the scope of this essay[2]. A trademark of MDPI is open access and a fast peer review process. Potential authors are probably well aware of this, as this distinguishes MDPI journals from traditional journals, where the peer review process can be drawn out. What is also interesting is that the prospective author accepts – in advance – to pay an ‘Open Access and Article Processing Charge’ (APC) if the manuscript is accepted. In traditional journals this is an additional option you can buy, not a prerequisite. For instance, for the MDPI journal International Journal of Environmental Research and Public Health the APC is 2,500 Swiss Francs (c. €2,535), to increase to 2,700 CHF (c. €2,738) from 1 July 2023. The APC for MDPI journals with a lower impact factor, such as Social Sciences, will ‘only’ be 1,400 CHF (c. €1,420). Nonetheless, these are significant sums of money, and with the rapid growth in the number of articles published by MDPI this practice ensures a considerable income.

Currently, we are seeing a considerable number of scholars, for example from the field of sociology of sport or sport management, publishing in MDPI journals. Here it is very important to note that I am not proposing that published articles are bad per se just because they appear in MDPI journals. What I do propose is that scholars from the two research fields I personally know best are also engaged here, and that they deliver the raw material for maintaining MDPI productivity. Sport-related topics frequently appear in MDPI journals such as International Journal of Environmental Research and Public Health, Social Sciences, Societies, and Sustainability. Rapid publication is stated on the front cover of the journals as promotional branding. For instance, Sustainability writes that ‘a first decision is provided to authors approximately 17.7 days after submission’ while simultaneously claiming papers undergo peer review. Impact factor is used as an indicator of legitimacy (my personal interpretation), e.g., 3.889 for Sustainability, as indexed in PubMed, while MDPI journals like the International Journal of Environmental Research and Public Health have currently been delisted by Clarivate/Web of Science (the company administering the annual impact factor)[3]. The APC is higher in these two journals compared to journals lacking these tokens of legitimacy, and thus you pay more for publishing in ‘prestigious’ journals equipped with these symbols.

All of you who regularly engage in anonymously reviewing manuscripts know that this is a time-consuming task, and also that it is something that needs to be done thoroughly (next time it could be your paper!), even though it is often considered voluntary work. Therefore, receiving a decision less than three weeks after a submission is fast, very fast, and perhaps too fast? A search for ‘sport’ in Sustainability results in 954 articles (of course, not all of these are dedicated solely to sport). I have tried to trace their track history: It normally takes approximately one month to forty days from date of receipt (first submission) to the date the revised version is sent back to MDPI. Standard procedure is then that the manuscript is accepted between one and four days later, and published another few days later. The most interesting detail in this is: There never seems to be a second round of reviewing! My best guess is that many of the readers of this essay have experienced that papers submitted to respected journals such as International Review for the Sociology of Sport or European Sport Management Quarterly are often subjected to a second or third round of reviewing before being assessed as meeting the quality criteria of the journal and being accepted for publication. Apparently, no manuscript submitted to MDPI journals ever needs more than one round of review.


Why is this the case with the MDPI journals? Well, it is not unreasonable to suspect that, as the author has already agreed to pay the APC, the publisher has no incentive to prolong the process between receiving the raw material and delivering the final outcome. This is market logic, where rapid production (i.e., bringing the product to the market) rather than rigorous peer review is the dominant quality criterion. One suspects that impact factor and proclaimed (pseudo) peer review serve the purpose of legitimization, while the ongoing publish-or-perish paradigm fosters a significant and steadily increasing influx of raw material (manuscripts) from the labour force. The consequences for our academic societies of embracing this version of open access capitalism are alarming, and may be fatal. Here are a few examples:

I have for some years had the privilege of being part of the ESMQ editorial board. Thus, I know what it takes to review manuscripts, and also how much effort it requires for editors and co-editors to find two or three well-qualified reviewers. I have also had the (dubious) honour of having my two most recent submissions to ESMQ rejected. The first one, submitted some years ago, went into a second round of review after one positive and one rather less enthusiastic reviewer response. For the second round of peer review, a third reviewer was invited and s/he, unfortunately, also saw some problems in the paper. Accordingly, the paper was rejected after the second round. Although I did not agree with the critique (and my paper later found its way into another journal), the moral is clear: The critical evaluation by peer reviewers (i.e., scholars in our community) of a submitted manuscript is important, and of course editors should take their objections into consideration prior to a decision. My question is: Would a rejection have been a possibility in a journal where I had accepted, in advance, to pay a significant amount of money to see my paper published? More recently, I submitted a paper that was rejected after a first round of review on the basis of the critical concerns of two highly qualified reviewers. After 24 hours of moderate disappointment, I had to agree: They were absolutely right. My manuscript was not really good enough or suitable for this journal. My question is: Would one receive a similarly qualified review and rejection in a journal branding itself with rapid production? My point is that rigorous reviews and rejections are prerequisites for research quality – rapid production and superficial reviewing are not.

So, how do the MDPI journals really work? Well, my guess is that they work in various ways. What worries me is that money and financial discounts have become part of the publishing agenda, rather than a commitment founded on substantial academic anchoring. I recently received an e-mail announcement from the MDPI journal Healthcare. Aside from mentioning its impact factor (3.160), rapid production (19.06 days after submission), and the 2,000 CHF APC, it also emphasized that ‘you may be entitled to a discount if you have previously received a discount code from Editorial Office’. Another example is a LinkedIn post uploaded by a professor acting as a guest editor for an MDPI journal: ‘For contributions submitted until [date] the journal offers a discount, if indicating “25 percent special discount invited by the Guest Editor” in the cover letter’. Hence, discounts promoted by a scholar via social media are one way of attracting potential contributors. A good friend of mine was invited to review a manuscript submitted to the International Journal of Environmental Research and Public Health. The invitation stated that ‘we would appreciate receiving your comments within 10 days’; more time could be given if needed, but ‘for thorough and timely review reports we provide discount vouchers for Article Processing Charges’. Thus, conducting a rapid and timely (and then perhaps not so thorough) review is rewarded with a discount on the APC for the next manuscript the reviewer submits to an MDPI journal. This – one could argue – creates a very intimate relationship based on financial rationales (saving money), rather than a relation based on academic and scholarly affiliation. As a result, our research outputs, i.e., publications and reviews, are traded as commodities.

And what might be the financial effect on academic communities such as EASM or ISSA? While previously a profitable endeavour, publishing houses are now under pressure from perhaps even more profitable business models. A consequence of this is that traditional publishers may argue that they cannot continue to disburse finances to our academic societies. Furthermore, in terms of quality and rigour upheld by solid reviewing, the push for fast production risks undermining what we de facto understand by reviewing, and consequently our ability to maintain our professional integrity. We might even imagine a near future where scholars reason the other way around: If APC costs can be integrated into external grant applications, then the temptation to choose a swift MDPI journal over a less agile (but more rigorous) journal, where you risk multiple and demanding review rounds, is perhaps hard to resist. Paying to get published becomes the standard, primarily, of course, for those who are able to attract substantial external funding, whereas in the old system university libraries paid for access to scientific journals regardless of whether scholars had research funding or not. Therefore, not only do these open access journals represent a financial challenge, they also pose a potential threat to our efforts to benchmark standards of quality for research. The worst-case scenario is that this variant of open access becomes a cure that is even worse than the disease it originally attempted to target.


What now?

The purpose of this critical essay is not to point at individual colleagues or to claim generally that all MDPI contributions lack quality. My purpose is basically to draw attention to our working conditions (MacLean, 2014). Specifically, I think there are multiple issues with regard to publishing that are cause for concern. I do not propose that returning to the good old days of the oligopoly is a sustainable path. Instead, I suggest developing sustainable alternatives (see also Beverungen, Böhm & Land, 2012).

Therefore, my first wish is that colleagues share their experiences with these versions of open access publishing. I might be wrong, missing some nuances, or simply misled by conspiracy theories due to my latent quasi-Marxist orientation. Choosing MDPI journals might work well for individual authors, or as a parallel publishing strategy pursued alongside (traditional) high-standard journals. And, for young and not yet tenured scholars who need to operate in a highly competitive environment, I understand the temptation to favour a rapid process and high acceptance rates over rigorous and time-consuming review processes in journals with acceptance rates below 20%. However, I am eager to hear how tenured and established scholars perceive this open access variant in relation to our simultaneous efforts to strengthen existing communities like EASM or ISSA.

Secondly, I believe we need to think in alternatives, including how we can improve existing practices. Multiple journals are now trying to speed up the review process, for instance limiting it to 4-5 weeks rather than 8-10 weeks. Stories about manuscripts left unattended for several months fundamentally undermine the idea of being a serious, quality journal and are detrimental to a journal’s reputation. Furthermore, open access does not need to be driven by for-profit financial incentives (Fuchs & Sandoval, 2013). The journal Ephemera – theory & politics in organization[4] is run by a collective of scholars, and the peer-reviewed journal Scandinavian Sport Studies Forum is part of idrottsforum.org, hosted and supported by Malmö University – and there are many other examples. At the same time, by acting together some universities have successfully pushed major publishers, such as Elsevier, to make agreements enabling open access, as is the case in Denmark[5]. It is perhaps also worth re-considering whether university presses might not be more natural hosts for academic journals – while being well aware that such presses must also adhere to a for-profit principle.

Finally, universities can – if they share concerns similar to my own with regard to some versions of open access – simply decide that publications in these outlets do not count, for instance in applications for promotion. The signal would be that such (pseudo) review processes are incompatible with ideals of rigor and high research quality. If scholars then, for various reasons, still decide to opt for such journals it will be their own free choice but not the policy of the university as such.

Copyright © Ulrik Wagner 2023


References

Abbott, A. (1991). The order of professionalization: An empirical analysis. Work and Occupations, 18, 355-384.
Beverungen, A., Böhm, S. & Land, C. (2012). The poverty of journal publishing. Organization, 19, 929-938.
Fuchs, C. & Sandoval, M. (2013). The diamond model of open access publishing: Why policy makers, scholars, universities, libraries, labour unions and the publishing world need to take non-commercial, non-profit open access serious. tripleC, 11, 428-443.
Harvie, D., Lightfoot, G., Lilley, S. & Weir, K. (2012). What are we to do with feral publishers? Organization, 19, 905-914.
Knoche, M. (2020). Science communication and open access: The critique of the political economy of capitalist academic publishers as ideology critique. tripleC, 18, 508-534.
Laakso, M., Welling, P., Bukvova, H., Nyman, L., Björk, B-C. & Hedlund, T. (2011). The development of open access journal publishing from 1993 to 2009. PLoS ONE, 6(6), e20961. https://doi.org/10.1371/journal.pone.0020961
MacLean, M. (2014). (Re)Occupying a cultural commons: Reclaiming the labour process in critical sport studies. Sport in Society, 17, 1248-1265.
Wagner, U. (2016). The publishing game: The dubious mission of evaluating research and measuring performance in a cross-disciplinary field. Scandinavian Sport Studies Forum, 7, 63-88.

Notes

[1] See MDPI – Publisher of Open Access Journals
[2] A (critical) look at Wikipedia should ring enough alarm bells regarding MDPI to demand anyone’s interest prior to submitting a manuscript: MDPI – Wikipedia
[3] MDPI mega-journal delisted by Clarivate / Web of Science – M.A. HANSON RESEARCH (weebly.com)
[4] Home | Ephemera journal (ephemerajournal.org)
[5] Elsevier aftale 2021-24 | For professionelle (kb.dk)

