Much has been said about how expensive academic journals are. Large companies like Elsevier, Sage, Springer Nature, Taylor & Francis, and Wiley publish most of the major journals, and their shareholders pocket much of the “rent” they receive thanks to academics’ labor.

Image: CC BY-SA 4.0 Fluffybuns.
There are alternatives. One of them is based on Wikipedia, whose process for vetting information is more transparent than that of most journals. The back-and-forth between authors and other Wikipedia volunteers that results in changes to Wikipedia is right there in the talk pages, visible as it happens, and anyone can chime in. Contrast this with academic journals, which are largely a closed shop.
To be fair, while the “shops” may be closed, they do have more windows than they used to. Many journals have come out from behind the paywalls, and now practice more accountability, such as by indicating which editor handled each article, and by having a policy on editors publishing in their own journal. To their credit, the Association for Psychological Science journals, for example, have long had a policy that when an editor or associate editor submits to their own journal, the review process for their article is managed by an external guest editor to avoid conflicts of interest. When I was an associate editor several years ago at one APS journal (AMPPS), this is what we did. I recently realized that not all APS editors are aware of their own policy, however, and that sort of forgetting is another example of why keeping the windows open, so that we can see what is happening inside, is important.

As part of the open windows principle, we should also expect journals to produce evidence that they effectively evaluate submissions for whether they are scientifically sound. Now, if asked how we can be confident that they are publishing quality scholarship, most journal editors would point to peer review. When asked to produce examples, however, they’d have to say something like “Can’t do that! Peer review reports are confidential.”
This “you’ll just have to trust us” type of situation is ironic for a class of people who have long held skepticism to be critical to what they do. And for me as an acculturated academic, I confess it almost feels like a betrayal to state this as plainly as I have. I imagine colleagues trying to push aside the point, with responses like “Alex, you know we try hard to get good peer reviewers; besides, in the end, science yields things that work, so your point is misleading.”

I actually agree that science works on average, but often readers need to know whether there is much reason to have confidence in particular papers. Fortunately not all editors are so defensive that they cannot acknowledge this point. It took time, but by about a decade or so ago, a bunch of journal editors had freed the peer review reports from the confines of their password-protected journal management systems, allowing anyone to read them. Finally, readers had direct evidence of how well a journal is actually vetting its articles. Just as importantly, readers no longer had to rely on the overall journal reputation to make a guess about the process undergone by an individual article – they could actually see the peer review reports for an article they were interested in.
While the processes happening inside journals had to be dragged into the open, Wikipedia and its associated projects have always had openness baked in.

One project associated with Wikipedia is the WikiJournal of Science. This is a proper scholarly journal, one indexed by mainstream publication databases such as the Scopus database maintained by Elsevier. But unlike a conventional journal, most of the peer review process at the WikiJournal of Science happens in the open from the beginning. It’s all in the “Discuss” page that sits alongside each article.
In another convergence with mainstream journals, four years ago the prestigious eLife journal announced that they would only review manuscripts that had already been published elsewhere as a preprint, as part of their “long-term plan to create a system of curation around preprints that replaces journal titles as the primary indicator of a paper’s perceived quality and impact”. This has always been the preferred route for the WikiJournal of Science – manuscripts ideally are submitted by linking to a publicly-available preprint.

I’ve been an associate editor for the WikiJournal of Science for a year or so. One manuscript I handled reported a study suggesting that geckos spontaneously “play” by running in running wheels. As the editor, I was pleased to have the opportunity to usher in new knowledge about these gravity-defying reptiles.

One of my first jobs was to email several experts on animal play to ask them to review the manuscript, which the author had posted as a preprint on WikiJournal Preprints. Two agreed, and after receiving the peer reviews, I posted them on the preprint’s Discuss page where, if anyone else were moved to do so, they could also comment. The author responded to the reviewers’ comments, and those responses also can be seen on the Discuss page. Much of the reviewing process, then, works like a conventional journal, just more transparently and able to appear in real time.
When I edited that manuscript, I had no scientific knowledge of animal play (moreover, I had consistently resisted our dog’s offers to give me real-world experience).

It would have been nice if we had had a more knowledgeable editor for the gecko manuscript, but we’re currently spread pretty thin in the editorial department. That’s one reason for this post (apply to be an editor! You don’t need to know anything about geckos!).
As the 🦎 example illustrates, like a conventional academic journal, we publish original research at the WikiJournal of Science. But the most common use of the journal is for academic peer review of articles that are intended for Wikipedia itself, and these typically don’t include original research. Before I joined the journal as an editor, for example, I saw that the Wikipedia article for “multiple object tracking” was a bit spotty in its coverage. Unsurprising, of course, as it’s quite an obscure topic. But because I had just written a short book on object tracking, I considered myself well-placed to write a more comprehensive Wikipedia entry. The eventual article I wrote was based on my book, together with others’ publications, so it didn’t count as the type of original research that is prohibited by Wikipedia.

I submitted my draft Wikipedia article to the WikiJournal, and it eventually passed peer review. As a result, the editor replaced the existing Wikipedia entry with my article. This was quite satisfying – given how widely Wikipedia is used, my contributions to this obscure topic are probably now much more influential than if they had remained confined to academic journals and my book.

A nice aspect of the WikiJournal of Science is that part of the revision process occurs almost instantaneously, thanks to its wiki infrastructure. As I read through a submission, I typically make small edits on the preprint itself to improve the language, just as many Wikipedians do when they come across a Wikipedia entry they are interested in. The reviewers of the manuscript are able to do the same thing. The author is not obliged to keep these edits, of course; they can revert them and explain why in their response letter.
This really should be seen as basic functionality, as it is similar to the nearly universally-used Track Changes in Word or Google Docs. But despite most of us collaborating on documents in that way for decades now, most academic journals still don’t have this functionality.
Reviewers and editors at traditional journals typically aren’t able to enter the journal’s system and directly make suggestions on the manuscript. Instead, they write their comments in a separate, standalone document or form. This lack of functionality for scholarly communication is one illustration of how little the scientific community has gotten for the billions of dollars that they have been paying to publishers each year (the previous link is for APCs alone; it doesn’t even include the subscriptions payments, and the free peer reviewing that academics do).
The unwieldiness of journals’ systems is not because corporations generally don’t deliver good products or continually improve their service; many do. But academic journals are not part of a functioning economic market. In the dreamworld of a functioning market for scholarly communication, the journal that provides the best service and features would win the most market share. In the world we actually live in, the owners of the journals (who are sometimes the publisher, and sometimes a scientific society) simply wait for submissions from the researchers. They know that researchers will stick with the journals that have the highest impact factors in their field, which then results in those journals maintaining a high impact factor, with little effect of the fees charged or the quality of the services provided.
I think that all of this means that you should support diamond open access journals in general, not just the WikiJournal of Science.

Diamond open access journals are those that are free to read and free to publish in. They typically use open source software (the wiki infrastructure in the case of the WikiJournal of Science, and Open Journal Systems for thousands of other diamond OA journals) hosted by a nonprofit institution, such as a university. The open source software does tend to be clunkier than the big publishers’ systems, which does mean it’s more annoying for the academic editors involved. But the alternative, the tradeoff of letting corporate publishers handle things in return for billions of dollars and a corruption of academic values, is an even worse deal.
But why should you, an individual scholar, have to do something about this? The primary way that scientists in a field come together to get things done (aside from doing science, reviewing, and editing itself) is through scholarly societies. Scientific societies were designed to serve scientists’ interests. They should be leading the way to reducing dependence on corporate publishers and creating diamond OA journals.
But many scientific societies have been captured by their publishers. Here’s how it happened. As part of a contract giving a publisher the right to publish the society’s journal, the publisher provides the society with a payment. Over the years, this payment rose, reflecting the steady increase in subscription and/or APC fees. While the payment is only a small fraction of how much the journal makes (otherwise the publisher wouldn’t have the high profits that they do), it’s a substantial amount of money for a scientific society, and quite a high percentage now that income from in-person conference fees is declining. Societies pay much of their staff salaries off of this, and many hired more staff with this money. For many societies, these staff end up making most of the society’s decisions, or advising the academics who ostensibly make the decisions but offer little resistance. As the staff’s jobs depend on maintaining the society’s revenue, giving up the publishing income is a non-starter. This dynamic has played out even at some of the most respected and active scientific societies, as we recently learned in the case of the American Association for the Advancement of Science.
Within psychology, the Association for Psychological Science (APS) is another example. Six months ago, APS suddenly announced they were starting a new journal, with no evidence of consultation with academics. Indeed, the announcement was strangely light on details of why they were starting a journal and what the vision for it was. So I wasn’t the only one who suspected this was concocted simply to create a new revenue stream.
Yesterday, I did some digging. The publisher used by APS, Sage, maintains a spreadsheet with their list of publication fees (APCs) for the open access journals they publish. Advances in Psychological Science Open is now in that list, just below Advances in Methods and Practices in Psychological Science, formerly APS’ only fully open access journal. The price to publish in the new journal? Twenty-five Benjamins!

That APC (Article Processing Charge) of $2500 is $1500 more than that for APS’ better-established journal (AMPPS).

In short, APS is starting an expensive journal that has little to no buy-in from the community (judging from social media) and hoping that demand for the prestige of the APS brand, combined with the reject-and-refer system developed by PLOS, and perfected by Nature Publishing, will bring the money rolling in.

If you’re a tenured academic, you shouldn’t be editing for journals like that!
I’d better rephrase that. Because admittedly, I myself took an editorial stipend from APS, first at Perspectives on Psychological Science over ten years ago, when some of us started the Registered Replication Report format there, and subsequently when we co-founded the journal Advances in Methods and Practices in Psychological Science.
Here’s my rewrite: if you are a tenured academic, you should be devoting a bunch of your time to cultivating alternatives to the usual money-sucking journal racket.

Over at freejournals.org, we highlight quality diamond OA journals, and we diamond OA editors try to support each other. So here I am, trying to promote this. While not many people read this blog, a lot of people are occasionally forced to read emails from me (simply because I am a more-or-less tenured academic). I have therefore changed my email signature: it now advertises the diamond OA initiatives that I am most involved in.

And now it is time for me to turn to other activities for avoiding the news.

Postscript. Perhaps the biggest challenge facing the WikiJournal of Science is our high liability insurance bill (for things like defamation suits); my colleagues have contacted dozens of insurers but none would give us a lower bill. And that was before Elon Musk started threatening Wikipedia! If you think you can help us, please get in touch.



