PeerJ Exits

Issue 63 · March 2024

Join the conversation on Twitter @Briefer_Yet and LinkedIn.

Did a colleague forward The Brief to you? Sign up via the button below to get your very own copy.

Customer Data Platforms: What? Why? and When?

In a podcast episode of Reboot IT, Dave Coriale, President of Delcor, enlists the help of C&E’s Colleen Scollans to explore customer data platforms (CDPs), why CDPs are gaining traction in the 501c community, and whether (and why) one might be in your future. It is 44 minutes of distilled wisdom (perfect for your next dog walk) – well worth the time of anyone working at a professional association or academic society in any capacity.

Is the Research Literature Overrun with AI-Generated Articles?

Recent investigations into the use of generative AI in writing papers raise alarms, but our review of these investigations suggests the concern may be overblown (at this point, anyway).

Journal Metrics Benchmarking

Join C&E’s 2024 Journal Metrics Benchmarking study to get benchmark data on EIC and other editor honoraria, submission trends, publication turnaround times, and many more critical metrics. This study is conducted only every 2–3 years, so this is your last chance to participate before 2026.

Spring Meetings

Attending the STM Annual Conference in DC (April 24–25)? Set up a meeting with a C&E representative to discuss how we can help your organization better grow revenues and engage your audience.

PeerJ Exits


Taylor & Francis (T&F) announced this month that it has acquired open access (OA) pioneer PeerJ for an undisclosed amount, bringing to an end just over a decade of independent publishing for the OA trailblazer. 

PeerJ was launched in 2013 by cofounders Peter Binfield and Jason Hoyt, following the massive early success of PLOS ONE. Indeed, Binfield was the Publisher of PLOS ONE immediately before launching PeerJ, and it appeared that the initial plan was to replicate the successful megajournal. This was not a bad idea. As the Publisher of PLOS ONE, you are watching the explosive growth of the publication you are working on but have no way to participate in that value creation. The owner of PLOS ONE is, of course, PLOS. As a not-for-profit, PLOS does not issue equity to its employees. So Binfield was helping to create tens of millions of dollars of asset value but keeping none of it (beyond his regular salary and maybe a modest bonus). It made perfect sense for him to seek an alternative corporate structure for a megajournal in which he participated in the upside. It would frankly be hard to be that close to Sand Hill Road in California’s Silicon Valley and not think this way. 

The most obvious venture capital (VC) firm to back the new publishing venture was not on Sand Hill Road, however, but across the Golden Gate in Sebastopol. O’Reilly AlphaTech Ventures (OATV) is the VC wing of tech publisher O’Reilly Media and O’Reilly namesake, Tim O’Reilly. 

We would love to see that initial pitch deck! We have no idea what it said, but can imagine it was something like:

  1. PLOS ONE has hit on a great new publishing model: the “sound science” multidisciplinary megajournal. It is a geyser of money. 
  2. PeerJ is going to replicate the success of PLOS, which we know how to do because we have the Publisher of PLOS ONE as a co-founder.
  3. $$$$$

If we had been on the investment team at OATV and saw a slide with those three points, we would have given this the green light! O’Reilly obviously felt the same and invested $950,000 in PeerJ’s seed funding round.

It turned out, however, that putting the “mega” in megajournal was not as replicable as anticipated. PeerJ’s highest-volume year saw around 7% of the volume of papers published by PLOS ONE at its peak. The formula that worked at PLOS did not work at PeerJ – and to be fair, it also didn’t work at lots of other publishers that tried it (including giants such as Elsevier, whose megajournal Heliyon took more than a decade to gain any momentum). To add to PeerJ’s challenges, lots of other organizations quickly jumped on the megajournal bandwagon. While they were mostly not successful, in aggregate they siphoned off a lot of papers, making it very difficult for anyone to create another megajournal. Eventually, Nature (now Springer Nature) succeeded with Scientific Reports, and MDPI later arrived with the now downward-spiraling International Journal of Environmental Research and Public Health.

Reputation matters in many things, and journals are one of them. PLOS was able to succeed with PLOS ONE only because they had first launched PLOS Biology and PLOS Medicine and had built a reputation as a high-quality, high-impact publisher. This is part of why Springer Nature has succeeded with Scientific Reports, which is situated in the Nature portfolio (and is often referred to by researchers before hiring committees as “Nature Scientific Reports”). There is an aspirational element to publishing in Scientific Reports and PLOS ONE and, critically, a brand halo from the larger portfolio. Undoubtedly Elsevier would have had better early success with Heliyon if they had called it “Cell Heliyon.” PeerJ was a new publisher with no reputation, no high-impact-factor journal, and no brand halo. What they had was a business model. 

To build momentum, PeerJ – believing “price” could be a competitive differentiator from PLOS ONE – started by offering a $299 flat “all you can publish” fee to authors, allowing them to pay once for their first paper and to publish all future papers at no additional fee. When this didn’t catch on as hoped, PeerJ pivoted repeatedly over the years (like many a VC-backed Silicon Valley start-up), both editorially – moving from a “sound science” journal to being more selective – and financially. They abandoned the all-you-can-publish model for a personal lifetime membership model offering authors a limited number of papers per year, then a straightforward article processing charge (APC) model, and more recently, an annual institutional membership model that parallels PLOS’s Community Action Publishing approach. Along the way PeerJ also experimented with a preprint product for conferences that was ultimately shuttered due to competition from bioRxiv, Research Square, and others, combined with an unclear revenue model for preprints more generally.

In response to lackluster submission numbers, PeerJ ultimately abandoned its “one journal to rule them all” megajournal aspirations and launched additional titles in specific subject areas. These attempts to build the portfolio by launching more journals added fewer than 2,000 lifetime articles to the portfolio. The dearth of submissions stands in sharp contrast to explosive growth from rival OA publishers Frontiers, MDPI, and Hindawi over the past decade. This is on the one hand a testament to PeerJ’s management focus on building reputation (after the lessons of the early years) and refusal to join in the growth hacking strategies (e.g., unchecked special issues) pursued by their competitors. On the other hand, it is challenging to grow an OA publisher whose revenue is ultimately tied to published output without attracting a lot of papers. 

The acquisition of PeerJ by T&F is notable in that the acquirer was not Sage. Sage led the Series A investment round in PeerJ in 2014, a move that presumably set Sage up to acquire the OA upstart in the future. The reason for their ultimate demurral is unclear.  

PeerJ will make an incremental addition to the T&F portfolio and serve as an OA imprint for the publisher, joining F1000 in that regard. It also follows closely on the heels of T&F’s purchase of Future Science Group, announced just a few months ago. PeerJ, while relatively small, will nonetheless help bolster T&F’s article output. There is some post-acquisition complexity that T&F will presumably need to manage in order to honor the various lifetime models developed by PeerJ over the years.

T&F, like its peers outside the top four publishing houses, remains in the challenging position of being too large to be easily acquired, but not quite large enough to keep up with the market leaders, who are making sizable investments in portfolio expansion. 

As no numbers have been publicly released regarding the purchase price paid for PeerJ, this will be worth scrutiny the next time T&F offers a financial report. 

A Curie-ous Sale


The other significant scholarly communications acquisition this month was MPS Limited’s purchase of American Journal Experts (AJE) from Springer Nature.

AJE was part of the Research Square portfolio, which was purchased by Springer Nature in December 2022 (following a prior investment in the company). It is surprising that Springer Nature would divest the principal economic engine of a portfolio it acquired just 15 months before, while retaining the Research Square brand and preprint platform.

Even more surprising, the sale includes the new AI-powered scientific writing tool Curie, which Springer Nature launched with fanfare just five months ago.

We speculate that the sale of AJE, a non-core asset for Springer Nature, was made to help bolster the balance sheet in advance of the company’s (yet again) anticipated IPO. The inclusion of Curie is, however, a mystery. As Curie was new, it is unlikely to have much revenue associated with it; hence its divestiture is unlikely to add a great deal to the balance sheet. Yet one would think the new AI tool would signal to potential investors that Springer Nature is at the leading edge of publishing and investing in AI tools and services – surely a good thing?

eLife’s First Year as a “Preprint Review Service”


Although it’s too early to assess the success of eLife’s bold attempt to reshape the scientific journal, the numbers reported after one full year of the “preprint-review service model” raise some cautionary flags for others considering such a move.

As a reminder to those who have lost track of the details of the eLife model (or whose eyes glaze over when they hear “preprint-review service model”), any manuscript submitted to eLife that passes a triage by editors and is sent out for peer review is considered “accepted” into the journal, and authors can then choose either to make revisions in response to the external peer review or to ignore those reviews entirely. When the author feels the paper is ready, they may declare the manuscript, whether revised or not, to be the published “Version of Record.” The question of whether this publishing experiment would scare authors away or result in a gold rush of those seeking a seemingly easier publication route into a journal with a relatively high Journal Impact Factor (currently 7.7, ranking 7th out of 92 in the Journal Citation Reports “Biology” category) has now been answered.

In 2023, the first full year of eLife’s new model, 6,200 manuscripts were submitted, marking a 22% decline from 2022 and 33% fewer than in 2021. eLife also seems to have tightened its criteria for sending articles out for review, with only 27.7% of articles avoiding desk rejection in 2023, down from the 34.8% reported in each of the previous two years. While some of the submission decline can be attributed to growing market competition and the overall post-pandemic contraction of the literature, it is clear that authors did not flock to the new model, possibly out of the same reputational concerns that caused consternation internally within eLife. The higher standards for avoiding desk rejection also meant that there was no enormous increase in the overall acceptance rate, which was 27.7% (all submitted articles sent out for peer review) in 2023, compared with the 23% and 21% accepted after peer review in 2022 and 2021, respectively.

The drop in submissions and the rise in desk rejections are likely to have had a significant negative financial impact on eLife. Although eLife remains free from existential concerns around sustainability, thanks to the deep pockets of its funding agency owners, initial projections that the journal would thrive financially under the new model have proven incorrect. eLife lowered its APC from $3,000 to $2,000, assuming that since payment is made at the point of peer review, more papers would be subject to the fee and revenue would remain steady. But with fewer submissions and a smaller percentage sent out for peer review, the 1,719 reported papers at a $2,000 APC would yield $3.4 million before accounting for waivers and discounts, a drop of at least 37% from 2022’s reported author revenue.
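The revenue arithmetic here can be checked with a quick back-of-the-envelope calculation, a sketch using only the figures reported in the text (waivers and discounts are ignored):

```python
# Back-of-the-envelope check of eLife's 2023 author revenue under the new model.
papers_sent_for_review = 1_719   # manuscripts that triggered the fee in 2023
apc = 2_000                      # USD, lowered from $3,000 under the new model

gross_revenue = papers_sent_for_review * apc
print(f"Gross 2023 author revenue: ${gross_revenue:,}")  # $3,438,000, i.e. ~$3.4M
```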
The eLife model will also potentially wreak havoc on bibliometric research. eLife reports that 1,332 of the 1,719 papers sent out for review were officially “published,” meaning there are 387 papers accepted by the journal in 2023 that are still floating around somewhere. Furthermore, with just over 500 of those manuscripts seeing revised versions, which carry a new DOI, bibliometric databases seem unclear on how to account for multiple versions of the same paper. Web of Science currently lists 1,284 research articles for eLife in 2023, which aligns roughly with the total number of “published” papers, while Dimensions counts 1,700 research papers for 2023 and seems to include revised versions as separate articles.

As the market leader in preprint review services (a dubious distinction perhaps), eLife’s performance raises questions about the research community’s willingness to engage with this approach and preprints in general. Data cited in a recent set of recommendations for accelerating open preprint review notes that, as of 2020, the ratio of preprints to journal articles was only 6.4%. Another recent analysis saw tremendous variation in uptake between journals and fields of study. Daniel Bingham observes that preprint review has reached something of a low plateau: “After 6 years, only a few thousand articles are being reviewed per year across 13 different services. And it looks like growth is stalling.” While it’s still early days, for now at least preprint review has failed to gain much traction in the research community.

OSTP On the Hook for “In-Depth” Financial Analysis


As we noted last November, the US Congress made a very clear statement of their dissatisfaction with the White House Office of Science and Technology Policy (OSTP)’s implementation of the Nelson Memo, which calls for immediate public access to federally funded research data and papers. OSTP overstepped their authority in issuing the policy without the required advance notice to Congress and produced a financial analysis that the House of Representatives viewed as not “serious” (the bar for seriousness in the US House of Representatives is not known to be particularly high, so this was saying something).

While Congress has backed off on significantly slashing the OSTP’s budget (the new figures freeze the budget at 2023 levels rather than the proposed 30% reduction), the new Appropriations Act, passed and signed into law, requires, via the explanatory statement that accompanies the act, that OSTP develop an “in-depth” financial analysis of the likely impact of the policy. The explanatory statement reads:

The agreement directs OSTP to produce an in-depth financial analysis of the August 25, 2022, Memorandum to Executive Departments and Agencies titled, “Ensuring Free, Immediate, and Equitable Access to Federally Funded Research” including the policy’s anticipated impact on Federal research investments, research integrity, and the peer review process, as was previously directed in House Report 117-395, no later than 100 days after the enactment of this act. If OSTP fails to provide the Committees with the report within 100 days, then OSTP must pause implementation of the memorandum until the agency produces the report.

The deadline for this analysis (unless OSTP wishes to pause implementation of the Nelson Memo) is therefore mid-June. That’s not a lot of time for what should be a complex piece of research.

OSTP has already collected data showing that the average APC paid by federal researchers is around $4,000. With around 255,000 articles published listing US federal funding in 2022 (according to Dimensions, an inter-linked research system provided by Digital Science), that puts potential publication fee costs at $1.02 billion per year, presumably increasing annually as the literature grows. Estimating how many journals will enact mandatory “gold” routes (and when) for federally funded research will be an essential aspect of the study. A further piece of analysis is how many of those papers will be covered by transformative agreements (effectively transferring the cost of publication from federal agencies to universities).
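The $1.02 billion figure follows directly from the two numbers cited above; a minimal sketch of the multiplication (which ignores how many journals will actually require Gold OA, and any coverage by transformative agreements):

```python
# Rough estimate of annual publication-fee exposure for US federal agencies,
# using the average APC and 2022 article count cited in the text.
avg_apc = 4_000                  # USD, OSTP's reported average APC for federal researchers
federal_articles_2022 = 255_000  # articles listing US federal funding (Dimensions)

annual_cost = avg_apc * federal_articles_2022
print(f"Potential annual publication fees: ${annual_cost:,}")  # $1,020,000,000
```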

Costs will also need to be estimated for compliance efforts at research institutions, which ran at about 83% of APC costs under the Research Councils UK (RCUK) policy back in 2013 but are likely more streamlined now that Transformative Agreements centralize much of the activity. In addition, expected cost increases will need to be calculated for repositories such as PubMed Central and PubAg.

Although the costs of immediate public access to papers will represent only a small portion of the overall research budget, the policy does put the OSTP in an uncomfortable position it has yet to acknowledge since the Memo was released: the policy will not somehow magically be implemented without either requiring an increase in agency budgets or a decrease in the amount of money that agencies can put toward research, as some portion will now be diverted to covering publication and data costs. This is a point that even Nelson herself continues to elide, writing recently on the X platform that the policy imposes “no necessary cost to the researcher.” The word “necessary” is doing a lot of work here. There is no cost to the researcher if the researcher publishes in a journal that does not require that federally funded research be published via Gold OA (or impose other publication fees). In other words, Nelson (and so far, OSTP) have not acknowledged that removing publication embargoes will have any impact on the economics of publishers, and seem to assume that publishers will continue publishing federal research with no changes to their policies as a result of the Memo. This is… unlikely. We anticipate that more and more journals will mandate the Gold route or impose other fees as the policy comes into effect, in order to accommodate the lack of a publication embargo and the inevitable erosion of subscription revenues.

Where things get even trickier is the open data requirement of the policy, something that the OSTP has entirely ignored in all their financial modeling to date. We know from the recent Association of Research Libraries study that individual research institutions are already spending between $800,000 and $6,000,000 per year for the very light data management and sharing practices in place before the Memo goes into effect. When every federally funded researcher on campus is required to make their data publicly available, we can expect those costs to skyrocket, again putting the OSTP in the unenviable position of having to explain how this tremendously expensive policy can be enacted without any additional funding.

As of now, researchers will be able to ask for additional funds in their grants to cover these costs, but no additional money is being provided to funding agencies. Instead, many are seeing their budgets reduced, and next year’s proposed agency budgets look lean as well. That will mean a smaller number of larger grants will be issued, reducing the amount of research being done and the number of people who will be employed doing science. Open data is projected to provide economic benefits, but we at The Brief have our doubts whether the OSTP’s coming analysis will offer any sort of modeling of the balance of those benefits with the potential losses in research productivity resulting from less funding going directly to research. The analysis is also unlikely to cover the potentially devastating effects the policy will have on independent publishers and research societies.

One side note: we have heard concern from colleagues about the use of the term “open access” in the explanatory text. The OSTP has always been careful to use the term “public access” to avoid confusion over CC BY licenses and whether Green OA practices are valid. Here our understanding is that this is not a change in agenda or policy, rather it’s likely that the Congressional staffers who wrote that section were not aware of the linguistic subtleties involved.

Correction Notice: The original version of this story noted only the absence of the language from the text of the Act itself and neglected to include the explanatory statement. We thank the two eagle-eyed readers who sent this information in and we have now updated this section. We regret the omission. 

cOAlition S Annual Review


We can’t help but read some symbolism into cOAlition S releasing its Annual Review 2023 as a flipbook, complete with page-turning sounds, a 25-year-old skeuomorphic format that was largely dismissed by the market back when it launched in 1999. Read into that decision what you will about cOAlition S’s ability to make the right choices about the future of publishing. 

Irony aside, interpreting cOAlition S’s numbers is difficult given inconsistencies in their data collection processes. Those experienced in using bibliometric databases understand that there is often some lag in the indexing of content, hence one must take care in looking at very recent data as it is likely incomplete (we are still dubious of 2023 data in all the major databases and probably won’t fully trust it until at least mid-2024). cOAlition S’s analysis in the report is based on Dimensions 2023 data collected at the beginning of January 2024, and incomplete 2022 and 2021 data collected in mid-December of those years. While these data show an overall year-on-year increase for articles funded by the cOAlition – 155,035 in 2021, 168,200 in 2022, and 201,614 in 2023 – more complete current Dimensions data tell a different story, with funded article volume in decline – 225,426 in 2021, 211,022 in 2022, and 199,163 in 2023. This mirrors the overall literature, presumably due to an increase in publication volume during the pandemic and the decline thereafter. 
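The gap between the two snapshots is easiest to see as year-on-year percentage changes; a short sketch using only the article counts given in the text:

```python
# Year-on-year change in cOAlition S-funded article counts, comparing the
# early data snapshots used in the Annual Review with the more complete
# current Dimensions data.
reported = {2021: 155_035, 2022: 168_200, 2023: 201_614}  # collected early
current  = {2021: 225_426, 2022: 211_022, 2023: 199_163}  # collected later

def yoy(series):
    """Percent change from the prior year, for each year after the first."""
    years = sorted(series)
    return {y: (series[y] - series[y - 1]) / series[y - 1] * 100
            for y in years[1:]}

print("Reported:", {y: f"{v:+.1f}%" for y, v in yoy(reported).items()})
print("Current: ", {y: f"{v:+.1f}%" for y, v in yoy(current).items()})
# The early snapshots show growth in both years; the more complete
# data show decline in both years.
```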

Newer data also reflect some changes in individual article status over time. Articles that were considered “Bronze” OA may or may not still be freely available, and articles published in 2021 and 2022 in subscription journals with 12-month Green OA embargoes are now freely accessible. cOAlition S’s choice of data source doesn’t offer any information on the license attached to any given article, so it’s unclear which articles listed under various OA modes are actually compliant with Plan S requirements. That said, current Dimensions data show that the total percentage of cOAlition S-funded papers that are OA (whether Gold, Hybrid, Green, or Bronze) has remained relatively steady – 86% for 2021, 86% for 2022, and 84% for 2023 – rather than climbing as reported – 73%, 79%, and 81%, respectively. The more recent Gold OA numbers show fairly steady percentages – 39% to 41% by current data, compared with the reported growth from 34% to 40% over the 3-year period. The percentage of articles taking the Green OA route appears to be shrinking, dropping in 2023 to less than half of what was seen in 2021 – current data show a decrease from 18% to 7%, compared with the reported drop from 12% to 7%. This difference may be a result of the aforementioned embargoes expiring on older content. Meanwhile, Hybrid OA uptake rose by more than 40% over the period – from 24% to 34% by current data, and 24% to 32% by previous data. This is presumably due to the increased availability of Hybrid journals to Plan S authors through Transformative Agreements.

No matter which numbers one uses, the biggest growth area for compliance is publication in Hybrid journals, a carve-out that the cOAlition will sunset at the end of 2024 when it cuts off support for Transformative Journals and transformative agreements, although what this means in the case of transformative agreements is unclear, as universities, and not cOAlition S, typically pay for them. Green OA and the much-touted Rights Retention Strategy seem increasingly to be afterthoughts. The perceived success of Plan S then likely depends on one’s view of whether its purpose was to increase the number of OA papers in general (which it has) or to put an end to subscription journals (which it hasn’t).

Jisc on a Decade of Transitioning to OA


Ten years after Jisc began negotiations for its first transitional agreement (“transitional agreement” is Jisc’s term for what is more commonly called a “transformative agreement”), the organization has released a review of how things have gone. The conclusions, which should not come as a surprise to anyone, are that these agreements have substantially increased OA uptake but are also beginning to significantly restructure the OA market. In agreement with the cOAlition S data above, Jisc reports a shift in the market: where Gold OA was once growing faster than Hybrid OA, this has now reversed, with Hybrid growth outpacing that of fully OA journals. And as is the case for cOAlition S, Green OA is in steady decline. The transitional agreement portion of the OA market is dominated by the largest publishers – Elsevier, Springer Nature, Wiley, and Taylor & Francis. This finding parallels that of a recent preprint, which found that globally, Elsevier, Springer Nature, and Wiley “accounted for three-fourths of open access articles through transformative agreements.” Jisc also notes that while transitional agreements were meant to be “temporary and transitional,” they do not seem to be having a significant effect on the flipping of Hybrid journals to fully OA. By Jisc’s estimates, “Based on the journal flipping rates observed between 2018–2022 it would take at least 70 years for the big five publishers to flip their TA titles to OA.”
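To get an intuition for the “at least 70 years” claim, a sketch of the extrapolation: at a constant flip rate, the time to flip a whole portfolio is simply the reciprocal of the annual rate. The 1.4% figure below is our hypothetical illustration, not Jisc’s published number (Jisc derived its estimate from flipping rates observed between 2018 and 2022):

```python
# Illustrative extrapolation: if a constant fraction of TA titles flips to
# fully OA each year, how long until the whole portfolio has flipped?
annual_flip_rate = 0.014  # hypothetical: ~1.4% of TA titles flip per year

years_to_flip_all = 1 / annual_flip_rate
print(f"Years to flip the portfolio at this rate: {years_to_flip_all:.0f}")  # ~71
```

Any observed rate at or below roughly 1.4% per year would put the horizon beyond the 70-year mark Jisc cites.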

Briefly Noted


As this month’s The Brief went to press, the Bill & Melinda Gates Foundation announced that it is apparently breaking with Plan S and adopting Plan U. The Foundation will cease funding article processing charges (APCs) for its grantees as of 2025, instead requiring them to post a preprint version of any funded paper. Gates is staying in cOAlition S (membership has never been contingent on following Plan S). More next month as this story develops.

De Gruyter has completed its acquisition of Brill and the two companies are officially merged as De Gruyter Brill.

Two more annual reports are out this month. The Directory of Open Access Journals (DOAJ) Annual Highlights shows the Directory reaching the milestone of 20,000 journals listed, with an acceptance rate for journal applications of 24%. ORCID also offered up a 2023 Annual Report, which we’re happy to see shows the organization running at a surplus for each of the last five years, allowing the building of a “nest egg” for future development or responding to market changes, an essential for sustainability for such important infrastructure. 

Clarivate’s Full Year 2023 Financial Results were released, showing a 1.2% decrease in revenue, “driven primarily by the divestiture of MarkMonitor.”

MDPI released a statement about its updated publication ethics policies, specifically its corrections policy, which will now note any minor corrections made to a paper via a footnote, and more importantly its special issues policies, which now place responsibility for oversight of special issues with the hosting journal’s regular Editor in Chief (or a designated Editorial Board member). Whether it’s causative or merely coincidental, MDPI’s output appears drastically down this year, with just over 43,000 articles published in 2024 as of March 16, compared with more than 77,000 in the same period in 2023 (a more than 40% decline).

An interesting study in Scientific Reports suggests that successful researchers tend to publish in a relatively small set of journals, rather than hopping around between a lot of different publications. While its scope is limited to researchers studying social media, the conclusions suggest that cultivating an audience is an effective career advancement strategy.

Quis custodiet ipsos custodes? A research integrity report commissioned by the Australian government includes a fake reference.

Good news for the ever-innovative crew at OurResearch, as the Arcadia Fund has awarded the organization a $7.5 million grant to continue to build out OpenAlex, intended to provide an open index of scholarly outputs.

The rush to jump on the AI bandwagon continues, as this month ReadCube released Literature Review and Digital Science announced Dimensions Research GPT. Both products appear to offer support for collecting and summarizing large quantities of literature. While the Dimensions product specifically states that it uses OA publications, ReadCube is less clear about its training set. We at The Brief are looking forward to trying both of these products, but our skeptical natures can’t help thinking about Ed Zitron’s recent piece asking whether we have already reached Peak AI (at least for this AI hype cycle).

The American Physiological Society announced the move of its ten primary research journals to the Subscribe to Open model as of 2025. 

Reference and style aficionados rejoice: the 9th edition of The CSE Manual is imminent.

Although touted as an “open access” policy, the plan espoused by Australia’s chief scientist, Cathy Foley, sounds a lot more like the sort of national subscription model proposed for India. We were particularly amused by the angry quote from a scientist decrying the plan – “Why do Australians have this desire to make Americans rich?” – given that the two publishers mentioned in the article are based in the Netherlands and Germany.

Despite Twitter/X having “lost the momentum of activity it gained during the COVID-19 pandemic,” a study of science activity on the platform from Altmetric concludes that the platform “remains the leading source” of science social media attention, with activity that is still above pre-COVID levels.

Another service has joined Rubriq, Axios Review, Peerage of Science, and others in the “portable peer review” graveyard. The site’s founder, Daniel Bingham, offers an insightful view on why he thinks it failed and why crowdsourcing review can’t hope to replace journals, focusing particularly on the essential work that editorial teams do in persuading reviewers to review and then moderating the feedback. Stephen Heard offered related thoughts on the nature of volunteerism in academia and why “put it out there and see who offers” strategies usually fail.

Since the first globally registered TA in 2014 the number of TAs and number of countries that subscribe to them has grown, but the growth has slowed in recent years. This may indicate that the largest publishers have already made agreements with the countries/consortia most willing/able to enter into TAs.  – Jisc (A Review of Transitional Agreements in the UK)