How medicine learned to love the preprint

The pressure on journals to bypass traditional peer-review processes has never been so great.


Since the beginning of last year, reputable medical journals have been inundated with manuscripts containing information about SARS-CoV-2.

Should those findings be shared instantly, or only after the weeks or months it might take to complete scientific peer review?

A recent study, published in PLOS Biology (having itself first been released as a preprint), found that in the first 10 months of the pandemic, almost a quarter of all covid-related scientific papers were hosted on preprint servers.

The authors analysed the basic preprint metadata (including DOIs, titles, abstracts, author names etc.) for two preprint servers and compared manuscripts relating to covid with non-covid papers.
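The paper does not spell out its full pipeline here, but the core step it implies, screening title and abstract metadata for covid-related terms and then comparing the two groups, can be sketched in a few lines of Python. The field names, keyword list and sample records below are illustrative assumptions, not the authors' actual code.

```python
# Illustrative sketch only: flag preprints as covid-related by keyword
# matching on title/abstract metadata, then report the covid-related share.
# Field names, keywords and sample records are assumptions, not the
# published study's pipeline.

COVID_TERMS = ("covid", "sars-cov-2", "coronavirus", "2019-ncov")


def is_covid_related(record: dict) -> bool:
    """Return True if the preprint's title or abstract mentions a covid term."""
    text = f"{record.get('title', '')} {record.get('abstract', '')}".lower()
    return any(term in text for term in COVID_TERMS)


def covid_share(records: list[dict]) -> float:
    """Fraction of preprints flagged as covid-related."""
    if not records:
        return 0.0
    return sum(is_covid_related(r) for r in records) / len(records)


if __name__ == "__main__":
    # Two made-up metadata records standing in for a real server export.
    sample = [
        {"doi": "10.1101/example.1", "title": "SARS-CoV-2 transmission dynamics", "abstract": ""},
        {"doi": "10.1101/example.2", "title": "Soil microbiome diversity", "abstract": ""},
    ]
    print(f"covid-related share: {covid_share(sample):.0%}")
```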

The meta-research also looked at the traffic analytics for the articles and found that preprints about covid were being accessed, downloaded and shared in far greater volume than other preprints on the same servers.

The analysis offered a richer picture of the increased popularity of instant scientific publishing, as thousands of researchers scrambled to get ahead of the virus. The pressure on journals to bypass their traditional peer-review processes had never been so great.

The first online preprint server, called arXiv (pronounced “archive”), was created almost three decades ago for use by mathematicians and physicists who wanted to share their theories among peers.

Releasing research without peer review had long been frowned upon by the medical publishing giants, but the pandemic made it a near-necessity.

In March last year, the Medical Journal of Australia (MJA) became the first Australian journal to launch its own preprint server.

Traditionally, the MJA had operated a print and online model, with all papers undergoing a full external review before being published.

But when the pandemic hit last year, Professor Nick Talley, editor-in-chief of the MJA, said it was clear rapid access to science could save lives.

“We felt that it was incredibly important to disseminate information as quickly as we possibly could and [recognised that] the traditional model is much, much slower than a preprint server model,” he told The Medical Republic on The Tea Room podcast.

But while the MJA saw speed as being critically important, it also wanted to ensure the information disseminated was accurate.

“We built a model of rigorous and rapid internal review, with or without external peer review (depending on the situation), in order to make a decision for articles that we felt needed to be out in general circulation as quickly as possible,” Professor Talley said.

Changing the MJA’s publishing model also came with significant challenges.

“It interrupted our usual workflow, [and] it added extra work for the team and also for those who were kind enough to peer review for us quickly – which people were willing to do,” Professor Talley said.

“It has depended on the article type but we’ve managed to publish within 48 hours in some cases, and [in other cases], even earlier.”

And for now, the MJA has decided that preprints are here to stay.

“We really feel there’s been value and we haven’t decided how many we should preprint – what we’re doing is making decisions on an individual basis where we feel internally that an article needs to be out there sooner rather than later,” Professor Talley said.

“We stand by the philosophy that we want to disseminate important material quickly. However, we also want to be sure it’s as robust as possible, and as accurate as possible, before we do so – and that remains our philosophy.”

While there are obvious benefits to rapid scientific publishing, clinicians have been facing the consequences of trying to keep pace with the growing torrent of new information.

For more than a year now, the National COVID-19 Clinical Evidence Taskforce has been doing most of the heavy lifting, sorting the wheat from the chaff when it comes to emerging evidence.

It was born out of the Australian Living Evidence Consortium, which was already reviewing the latest evidence and regularly updating clinical guidelines for stroke, diabetes, heart and musculoskeletal health. It's this process of constant review that makes the guidelines 'living'.

And when the pandemic hit and the preprint evidence on covid started coming out like a fire hose, the group decided it was time to create living guidelines for this new disease too.

Associate Professor Julian Elliot, executive director of the Taskforce, said developing the guidelines for clinicians was a strategy for keeping on top of the infodemic.

“I think we’ve always seen preprints as an important source of evidence [but] we always take a critical appraisal approach to any published research – whether it’s peer reviewed or not, we very intensely interrogate the design,” he told The Tea Room podcast.

At the height of the pandemic last year, the taskforce was having to screen about 500 scientific papers each day.

“When we started the taskforce at the end of March, we were obviously trying to move very quickly to develop national evidence-based recommendations for clinicians across the country in a time of great uncertainty,” Professor Elliot said.

The taskforce tended not to discriminate based solely on whether a manuscript had undergone formal peer review.

“Peer review is important, and it does add a level of provenance, but we continue to be open to using preprint data as long as there’s sufficient data contained within the preprint that we can use to adequately assess the quality of the study and the results reported,” Professor Elliot said.

At the very least, the prominence of preprints in communicating science during the covid pandemic may be what finally cemented their role in the biological sciences.

“I think that at this point in the journey, like with many other things in covid, [preprints] are probably here to stay,” Professor Elliot said.

“They’ve been well established in other scientific fields. This has to be the tipping point in health and medical research.”

But, he added, that brought a need for a robust system that allowed us to take advantage of the hastened availability of scientific knowledge, while remaining aware of what it could and could not answer.

He said that through the process of reviewing covid evidence every day, the taskforce had identified three things as vital for success: an experienced team, an uncompromising approach to quality, and a willingness to work with other groups striving for the same goals.

“Because of what we’ve achieved here and in the US, there are a number of groups around the world that are moving into this ‘living guideline’ model,” Professor Elliot said.

The World Health Organisation has also committed to shifting its focus to living guidelines for the management of covid, malaria, and maternal and children's health.

“As these endeavours are established around the world, there’s a significant opportunity for us to be working more closely with those groups to share the work – and that improves the efficiency, feasibility and sustainability,” Professor Elliot said.

Sustainability in reviewing evidence had never been more important, with the number of papers only set to increase as the virus continued to spread.

Professor Elliot said he regularly thinks back to what Bill Gates and others said very early on in the pandemic: that this disease was likely to play out over a couple of years, not a couple of months.

“I think [the taskforce] has always had an assumption that it would continue for some time,” he said.

And it’s rapid publishing, paired with rapid review, that has enabled clinicians to keep so many people safe during some of the peaks of covid transmission.

But in the meantime, between the waves, researchers, reviewers and publishers continued to work as hard and fast as they could – knowing all too well that we’re not out of the woods yet.
