It’s December, so that must mean another update from the International Committee of Medical Journal Editors (ICMJE) on their guidelines for peer-reviewed medical publications. (The ICMJE guidelines are one of the go-to resources that should always be consulted when preparing a publication.)
What’s new this year?
- Journals are encouraged to de-emphasize the Impact Factor as a means of quantifying the journal’s quality. Instead, journals should “provide a range of …metrics relevant to their readers and authors.”
- “Purposeful failure to disclose conflicts of interest” is now listed as a type of scientific misconduct. Certainly makes sense in light of the recent news stories about some glaring omissions in disclosure of prominent researchers.
- Authors should use a preprint server that is clearly identified as one (not one posing as a peer-review system).
- The date of clinical trial registration is defined as “the date the registration materials were first submitted to a registry.”
- “Authors should use neutral, precise, and respectful language to describe study participants.” Related to one of my favorite ideas in medical writing – put the person first, not the disease.
Thinking about becoming a principal investigator of a research laboratory at a university? Want to know how you stack up against your PI colleagues? A research trio has published a new study that examines the relationship between several publication metrics and the likelihood of becoming a PI.
Recent developments in scientific publishing have many folks scrutinizing open-access journals a bit more closely. A journalist with Science concocted a fake manuscript that, in his words, was a “credible but mundane scientific paper, one with such grave errors that a competent peer reviewer should easily identify it as flawed and unpublishable.”
[Update 11/12/13 (wow, what a great date!): Here’s a post-sting interview with the journalist, John Bohannon.]
In total, he submitted minor variations on the same manuscript to 304 open-access journals, of which 157 accepted the paper for publication. (None of the manuscripts was actually published. After acceptance, Bohannon would admit to uncovering a “serious flaw that invalidates the conclusions” and withdraw the paper.) Of the journals that accepted the flawed manuscript, it appeared that many performed no peer review at all before accepting.
While these results may seem disheartening for open-access publishing, it’s worth noting that no comparable investigation was performed on “traditional” journals that operate by subscription (see rebuttals on the Scholarly Kitchen and the Guardian websites). So there’s no way to say whether open-access journals are any different from, or worse than, traditional journals in terms of their peer-review process. (It would be interesting to see the results of that study.)
Professional organizations such as the Committee on Publication Ethics (COPE) provide guidelines for peer reviewers and publishers to follow. COPE admits to having some of the journals uncovered by the sting on its list of vetted publications and vows to reexamine its approval process. A list of possible predatory publishers and journals is maintained by a librarian at the University of Colorado – Denver. Judging from the results of this open-access sting, the “predatory” list is fairly good at spotting questionable practices, though some journals listed as predatory correctly rejected the flawed paper.
Next time you’re sending off a manuscript for review, pay close attention to the journal you or your colleagues have selected. It may be worth gathering a little more background information about the practices and people behind the journal. Do you recognize the folks on the editorial board? Have your colleagues published in this journal before? For the long-term health of your career and reputation, it’s better to have one or two high-quality publications on your CV than three times that number at journals that may operate with questionable practices and intentions.