"We are disappointed that these highly questionable practices continue. In some cases, we are dealing with quite blatant inaccuracies. In the most serious cases, we have written to the author's institution to share our concerns…The longer-term goal needs to be raising author awareness about research #misconduct and #fraud and to ensure better compliance with journal eligibility criteria and the requirements of the #AllTrials Initiative."
https://onlinelibrary.wiley.com/doi/10.1111/jan.15620

#ClinicalTrials #Data #OpenData

@petersuber One would hope that the journals that are going out of their way to identify QRPs get rewarded with better reputations and more high-quality submissions. But in practice, who knows how long it will take for readers and authors to notice....
@kdnyhan
Yes, and likewise with the referees, as opposed to the journals. It looks like uncovering the kinds of mistakes highlighted here requires unusually high levels of referee time and commitment.
@petersuber For sure. When I peer review a manuscript I don't think of my job as New Yorker-style fact-checking!
@petersuber But if you asked people who consume, but don't produce, journal articles -- maybe they take the imprimatur (or "certification," as medRxiv puts it) of peer review as a seal of approval that implies more confidence than the peer reviewers think it does
@kdnyhan
This happens all the time, especially on contentious questions. If a peer-reviewed paper supports our antecedent position, then we have more confidence in its results. In all our roles (authors, readers, editors, referees…), scholars know that peer review doesn't guarantee truth. But when we have an interest, it's tempting to forget or downplay that critical degree of doubt or uncertainty.
@petersuber For sure! I can teach people to use a critical appraisal tool, but it's hard to teach them - or even myself - to remember to apply those critical appraisal skills to papers that I want to accept at face value....

@petersuber Specifically regarding their point about ethics approvals:

doi.org/10.1016/j.jclinepi.2021.01.020: Hayden et al. showed that, among included articles (which had been published in non-predatory journals) in a back pain systematic review:
72% did not report trial registration or a published protocol
19% were missing an ethics statement
24% were missing a COI statement

TBF the documents they analyzed were published before or up to 2018 -- maybe it's better these days?