INSPECT-SR is a tool to address risks posed by untrustworthy randomized controlled trials:
https://www.cochrane.org/about-us/news/new-tool-detects-problematic-trials-they-distort-evidence

The focus is on use in #SysReviews, but I think it may offer opportunities for training in #RCT methods.

I'll find out - just registered for @jd_wilko's webinar:
https://www.trybooking.com/uk/events/landing/94843

#ResearchWaste #ResearchIntegrity


Need extra motivation to share your data?

➡️ Complete data extraction possible for 22 studies
➡️ By contacting authors we got complete data for 21 extra studies 🙏🏼
➡️ 20 studies excluded due to preventable reporting issues 😡
➡️ 8 authors told us they had lost the data FOREVER 😭

#researchwaste

📰 https://doi.org/10.1101/2024.10.29.620852

Of registered clinical trials completed 2016-2019 in Denmark, Finland, Iceland, Norway, and Sweden, the results of 1 in 5 have never been reported

https://www.medrxiv.org/content/10.1101/2024.02.04.24301363v1

"Results reporting for clinical trials led by medical universities and university hospitals in the Nordic countries was often missing or delayed"

(Nilsonne et al, 2024)

#ResearchIntegrity #ResearchWaste


Objective: To systematically evaluate timely reporting of clinical trial results at medical universities and university hospitals in the Nordic countries.

Study Design and Setting: In this cross-sectional study, we included trials (regardless of intervention) registered in the EU Clinical Trials Registry and/or ClinicalTrials.gov, completed 2016-2019, and led by a university with medical faculty or university hospital in Denmark, Finland, Iceland, Norway, or Sweden. We identified summary results posted at the trial registries, and conducted systematic manual searches for results publications (e.g., journal articles, preprints). We present proportions with 95% confidence intervals (CI), and medians with interquartile range (IQR). Protocol: <https://osf.io/wua3r>

Results: Among 2,113 included clinical trials, 1,638 (77.5%, 95% CI 75.9-79.2%) reported any results during our follow-up; 1,092 (51.7%, 95% CI 49.5-53.8%) reported any results within 2 years of the global completion date; and 42 (2%, 95% CI 1.5-2.7%) posted summary results in the registry within 1 year. Median time from global completion date to results reporting was 698 days (IQR 1,123). 856/1,681 (50.9%) of ClinicalTrials.gov registrations were prospective. Denmark contributed approximately half of all trials. Reporting performance varied widely between institutions.

Conclusion: Missing and delayed results reporting of academically led clinical trials is a pervasive problem in the Nordic countries. We relied on trial registry information, which can be incomplete. Institutions, funders, and policy makers need to support trial teams, ensure regulation adherence, and secure trial reporting before results are permanently lost.
All data are available online at <https://zenodo.org/records/10091147>; all code used for data processing is available at <https://github.com/cathrineaxfors/nordic-trial-reporting>

New NIOO publication: Supporting study registration to reduce research waste, by @antica_c and others.
#preregistration #registeredreports #ecology #researchwaste #researchprocess #openscience
https://doi.org/10.1038/s41559-024-02433-5

The most important question before supporting any #OpenScience initiative is how exactly it helps build a usable and reliable scientific knowledge base.

Few do.

#ResearchWaste

@lakens "among other things ... "Publish or perish!"" - I would like to know more about these "other things" 😅
Jokes aside, here's a fitting quote by Hannah Arendt (1972): "People write things which should never have been written and which should never be printed. Nobody’s interested." #ResearchWaste

Monday is the deadline for applications for a new Co-Editor-in-Chief of "Quality of Life Research"
https://www.isoqol.org/isoqol-seeks-co-editor-in-chief-2023/

We managed to dedicate space to a number of pertinent topics in #HRQL research and practice such as:

Meaningful change
https://link.springer.com/journal/11136/volumes-and-issues/32-5

#ResearchWaste
https://link.springer.com/journal/11136/volumes-and-issues/31-10

Nonparametric IRT #Psychometrics
https://link.springer.com/journal/11136/volumes-and-issues/31-1

#ISOQOL are looking for an individual to continue leadership in this direction.

#Editing #Publishing


@pdakean @oliverpyc @kdnyhan
Good discussion!

Reporting guidelines have real deficiencies: they focus on transparency and completeness rather than quality, capture consensus only on (often outdated) minimal standards, are written for the 'typical' project, and many projects do not fit any one design. But as
'a minimal standard as to what to report' (and what to plan for when designing a project!)
they provide one element in the scaffolding to reduce #ResearchWaste

https://www.bmj.com/content/363/bmj.k4645

Our own humble take #HRQL
https://psyarxiv.com/879xp/

Research waste is still a scandal—an essay by Paul Glasziou and Iain Chalmers

Progress has been made towards reducing the 85% of wasted effort in medical research—and the huge amounts of money misspent and harm caused to patients—but there's still a long way to go, say Paul Glasziou and Iain Chalmers.

In their history of the evolution of guidelines for reporting medical research, Doug Altman and Iveta Simera showed that poor design, conduct, and reporting of medical research have been concerns for over a century: "The quality of published papers is a fair reflection of the deficiencies of what is still the common type of clinical evidence. A little thought suffices to show that the greater part cannot be taken as serious evidence at all."1 Indeed, more than 250 years ago, the Scottish doctor James Lind declared in the introduction to his review of reports on treating scurvy: "Before this subject could be set in a clear and proper light it was necessary to remove a great deal of rubbish."2

Quantifying the extent of poor reporting of medical research seems not to have begun until 1966 (box). After assessing 295 publications in 10 "most frequently read" medical journals, Schor (a statistician) and Karten (a medical student) concluded: "In almost 73% of the reports . . . conclusions were drawn when the justification for these conclusions was invalid."3 However, the title of their article, "Statistical evaluation of medical journal manuscripts," was unlikely to ignite action among clinicians to deal with a situation that threatened their patients' wellbeing.

The wake-up call came 30 years later, in 1994, with an editorial in The BMJ by the journal's chief statistical adviser, Doug Altman. He described as a scandal4 that "huge sums of money are spent annually on research that is seriously flawed through the use of inappropriate designs, unrepresentative samples, small samples, incorrect …


Alarm bells should start ringing when a PI or head of department starts setting targets for the quantity of papers, because a lot of junior researchers' time will be wasted churning out papers of little scientific merit, and of no benefit to anybody other than the PI or head of department.

#ResearchWaste

COMET initiative newsletter out:
https://mailchi.mp/422a8e7f2f98/jan2023-6226622

Core outcome sets are an important infrastructure to reduce #ResearchWaste.

Read about research waste in #HRQL research in our most recent special section on that topic (Oct-2022):
https://link.springer.com/journal/11136/volumes-and-issues/31-10

#PrePrint of our editorial:
https://psyarxiv.com/879xp/
(w Claudia Rutherford, Sydney)

#Psychometrics
