Short reminder for our META-REP conference on #metaScience, #replicability, robustness issues, ... in the beh/soc/cog sciences.

Attend/present your work at META-REP 2024 in Munich (Oct 28-31): https://www.conference2024.meta-rep.uni-muenchen.de/index.html

Submission deadline is April 30 (there will be no extension).

Our two keynote speakers will be Fiona Fidler and Daniël Lakens @lakens.

#metaRep #MetaResearch

META-REP Conference 2024 - LMU Munich

*CALL FOR SUBMISSIONS*

Interested in #metaScience, #replicability, robustness issues, ... in the beh/soc/cog sciences? Then attend/present your work at META-REP 2024 in Munich (Oct 28-31): https://www.conference2024.meta-rep.uni-muenchen.de/index.html

Submission deadline April 30.

#metaRep #MetaResearch

Anne Scheel @annescheel kicks off the research retreat of the #metaRep research group with her new workshop: "The soft underbelly of hypothesis tests".

Many quotable insights, e.g.:
"Explanation without well-established explananda is futile.“

Hear, hear!

#MetaScience

Our field experiment on code sharing behavior in the social sciences has been published in PLOS ONE 🥳 (co-authored by @laura_schaechtele and Andreas Schneck).

Read the full article open access: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0289380

(Main take-aways in the thread below)

#metaRep #openscience #opendata #opencode #replication

Care to share? Experimental evidence on code sharing behavior in the social sciences

Transparency and peer control are cornerstones of good scientific practice and entail the replication and reproduction of findings. The feasibility of replications, however, hinges on the premise that original researchers make their data and research code publicly available. This applies in particular to large-N observational studies, where analysis code is complex and may involve several ambiguous analytical decisions. To investigate which specific factors influence researchers’ code sharing behavior upon request, we emailed code requests to 1,206 authors who published research articles based on data from the European Social Survey between 2015 and 2020. In this preregistered multifactorial field experiment, we randomly varied three aspects of our code request’s wording in a 2x4x2 factorial design: the overall framing of our request (enhancement of social science research, response to the replication crisis), the appeal why researchers should share their code (FAIR principles, academic altruism, prospect of citation, no information), and the perceived effort associated with code sharing (no code cleaning required, no information). Overall, 37.5% of successfully contacted authors supplied their analysis code. Of our experimental treatments, only framing affected researchers’ code sharing behavior, though in the opposite direction than we expected: scientists who received the negative wording alluding to the replication crisis were more likely to share their research code. Taken together, our results highlight that the availability of research code will hardly be enhanced by small-scale individual interventions but instead requires large-scale institutional norms.
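For readers curious how a 2x4x2 multifactorial assignment like the one described in the abstract could work in practice, here is a minimal sketch. The factor names and level labels are illustrative reconstructions from the abstract, not the authors' actual variables or code:

```python
import itertools
import random

# Illustrative factor levels, paraphrased from the abstract.
FRAMING = ["enhancement", "replication_crisis"]           # 2 levels
APPEAL = ["FAIR", "altruism", "citation", "none"]         # 4 levels
EFFORT = ["no_cleaning_needed", "none"]                   # 2 levels

# Full factorial design: 2 x 4 x 2 = 16 treatment cells.
cells = list(itertools.product(FRAMING, APPEAL, EFFORT))

def assign_treatments(author_ids, seed=42):
    """Randomly assign each contacted author to one of the 16 cells."""
    rng = random.Random(seed)
    return {author: rng.choice(cells) for author in author_ids}

# 1,206 authors were contacted in the actual study.
assignments = assign_treatments(range(1206))
```

Crossing all factors lets the effect of each request wording be estimated independently; in the study, only the framing factor turned out to matter.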

Proud to announce that our field experiment on code sharing behavior in the social sciences with @laura_schaechtele and Andreas Schneck has been awarded the ESRA Early Career Award 2023!
More information on the paper coming soon… (feel free to check our preregistration in the meantime: https://osf.io/bqjcz)

#esra23 #openscience #metascience #metaRep #reproducibility #replication

What’s the prior probability of a hypothesis being true in #psychology?

* Dreber et al. (2015; 10.1073/pnas.1516179112) give an estimate of 9%.

* Wilson & Wixted (2018; 10.1177/2515245918767122) estimate 6-10% for social psych and 27% for cognitive psych.

Are there any other papers estimating this quantity?

(Asking for our #MetaScience project, where we want to calibrate our agent-based model #ABM of academia.)

CC @uebernerd @briannosek @tomhardwicke @EJWagenmakers
#Bayes #metaRep
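For context on why this prior matters: under the standard false-positive framework (Ioannidis, 2005), the prior probability of a hypothesis being true determines the positive predictive value of a significant result. A sketch with conventional (illustrative) power and alpha values, not the project's actual model:

```python
def ppv(prior, power=0.8, alpha=0.05):
    """Positive predictive value: P(hypothesis true | significant result).

    Standard formula: true positives / (true + false positives).
    power and alpha defaults are conventional illustrative values.
    """
    true_pos = power * prior
    false_pos = alpha * (1 - prior)
    return true_pos / (true_pos + false_pos)

# With the 9% prior from Dreber et al., only about 61% of
# significant results would reflect true hypotheses.
print(round(ppv(0.09), 3))  # → 0.613
```

Plugging in the 27% estimate for cognitive psychology instead raises the PPV considerably, which is one reason calibrating this prior matters for an ABM of academia.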