Daniel Krähmer

68 Followers
189 Following
5 Posts
PhD candidate at LMU Munich, Department of Sociology.
Studying replicability and robustness of quantitative social science research.
Website: https://www.en.ls4.soziologie.uni-muenchen.de/research/rob-meta-rep/index.html
ORCID: 0000-0002-4100-5372
3. Code management seems to be a considerable problem among social scientists. To ensure the availability of research code, large-scale institutional solutions are desperately needed.
2. Micro interventions perform rather poorly in nudging researchers towards sharing code. If anything, mentioning the "replication crisis" in a code request might yield slightly higher code returns (contrary to our hypothesis).
1. Code sharing remains the exception rather than the rule among social scientists. Despite multiple requests, only 385 of 1028 authors shared their code with us (37.5%).

Our field experiment on code sharing behavior in the social sciences has been published in PLOS ONE 🥳 (co-authored with @laura_schaechtele and Andreas Schneck).

Read the full article (open access): https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0289380

(Main take-aways in the thread below)

#metaRep #openscience #opendata #opencode #replication

Care to share? Experimental evidence on code sharing behavior in the social sciences

Transparency and peer control are cornerstones of good scientific practice and entail the replication and reproduction of findings. The feasibility of replications, however, hinges on the premise that original researchers make their data and research code publicly available. This applies in particular to large-N observational studies, where analysis code is complex and may involve several ambiguous analytical decisions. To investigate which specific factors influence researchers' code sharing behavior upon request, we emailed code requests to 1,206 authors who published research articles based on data from the European Social Survey between 2015 and 2020. In this preregistered multifactorial field experiment, we randomly varied three aspects of our code request's wording in a 2×4×2 factorial design: the overall framing of our request (enhancement of social science research, response to the replication crisis), the appeal for why researchers should share their code (FAIR principles, academic altruism, prospect of citation, no information), and the perceived effort associated with code sharing (no code cleaning required, no information). Overall, 37.5% of successfully contacted authors supplied their analysis code. Of our experimental treatments, only framing affected researchers' code sharing behavior, though in the opposite direction from what we expected: scientists who received the negative wording alluding to the replication crisis were more likely to share their research code. Taken together, our results highlight that the availability of research code will hardly be enhanced by small-scale individual interventions but will instead require large-scale institutional norms.
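For readers curious how authors might be allocated to the 2×4×2 treatment cells described above, here is a minimal, illustrative Python sketch of a balanced random assignment. The factor labels are paraphrased from the abstract; the function, author IDs, and seed are assumptions made for illustration only and are not the authors' preregistered randomization code.

```python
import random
from itertools import product

# Factor levels, paraphrased from the abstract (illustrative labels).
FRAMING = ["enhancement of social science research", "response to replication crisis"]
APPEAL = ["FAIR principles", "academic altruism", "prospect of citation", "no information"]
EFFORT = ["no code cleaning required", "no information"]


def assign_treatments(author_ids, seed=42):
    """Assign each author to one of the 2 x 4 x 2 = 16 treatment cells.

    Illustrative sketch only: cells are repeated to cover all authors and
    then shuffled, keeping cell sizes near-balanced.
    """
    rng = random.Random(seed)
    cells = list(product(FRAMING, APPEAL, EFFORT))            # 16 cells
    pool = cells * (len(author_ids) // len(cells) + 1)        # cover all authors
    pool = pool[: len(author_ids)]
    rng.shuffle(pool)
    return dict(zip(author_ids, pool))


if __name__ == "__main__":
    authors = [f"author_{i:04d}" for i in range(1, 1207)]     # 1,206 contacted authors
    assignment = assign_treatments(authors)
    print(assignment["author_0001"])
```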

Proud to announce that our field experiment on code sharing behavior in the social sciences with @laura_schaechtele and Andreas Schneck has been awarded the ESRA Early Career Award 2023!
More information on the paper coming soon… (feel free to check our preregistration in the meantime: https://osf.io/bqjcz)

#esra23 #openscience #metascience #metaRep #reproducibility #replication
