Interesting pointer from a #JEDI thread:
A description of a process for integrating the evaluation of computational reproducibility into #PeerReview:
https://f1000research.com/articles/10-253/v2

I also like the ethos, and I wonder whether #PeerReview in general could learn from these five principles:

1. Codecheckers record but don’t investigate or fix.
2. Communication between humans is key.
3. Credit is given to codecheckers.
4. Computational workflows must be auditable (see the sketch after this list).
5. Open by default and transitional by disposition.
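To make point 4 concrete, here is a minimal sketch of one way a workflow's outputs could be made auditable. This is my own illustration, not part of the CODECHECK tooling, and the manifest paths are invented: the idea is simply to record a hash of each declared output (or note that it is missing), so a codechecker can attest to what was actually produced without investigating or fixing anything.

    # Illustration only -- not the CODECHECK implementation.
    # A hypothetical manifest lists the output files a paper claims
    # to produce; we record each file's SHA-256 digest, or MISSING,
    # so the check itself becomes an auditable record.
    import hashlib
    from pathlib import Path

    MANIFEST = ["figures/fig1.png", "results/table2.csv"]  # invented paths

    for name in MANIFEST:
        p = Path(name)
        digest = hashlib.sha256(p.read_bytes()).hexdigest() if p.exists() else "MISSING"
        print(f"{name}\t{digest}")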

#NightshiftEditor

Nüst D & Eglen SJ, "CODECHECK: an Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility", F1000Research 10:253 (v2).