Interesting pointer from a #JEDI thread:
A description of a process for integrating evaluation of computational reproducibility into #PeerReview:
https://f1000research.com/articles/10-253/v2
I also like the ethos and wonder whether #PeerReview in general could learn from these principles:
1. Codecheckers record but don’t investigate or fix.
2. Communication between humans is key.
3. Credit is given to codecheckers.
4. Computational workflows must be auditable.
5. Open by default and transitional by disposition.