Tony Ross-Hellauer

Leader - Open and Reproducible Research Group
http://orrg.eu ⁞ TU Graz & Know-Center ⁞ Team #OpenScience

6/ 📚 Read the full study:
🔗 “Open Science at the Generative AI Turn”

Published in Quantitative Science Studies (MIT Press):
👉 https://doi.org/10.1162/qss_a_00337

Let’s work together to ensure that #AI & #GenAI align with the values of #OpenScience!

5/ 🌍 A Call for Responsible Use
To ensure GenAI aligns with Open Science values:
- Researchers must integrate GenAI with care and scrutiny.
- Developers need to create transparent, unbiased tools.
- Policymakers must balance innovation and risk.
4/ 🔍 The Risk
Despite the potential, there are challenges:
❌ Opaque “black box” models undermine transparency
❌ Bias in training data risks reinforcing inequalities
❌ High computational demands raise sustainability concerns
3/ ✨ The Opportunity
GenAI can:
✅ Increase research efficiency through enhanced documentation
✅ Simplify complex science into accessible language
✅ Break language barriers through translation
✅ Enable public participation in research
✅ Promote inclusivity, accessibility, and understanding
2/ TL;DR. Mohammad Hosseini, Serge Horbach, @kristiholmes and I explore GenAI's enormous potential to enhance accessibility and efficiency in science. But we emphasise that to do so, GenAI must uphold the Open Science principles of openness, fairness, and transparency.

1/ 🚨 NEW PAPER! “Open Science at the Generative AI Turn”

In a new study just published in Quantitative Science Studies, we explore how GenAI can both enable and challenge Open Science, and why GenAI will benefit from adopting the values of Open Science. 🧵

#OpenScience #AI #GenAI

For more details and concrete recommendations, we refer to our paper. https://doi.org/10.1093/reseval/rvae055
Understanding the social and political dimensions of research(er) assessment: evaluative flexibility and hidden criteria in promotion processes at research institutes

Abstract. Debates about appropriate, fair and effective ways of assessing research and researchers have raged through the scientific community for decades,

OUP Academic

In our new study, based on qualitative analysis of free-text responses from 121 international researchers to a survey, we examined researchers' perceptions of the social and political dimensions influencing research assessment processes.

We highlight a significant discrepancy between formal evaluation criteria and their practical application, demonstrating the “performativity of assessment criteria” and raising the question of to what extent criteria can or should be transparently communicated.

Reform of research assessment, especially to avoid issues of over-quantification and empower qualitative assessment, is an increasingly hot topic.

Part of the debate concerns the tension between the rigidity and flexibility of assessment criteria. Should they be set in stone to avoid biases or remain flexible to allow tailor-made assessments?

New Paper!

“Understanding the social and political dimensions of research(er) assessment: evaluative flexibility and hidden criteria in promotion processes at research institutes”, just published in Research Evaluation by me, Noemie Aubert Bonn and Serge Horbach.

https://doi.org/10.1093/reseval/rvae055
