Digging deeper into data citations: recognizing and rewarding data work – InfoDoc MicroVeille

We evaluate science mostly through papers. But researchers report that up to 75% of project effort is data work — collecting, cleaning, documenting, and preparing datasets. A reminder that research outputs ≠ research work.

New paper in Research Evaluation: https://doi.org/10.1093/reseval/rvag008

#ResponsibleMetrics #OpenScience #DataCitation #ResearchEvaluation

Most research evaluation still rewards papers, not the work that makes them possible. Yet researchers say up to 75% of a project can be data work: collecting, cleaning, curating, documenting.

https://doi.org/10.1093/reseval/rvag008

Maybe it's time to stop pretending that publications alone represent research.

#OpenScience #ResearchEvaluation #DataCitation #ResponsibleMetrics #Scientometrics

New paper in Research Evaluation explores how researchers actually cite data. Key insight: data citations are far more complex than simple indicators of data reuse.

https://doi.org/10.1093/reseval/rvag008

They reflect scientific practice, community norms, attribution, and even reputation-building. A timely reminder: metrics alone cannot capture the real value of data work.

#OpenScience #DataCitation #ResearchEvaluation #ResponsibleMetrics #Scientometrics

New research highlights barriers to proper dataset citation.

A study in the Data Science Journal finds that many commonly used reference managers do not adequately support data citation workflows, slowing recognition and reuse.

🔗 https://datascience.codata.org/articles/10.5334/dsj-2026-005

#OpenScience #DataCitation #FAIRdata #ResearchData #OpenResearch #Metadata

Essential Aspects of Tools for Developing Scientific Data Management Plans | Data Science Journal
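One concrete gap in this area: many reference managers lack a first-class record type for datasets, forcing users into generic workarounds. As an illustration only (a minimal sketch; which fields a given manager actually supports varies), CSL JSON does define a dataset item type that data-citation-aware tooling can target:

```python
import json

# A minimal CSL-JSON record using the "dataset" item type. All field
# values are illustrative placeholders; whether a particular reference
# manager round-trips these fields is exactly the kind of gap at issue.
dataset_item = {
    "type": "dataset",
    "id": "example-dataset-1",
    "title": "Example Longitudinal Survey, Wave 3",
    "author": [{"family": "Example", "given": "A."}],
    "issued": {"date-parts": [[2024]]},
    "publisher": "Example Data Archive",
    "DOI": "10.0000/example",
    "version": "1.0",
}

print(json.dumps(dataset_item, indent=2))
```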

CESSDA's Monitoring Report 2024–2025 on Data Citation Practices highlights a growing trend among its Service Providers to promote dataset citation. Repositories like DANS now provide a recommended citation on each dataset page, making it easier for users to reference data and boosting the visibility of datasets in scholarly work. As a CESSDA partner, DANS supports clear citation practices and shared its policy for the study. Read more: https://edu.nl/xyyee

#DataCitation #ResearchData

Help @makedatacount, join the #kaggle "Finding Data References" competition, and win fame and fortune (well, a $100,000 prize). #DataCitation
https://kaggle.com/competitions/make-data-count-finding-data-references/

Only 1 week to go!
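For a feel of the task, here is a minimal sketch of the kind of naive baseline the competition invites improving on: regex-scanning paper text for DOIs and accession-style identifiers. The patterns and sample text below are illustrative assumptions, not the competition's actual data format.

```python
import re

# Illustrative patterns only (assumptions, not the competition spec):
# a generic DOI pattern and a GenBank-like accession pattern.
DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[^\s"<>]+')
ACCESSION_PATTERN = re.compile(r'\b[A-Z]{1,2}\d{5,8}\b')

def find_data_references(text: str) -> dict:
    """Return candidate dataset identifiers found in free text."""
    return {
        "dois": DOI_PATTERN.findall(text),
        "accessions": ACCESSION_PATTERN.findall(text),
    }

sample = ("Sequence data are available from GenBank (MN908947) and the "
          "survey data at https://doi.org/10.5061/dryad.example.")
print(find_data_references(sample))
```

A serious entry needs far more than pattern matching, for instance distinguishing genuine data reuse from a passing mention, which is why the prize money is on the table.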

Learn how the MONITOR service is expanding dataset indicators to track:

- Direct & indirect citations
- Dataset-publication links
- Enhanced FAIRness checks
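Roughly, a direct citation is a publication that cites the dataset's identifier itself, while an indirect one reaches the dataset through an intermediary such as a data paper. A toy sketch of that distinction (my illustration, not how MONITOR actually computes its indicators):

```python
# Minimal sketch of direct vs indirect dataset citation counting.
# The graph and the one-hop notion of "indirect" are assumptions
# for illustration only.

# publication -> identifiers it cites (dataset PIDs or other papers)
citations = {
    "paper:A": ["doi:dataset-1"],                 # cites the dataset directly
    "paper:B": ["paper:data-descriptor"],         # cites a data paper instead
    "paper:data-descriptor": ["doi:dataset-1"],   # the data paper cites the dataset
}

def count_citations(dataset: str) -> dict:
    direct = {p for p, refs in citations.items() if dataset in refs}
    # indirect: cites something that itself directly cites the dataset
    indirect = {
        p for p, refs in citations.items()
        if p not in direct and any(r in direct for r in refs)
    }
    return {"direct": len(direct), "indirect": len(indirect)}

print(count_citations("doi:dataset-1"))  # {'direct': 2, 'indirect': 1}
```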

Be part of the conversation on improving data visibility across the research landscape!

20 May 2025 @ 12:00–1:00 pm

Register: https://www.openaire.eu/eventdetail/1385/monitor-community-call

#ResearchData #OpenScience #FAIRdata #OpenAIREMonitor #DataCitation

MONITOR Community Call

Are you interested in having a Dashboard in OpenAIRE-MONITOR and want to learn more about the service? Have you already got a Dashboard in OpenAIRE-MONITOR...

New CESSDA Data Citation Guide published 🚨

Citing research data ensures transparency, gives data authors recognition, and supports open science. Read more about the core citation components and tailored recommendations at: datacitation.cessda.eu

#OpenScience #DataCitation
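The core components such guides typically cover (creator, publication year, title, version, publisher, persistent identifier) follow common DataCite-style practice. As a minimal sketch only, with assumed field names and ordering (see datacitation.cessda.eu for CESSDA's actual recommendations), assembling them might look like:

```python
# Illustrative only: building a DataCite-style dataset citation from the
# core components. Values below are placeholders, not a real dataset.

def format_citation(creator, year, title, version, publisher, doi):
    return (f"{creator} ({year}). {title} (Version {version}) "
            f"[Data set]. {publisher}. https://doi.org/{doi}")

print(format_citation(
    creator="Example Research Team",
    year=2024,
    title="Example Survey on Data Sharing",
    version="1.0",
    publisher="Example Data Archive",
    doi="10.0000/example",
))
```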

The last talk is from Chris Mentzel, giving some insight and examples from the #MDCsummit #datacitation corpus hackathon earlier this week. It went well, and five projects came out of it.

It’s not production-level code, but the outputs of the MDC DCC hackathon are on GitHub: https://github.com/Make-Data-Count-Community/mdc-hackathon

GitHub - Make-Data-Count-Community/mdc-hackathon: Repository for September 4, 2024 MDC hackathon