⏳ One week left in the #OpenRefine 2025 Giving Campaign.

We’ve reached 51% of our $50K goal, thanks to early community support. Closing the remaining gap matters for planning 2026 activities and sustaining the project.

👉 Donate: https://openrefine.org/donate

👉 Store: https://store.openrefine.org

The QuickStatements 3.0 advanced editing workshop spent about an hour demonstrating, through real-world cases, the complete pipeline from data scraping to batch editing: starting with data from unstructured sources, cleaning and organizing it with OpenRefine's built-in features, preparing the output for the editing workflow, and finally handing it off to QuickStatements 3.0 to upload the data to Wikidata automatically. The overall process was clear, smooth, and highly efficient, as well as easy to follow; anyone with basic data-science concepts can pick it up quickly.
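The cleaning-and-wrangling step above can be sketched in a few lines of Python. This is a hedged illustration with invented sample rows, mimicking the kind of transforms OpenRefine applies (trimming whitespace, collapsing internal spaces, dropping duplicates); it is not the workshop's actual dataset or recipe.

```python
import re

def clean(value: str) -> str:
    """Trim and collapse whitespace, like OpenRefine's common transforms."""
    return re.sub(r"\s+", " ", value).strip()

# Hypothetical messy rows scraped from an unstructured source
rows = ["  National Taiwan  Library ", "National Taiwan Library", " Taipei  City "]

cleaned = []
for row in rows:
    c = clean(row)
    if c not in cleaned:  # drop exact duplicates after normalization
        cleaned.append(c)

print(cleaned)  # ['National Taiwan Library', 'Taipei City']
```

In OpenRefine itself these steps would be common transforms and a duplicate facet rather than hand-written code, but the logic is the same.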

Full report: https://diff.wikimedia.org/zh-hant/2025/12/08/%e7%b6%ad%e5%9f%ba%e6%95%b8%e6%93%9a%e5%8d%81%e4%b8%89%e5%91%a8%e5%b9%b4%e7%89%b9%e5%88%a5%e6%b4%bb%e5%8b%95%e5%9b%9e%e9%a1%a7-part-3/

#wikidata #openstreetmap
#QuickStatements #OpenRefine
#維基數據 #維基資料 #開放街圖 #活動 #event
#台北市 #中正區
#WikidataBirthday
#data #Database #dataset

✨ Last week our #BringYourOwnDataLab on #LinkedOpenData took place at @IEG in Mainz!

The first day offered numerous exciting talks and lively discussions. On the second day, practice-oriented hands-on sessions followed on topics such as #Wikidata, #OpenRefine, and #SPARQL. Participants received targeted support for their own datasets from our expert Martina Trognitz.
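For the SPARQL hands-on topic, a typical starter exercise looks like the following sketch: building a query against the Wikidata endpoint. The property (P31, "instance of") and the endpoint are real; the specific query is illustrative and not taken from the workshop materials.

```python
def instances_of(qid: str, limit: int = 10) -> str:
    """Build a SPARQL query listing instances (P31) of the given class."""
    return (
        "SELECT ?item ?itemLabel WHERE {\n"
        f"  ?item wdt:P31 wd:{qid} .\n"
        '  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }\n'
        f"}} LIMIT {limit}"
    )

query = instances_of("Q5")  # Q5 = human
print(query)
# To execute it, send the query to https://query.wikidata.org/sparql
# with an Accept header of application/sparql-results+json.
```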

➡️ Look forward to more workshops in the coming year.

#DigitalHumanities #Dataliteracy

Just published 😎: Hanna Meiners and I describe how we use @wikidata as a gateway to institutional knowledge at the DDK - Bildarchiv #FotoMarburg.

Meiners, Hanna-Lena, and Klaus Bulle. 2025. ‘Disrupting the Dust: Identifier Properties and the Future of Cultural Heritage Metadata in Wikidata’. Journal of Open Humanities Data 11: 65.

https://doi.org/10.5334/johd.412

#JOHD #externalIdentifier #OpenRefine #LinkedOpenData #LOD #openGLAM

RE: https://fosstodon.org/@OpenRefine/115650213949908618

On the importance of #GivingTuesday to #FLOSS:

In an alternate universe, #OpenRefine is still called Google Refine. It's a tab in Sheets, but you can't use it with sensitive data because that data would be used to train Gemini. It has plenty of staff, but it answers to stakeholders, not users. Money is good, and the team is never told that there's no funding for maintenance or that "we only foster development right now"; instead, there are daily reminders to integrate AI or be terminated.

🎉 We’re halfway there!

The #OpenRefine 2025 Giving Campaign has reached 51% of our $50K goal 🙌

Help us keep OpenRefine open, independent & free for everyone.

💎 Donate → https://openrefine.org/donate

👕 Shop → https://store.openrefine.org

#GivingTuesday #FLOSS #OpenSource

Here are the slides from today's @digiSberlin workshop "Umwege zu besseren LIDO-Daten. Workflows externer Normdatenanreicherung" (https://doi.org/10.5281/zenodo.17779761). It covered #reconciliation with #openrefine based on CSV and LIDO exports, using the still very beta #LIDORefineWeb (https://www.digis-berlin.de/lidorefine).
Umwege zu besseren LIDO-Daten. Workflows externer Normdatenanreicherung

Slides for the digiS workshop of 2 December 2025.

Zenodo

Thanks to everyone who joined today's workshop "Normdatenanreicherung von LIDO-Datensätzen"! ✨
For everyone else, the slides are now available on Zenodo: https://doi.org/10.5281/zenodo.17779761

@museum #LIDO #Normdaten #openrefine

Not actually the behaviour I'd expected: when adding URLs from #reconciliation results in #openrefine, "URLs" actually means (page) URLs, not the entity URIs, which I'd assumed would be what one usually wants. When reconciling, I usually want to create a link to the authority record, which is represented by the URI, not the page URL.
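The distinction can be made concrete with Wikidata as an example authority (the specific patterns below are Wikidata's; other authorities differ): from a reconciliation match ID like "Q42" you can build either the entity URI, which identifies the record itself, or the page URL, which points at the web page about it.

```python
def entity_uri(match_id: str) -> str:
    """Concept URI identifying the authority record itself."""
    return f"http://www.wikidata.org/entity/{match_id}"

def page_url(match_id: str) -> str:
    """URL of the human-readable page about the entity."""
    return f"https://www.wikidata.org/wiki/{match_id}"

print(entity_uri("Q42"))  # http://www.wikidata.org/entity/Q42
print(page_url("Q42"))    # https://www.wikidata.org/wiki/Q42
```

Inside OpenRefine, the matched ID is available in GREL as `cell.recon.match.id`, so a custom transform can assemble the URI directly rather than relying on the "add URLs" behaviour.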

On November 10, Wikidata Taiwan hosted the third and final episode of the special event celebrating Wikidata’s 13th birthday. For this closing session, we turned our attention to QuickStatements 3.0, exploring how its capabilities can elevate and empower our existing workflows.

We began with a familiar warm-up demonstration and spent about an hour showing how teamwork makes everything better. Using real-world subjects, we walked through the full workflow: extracting data from unstructured sources, cleaning and wrangling it in OpenRefine, preparing the output schema, and finally sending everything into QuickStatements 3.0 for automated batch edits to Wikidata. The process proved clear, efficient, and surprisingly accessible; anyone with a foundation in data science could pick it up with minimal effort.
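The final hand-off step can be sketched as generating QuickStatements V1 commands, which are simply tab-separated item / property / value lines. The command format is QuickStatements'; the rows below are invented for illustration (Q4115189 is the Wikidata sandbox item), not the workshop's actual batch.

```python
def qs_command(item: str, prop: str, value: str) -> str:
    """One QuickStatements V1 command line: item, property, value."""
    return "\t".join([item, prop, value])

# Hypothetical cleaned rows ready for batch upload
rows = [
    ("Q4115189", "P31", "Q3918"),    # item-valued statement
    ("Q4115189", "P1545", '"1"'),    # string values must be double-quoted
]

batch = "\n".join(qs_command(*row) for row in rows)
print(batch)
```

Pasting such a batch into the QuickStatements interface then performs the edits against Wikidata.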

Full report: https://diff.wikimedia.org/2025/11/26/wikidata-13th-birthday-taiwan-special-event-recap-part-3/

#wikidata #openstreetmap
#QuickStatements #OpenRefine
#維基數據 #維基資料 #開放街圖 #活動 #event
#台北市 #中正區
#WikidataBirthday
#data #Database #dataset