I am excited to share some brief news on the recent progress we made in pushing bibliographic data on all #Arabic #Periodicals published before 1930 and their editors to #Wikidata: we did it!

With a bit of SPARQL, one can now browse our data set from https://projectjaraid.github.io/ as a graph (https://tinyurl.com/jaraid-graph), a table (https://w.wiki/9rDP), or a map (https://w.wiki/9o3Z).
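For anyone new to the Wikidata Query Service: a minimal sketch of the kind of query behind such views could look like this. This is my own illustration, not necessarily the exact query linked above; it simply asks for Arabic-language periodicals with an inception date before 1930.

```sparql
# Sketch: Arabic-language periodicals founded before 1930
SELECT ?periodical ?periodicalLabel ?inception WHERE {
  ?periodical wdt:P31/wdt:P279* wd:Q1002697 ;  # instance of (a subclass of) periodical
              wdt:P407 wd:Q13955 ;             # language of work or name: Arabic
              wdt:P571 ?inception .            # inception
  FILTER (YEAR(?inception) < 1930)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "ar,en". }
}
ORDER BY ?inception
```

Paste it into https://query.wikidata.org/ and switch the result view to get the table or graph renderings for free.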

More detailed descriptions of our effort will follow in the form of blog posts (and potentially a longer thread).

Holding data beyond #HathiTrust, #OCLC, and the German #ZDB has not been pushed yet.

#PeriodicalStudies #MultilingualDH #ArabicPeriodicals #Arabic #Ottoman #Mahjar #الصحافة_العربية #DigitalHumanities #dh

Jara'id: A Chronology Of Arabic Periodicals (1800-1929)

I'm rather new to SPARQL and equal parts smitten by the power of the built-in visualisations and frustrated by the state of examples and documentation for more complex queries, at least for those of us not overly familiar with other query languages.

Anyhow, if you are interested in the most popular titles of Arabic periodicals until 1930, here they are: https://w.wiki/9nxE.

TL;DR: Reform (الاصلاح), Liberty (الحرية), The Nation (الوطن), The Morning (الصباح), Education (المعارف) take the crown.

#PeriodicalStudies #الصحافة_العربية #MultilingualDH #DH

My dive into #SPARQL and the #Wikidata environment continues and I just discovered some of the wonderful tools hosted on https://toolforge.org/.

Here is a map of all periodicals published in #Palestine (defined by a rectangular bounding box) before 1930: https://w.wiki/9u$o. Items on the map link to #Reasonator (https://reasonator.toolforge.org/), which provides an improved view of the linked data available from Wikidata.
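For those curious how such a bounding-box map is built: the query service has a dedicated `wikibase:box` service for this. The sketch below is my own illustration with rough, made-up corner coordinates around Palestine, not the exact query linked above.

```sparql
#defaultView:Map
# Sketch: periodicals whose place of publication lies inside a bounding box
SELECT ?periodical ?periodicalLabel ?placeLabel ?coord WHERE {
  ?periodical wdt:P31/wdt:P279* wd:Q1002697 ;  # instance of (a subclass of) periodical
              wdt:P291 ?place .                # place of publication
  SERVICE wikibase:box {
    ?place wdt:P625 ?coord .                   # coordinate location
    bd:serviceParam wikibase:cornerSouthWest "Point(34.2 29.4)"^^geo:wktLiteral .
    bd:serviceParam wikibase:cornerNorthEast "Point(35.7 33.4)"^^geo:wktLiteral .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "ar,en". }
}
```

The `#defaultView:Map` comment on the first line tells the query service to render the results as a map right away.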

#PeriodicalStudies #الصحافة_العربية #MultilingualDH #DH #DigitalHumanities


Also of relevance to the #multilingualDH crowd: #Wikidata's link shortener has a hard limit of 2000 characters. If your #SPARQL query contains non-ASCII characters, their percent-encoded representation in the URL quickly breaches that limit. The current workaround is to use another link shortener or the Query Chest at https://query-chest.toolforge.org/.
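To see why the limit is reached so quickly, note that each Arabic letter occupies two bytes in UTF-8 and therefore percent-encodes to six ASCII characters. You can check this in the query service itself (the title here is just an arbitrary example):

```sparql
# Each Arabic letter percent-encodes to 6 characters, a 6x blow-up
SELECT ?title ?encoded (STRLEN(?encoded) AS ?length) WHERE {
  BIND("الوطن" AS ?title)                  # 5 characters
  BIND(ENCODE_FOR_URI(?title) AS ?encoded) # 30 characters
}
```

So a query with a few hundred Arabic characters already exhausts most of the 2000-character budget on its own.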

@tillgrallert

This is amazing!

It has always been one of my pipe dreams to do something similar for published/digitized Arabic/Islamic texts. The folks at https://openiti.org/ have already created metadata for Arabic texts, now it just needs someone smart/willing to transform that metadata into visualizations/graphs 😉

Open Islamicate Texts Initiative


@kentoseth

Thank you!

One could definitely talk to them and propose submitting their bibliographic data to Wikidata. The same is true for all sorts of projects collecting metadata on Arabic manuscripts, such as #BibliothecaArabica's khizāna (خزانة) at https://khizana.saw-leipzig.de, which includes all of Ziriklī's data, or #Qalamos at https://www.qalamos.net/, which collects information on "Oriental" manuscripts in German holdings.


I finally managed to write up some of the thoughts that went into contributing the bibliographic information on Arabic periodicals and their known holdings to #Wikidata. I have published a preprint to Zenodo: https://zenodo.org/records/14112648. Feel free to share and to comment under this post.

#PeriodicalStudies #MultilingualDH #ArabicPeriodicals #Arabic #Ottoman #Mahjar #الصحافة_العربية #DigitalHumanities #dh #LinkedOpenData

Adding Every Arabic Periodical Published Before 1930 to Wikidata: Moving the Scholarly Crowd-Sourcing Project Jarāʾid to the Digital Commons

This paper documents the contribution of comprehensive bibliographic data on all Arabic periodicals published worldwide until 1930 to Wikidata. We discuss the need for such a data set, which originated with the scholarly crowd-sourcing project Jarāʾid and comprises information on more than 3000 periodicals, about 2700 editors, and almost 350 holding institutions; the weaknesses of our original approach; and how the move to Wikidata, the largest public and open knowledge graph, addresses them. We demonstrate how Wikidata also satisfies the two predominant use cases of this data set not currently served by available library and discovery systems. Finally, we show how the move to Wikidata generates continuous engagement with wider Wikimedia communities, and we demonstrate the reusability of our approach with a second data set of periodicals from the Ottoman Empire.


@tillgrallert

Wonderful news! Thanks for sharing and 1001 congrats!