Beloved programming community: many of you are hearing about the US DoJ threatening Wikipedia.

Some of you are thinking of ways to thwart this. Download the Wikipedia dumps and put them on IPFS, hand-couriered USB drives, or other less-censorable systems.

A good impulse, but missing the point.

Wikipedia is not just a big document or a software artifact.

Its true value is that it is effortlessly available to a wide audience and can be updated rapidly, with no preconditions to view or edit.

Many nerds dream about less-censorable distributed tech, and think a great event like this will finally make their dream relevant. Move Wikipedia over and the audience will switch!

The audience will not switch. Distributed networks with no chokepoints are possible, but are always inconvenient or insecure. The audience was already finding it more convenient to chat with AIs.

The audience may not even be allowed to switch! The government can easily influence device manufacturers.

It's certainly possible that a new knowledge-sharing paradigm could eventually bloom, one that's native to the properties of a distributed network.

But if you want to preserve the value of Wikipedia _today_, its connection to audiences _today_, you're not going to win by dodging it with clever tech.

You have to actually fight this.

@neilk Agree 99%.

Tiny detail: when will @wikipedia / #WikiMedia enable edits via @Tor and Apple’s #iCloud #PrivateRelay service? I haven’t been able to post/edit for a few years, thanks to this understandable but crude security measure. Nor, it seems, can 3–5% of editors 😢 https://www.ianbrown.tech/2024/03/31/1741/

@wikipedia @1br0wn @neilk Sadly likely only after implementing some invasive device tracking system. Anonymizing proxies get blocked because moderators otherwise feel they can’t keep up with blocking purposeful vandalism by known bad actors. https://en.wikipedia.org/wiki/Wikipedia:Advice_to_users_using_Tor

@bd808 @wikipedia @1br0wn @neilk I know. Annoying, as there are perfectly good privacy-preserving techniques which could do the job much better, using cryptography.
@wikipedia @1br0wn @neilk I think I would like to hear more about these techniques you envision. I spent part of my career inventing novel device fingerprinting techniques. For the last ~12 years I have been trying to atone for that by working on privacy-preserving open systems. Tools that assist in moderation at scale without requiring durable tracking tokens are very interesting to me.

@bd808 @wikipedia @1br0wn @neilk Excellent — then I’m very interested in hearing your thoughts too!

What you’re describing is significantly more ambitious than what I had in mind, which was simply a Tor/iCloud Private Relay user proving they were a given registered Wikipedia/media editor to Wikipedia servers, without revealing their actual IP address to Wikimedia Foundation…
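One plausible cryptographic building block for this is the blind signature, as used in schemes like Privacy Pass: the server signs a token without ever seeing it, so the token can later be redeemed over Tor without linking back to the account or its IP. A toy sketch with textbook RSA follows (deliberately tiny, insecure parameters; all names are mine, purely illustrative):

```python
# Toy sketch of an unlinkable "registered editor" token via textbook RSA
# blind signatures. NOT secure: real systems need large keys and proper
# padding (e.g. the RSA Blind Signatures scheme in RFC 9474).
import random
from math import gcd

# Server's toy RSA key (in practice: 2048+ bit).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # server's private exponent

def blind(msg: int):
    """Client blinds a token before sending it for signing."""
    while True:
        r = random.randrange(2, n)
        if gcd(r, n) == 1:
            break
    return (msg * pow(r, e, n)) % n, r

def sign(blinded: int) -> int:
    """Server signs while editing is allowed, without learning the token."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    """Client strips the blinding factor, leaving a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

def verify(msg: int, sig: int) -> bool:
    """Anyone can check the token against the server's public key."""
    return pow(sig, e, n) == msg

token = 42                          # client's random token (a hash in practice)
blinded, r = blind(token)
sig = unblind(sign(blinded), r)
assert verify(token, sig)           # later redeemed over Tor, unlinkably
```

The server never sees `token`, so when the token is later presented over Tor it cannot be correlated with the signing request, yet it still proves "some registered editor was issued this".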

@wikipedia @neilk @1br0wn That can be done today by using a registered account that has the global IP block exemption user right. The hard part of that for the user is that they have to have established a trusted user reputation to gain the right. That functionally requires a nontrivial history of contributing from an unblocked origin. https://meta.wikimedia.org/wiki/IP_block_exempt

@bd808 @wikipedia @neilk @1br0wn Indeed. Although when I inquired about this (and I had a non-trivial, though hardly large, registered editing history), it was not offered as an option.

A related approach, of course, is to grant incredibly limited editing abilities, with associated warnings to page reviewers, for new “private” accounts. Then let them slowly build up trust over time through many known-good edits.
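A minimal sketch of what such a trust-earning policy might look like (the tiers, thresholds, and capability names here are all invented for illustration, not anything Wikimedia actually implements):

```python
# Toy trust-earning policy for new "private" (proxy-origin) accounts:
# capabilities unlock as accepted edits accumulate, and reverted edits
# cost more trust than accepted ones earn.
from dataclasses import dataclass

# Hypothetical capability tiers, keyed by minimum trust score.
TIERS = [
    (0, {"suggest_edit"}),                                  # queue-only suggestions
    (10, {"suggest_edit", "edit_talk"}),
    (50, {"suggest_edit", "edit_talk", "edit_article"}),
]

@dataclass
class PrivateAccount:
    accepted_edits: int = 0
    reverted_edits: int = 0

    def capabilities(self) -> set:
        # Reverts weigh 5x, so vandalism quickly drops an account to nothing.
        score = self.accepted_edits - 5 * self.reverted_edits
        caps = set()
        for threshold, tier_caps in TIERS:
            if score >= threshold:
                caps = tier_caps
        return caps

acct = PrivateAccount(accepted_edits=12)
assert "edit_talk" in acct.capabilities()
assert "edit_article" not in acct.capabilities()
```

The point of the asymmetric weighting is that a determined vandal burns accumulated trust far faster than they can rebuild it, which is what would keep the moderation burden bounded.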

@neilk @1br0wn @wikipedia What seems like a reasonable editing limitation to you in this trust-earning idea?

The currently stated problem here is that vandalism by determined attackers is too time-consuming to moderate when admins cannot ban a “device” from creating new accounts or making anonymous edits. For a trust-earning path to avoid this perceived burden, I guess it would need some separation from typical vandal fighting.

@1br0wn @neilk @wikipedia Ideas like @cscott’s isolated queue of edit suggestions rather than direct edits seem possible, but I wonder how we could estimate the benefit versus the new moderation effort. We would be building something with new technical and social costs that would ideally be outweighed by the contribution gain. https://kolektiva.social/@cscott/114411726308033280

@1br0wn@eupolicy.social @bd808@mastodon.social @wikipedia@wikis.world @neilk@xoxo.zone my contribution back in 2018 was https://en.wikipedia.org/wiki/File:Wikimania_2018_-_Edit_Conflicts,_Offline_Contributions,_and_Tor.pdf

@bd808 @1br0wn @neilk @wikipedia as far as I'm concerned it's solved as a /technical/ problem; what needs to be done is to solve the /social/ problem of nurturing a community of folks willing to babysit the queue. What will motivate that community? Perhaps it will be locality or national pride (a queue for expats to help folks in their native country edit around restrictions), or a pseudonym or social credit system (I'm motivated to help merge "Tom's" edits, despite not knowing who/where Tom is in real life, because of Tom's history of excellent contributions), or pure altruism, or ????

The interesting technical problems to me are the ones that would enable/motivate a particular social solution to the community building problem.