We are approaching 1000 #FriendsofGNOME! 🎉🎉🎉
Who is joining us to celebrate #GNOME 50?
In 2015 I was on a beach in Hawai'i helping build the prototype of what became Signal. I argued that the app needed pseudonyms because abusers know their victims' phone numbers. I lost the fight that day. History proved me right, and Signal would move to usernames under @Mer__edith's stewardship.
In this new essay, I trace the line from Barlow's Declaration of Independence of Cyberspace through smart-home forensics, metadata killings, and Archive Team's non-consensual Tumblr scrape to ask: when did we decide that a jpeg is a photograph, that a profile is a person, that storage is memory?
The answer involves a boat off Honolulu, the early days of Signal, Iran's missiles over Amazon's Dubai AWS facilities, and the communities already building for a world where the server goes dark. This is an essay about infrastructure, memory, archiving without consent, and what we lose when we mistake the filesystem for memory.
It is also the angriest and most personal text I've ever written. I'm furious, and you should be too. We bet an entire civilisation on a brutal and unreliable stack. Now, fate has come to collect that wager.
California has a lot to fucking answer for.
https://newdesigncongress.org/en/pub/who-will-remember-us-when-the-servers-go-dark/
الإمبراطوريات لا تحرر
يستبدلون قفص واحد بآخر
Empires do not liberate. They replace one cage with another.
"Capable LLMs require a logic of dominance and of disregarding consent of the people producing the artifacts that are the raw material for the system. LLMs are based on extraction, exploitation and subjugation. Their politics is violence. How does one “liberate” that? What’s the case for open source violence?"
(Original title: Acting ethically in an imperfect world)
https://tante.cc/2026/02/20/acting-ethical-in-an-imperfect-world/

Life is complicated. Regardless of what your beliefs or politics or ethics are, the way that we set up our society and economy will often force you to act against them: You might not want to fly somewhere but your employer will not accept another mode of transportation, you want to eat vegan but are […]
People pontificating about whether codebases containing LLM-generated code are subject to IP protection all seem to be forgetting the key point that the law always sides with capital
When big media decided that pirating an mp3 file should be a criminal (not civil) offence, the law sided with them
When big tech decided that pirating every piece of media on the internet for AI training was fair use, the law sided with them
"AI is built on the collective knowledge of humankind."
No. Nononononono. It is not built on _knowledge_, it is built on _data_. And not everyone's experiences are available as data; many communities are excluded. Also: "collective" implies some sort of collaboration and shared activity. But "AI" is just accumulation by a powerful few.
So No. It's not collective but extractive, not knowledge but data, not humankind but the hegemonic western view. Everything in that statement is wrong.
i don't really want to hear anymore about how ai "works for me" or "doesn't work for me" or anything like that
this is conceding the framing of the debate on totally self-centered terms and ignoring the massive societal effects of this hideous technology
this is how capitalism trains you to think, and it's wild to see how many people still have these individualism brainworms even when they can clearly see the societal cost, and even when it impacts them specifically