https://daringfireball.net/linked/2026/03/13/grief-and-the-ai-split
CompSci Teacher / Husband / Dad / Inveterate traveller / Bad jokes / Vegan / Slightly terrified of machine learning taking over the art world
He/Him
@Doomscroll I think that people who make things have a sense that doing it right, to the very best of their ability, without cutting corners, is the path to a profound respect for other people, which we all need (and which seems to be in such short supply these days).
It’s also the road to gratitude, to telling the universe we are grateful to be alive and grateful for humanity, and we want to contribute our most inspired and careful work so we can say thank you, over and over.
«The moment was absurd but revealing; the university wasn’t resisting bullshit education, it was onboarding it. Education at its best sparks curiosity and critical thought. “Bullshit education” does the opposite: it trains people to tolerate meaninglessness, to accept automation of their own thinking, to value credentials over competence.»
An excellent read on how and why university administrations push the use of "AI", and the impact that has on learning.
https://www.currentaffairs.org/news/ai-is-destroying-the-university-and-learning-itself
The Guardian missed out by not saying the Turkish football federation has suspended 1 kibiplayer.
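(For anyone who doesn't get the joke: "kibi" is the IEC binary prefix for 2^10 = 1024, and the suspension reportedly covered 1,024 players. A throwaway sketch of the arithmetic:)

```python
# "kibi" (Ki) is the IEC binary prefix for 2**10 = 1024,
# so 1,024 suspended players is exactly 1 kibiplayer.
suspended_players = 1024
assert suspended_players == 2**10  # 1 Ki
print(f"{suspended_players} players = {suspended_players / 2**10:g} kibiplayer")
```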
The Python Software Foundation shows more spine in a single decision than every tech giant combined.
> Diversity, equity, and inclusion are core to the PSF’s values
https://pyfound.blogspot.com/2025/10/NSF-funding-statement.html
Let’s talk about AI art.
Casual thoughts about memory and our misconceptions about it, based on the last few weeks of seeing a LOT of lay mental models about memory and cognition:
- It is well known in the psychological and cognitive sciences that memory is not like a man-made recording device. Your memory is not a tape recorder, nor is it analogous to computer memory. Nevertheless, people constantly insist on using computer metaphors for memory. Just know that this is widely regarded as inaccurate.