"They Drank Our Milkshakes"
A post I've written about AI scraping swarms and how they negatively impact hobby developers such as myself.
https://whateverthing.com/blog/2026/03/23/they-drank-our-milkshakes/

I've published a new blog post, "Human Creations", on the difference between content generation by LLMs and the creation of text, art, and code by humans.
You can find it at https://derickrethans.nl/human-creations.html or at @blog

LOL this is the problem with relying on AI tools, as well...
"...His core argument: Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As he puts it, an unreliable machine keeps you alert, and a perfect machine needs no oversight, but one that works almost perfectly creates a trap where drivers trust it just enough to stop paying attention.
The research backs this up. Psychologists call it the "vigilance decrement": monitoring a nearly perfect system is boring, boredom leads to mind-wandering, and drivers need 5 to 8 seconds to mentally reengage after an automated system hands control back. But emergencies unfold faster than that...."
I tasked an AI agent with the implementation of an algorithm from a research paper. 15 minutes later: clean code, green tests, plausible visualizations. Hours later: I'm still not sure if it's correct.
What happens when AI generates code faster than you can understand the domain?
https://phpunit.expert/articles/faster-than-understanding.html?ref=mastodon

One of the promises of AI is that it can reduce workloads so employees can focus on higher-value and more engaging tasks. But according to new research, AI tools don't reduce work; they consistently intensify it. In the study, employees worked at a faster pace, took on a broader scope of tasks, and extended work into more hours of the day, often without being asked to do so. That may sound like a win, but it's not quite so simple. These changes can be unsustainable, leading to workload creep, cognitive fatigue, burnout, and weakened decision-making. The productivity surge enjoyed at the beginning can give way to lower-quality work, turnover, and other problems. To correct for this, companies need to adopt an "AI practice": a set of norms and standards around AI use that can include intentional pauses, sequencing work, and adding more human grounding.
There is an assumption that joy and work are different things.
If a task at work is extremely boring or annoying, most people and most bosses treat it as an "oh well, that's life" situation.
I think that's fundamentally flawed.
I also don't think joy means making things easy. Some of the most delightful and creative experiences I've ever had came from working within constraints.
1/
Software development is fundamentally a practice that removes friction & risk from human processes.
To remove friction from software development itself by loading its output with risk is to misunderstand that purpose. #AI
Some might call it laziness. I don't think so.
The engineer's cycle:
Boring task → automate it → automation becomes interesting → rabbit hole → emerge with a gigantic backlog → repeat
Artists don't thrive on 9-5. Neither do engineers who think like this.