The first ten minutes I spent on social media this morning stirred up all kinds of feelings. Why are people who routinely use LLMs so loud and brash and proud, making these tools seem essential and inevitable?

A post by a dev whose app I use said something along the lines of: "no use exercising your coding skills, AI is too good now, you can't compete with it anyway".

Another user, on an instance I try to engage with, wrote - literally: "tired of overthinking every decision?" - and then announced he had created an AI that will "run a weighted decision matrix so you don't have to." In all seriousness.
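For what it's worth, a weighted decision matrix is a few lines of arithmetic anyone could write themselves - no AI needed. A minimal sketch (the options, criteria, and weights below are made up for illustration):

```python
# Rate each option against each criterion (1-10), then weight and sum.
options = {
    "laptop A": {"price": 7, "battery": 9, "repairability": 4},
    "laptop B": {"price": 5, "battery": 6, "repairability": 9},
}
weights = {"price": 0.5, "battery": 0.2, "repairability": 0.3}

# Weighted score per option: sum of rating * weight over all criteria.
scores = {
    name: sum(ratings[c] * weights[c] for c in weights)
    for name, ratings in options.items()
}
best = max(scores, key=scores.get)
print(scores)
print("pick:", best)
```

The "thinking" part - choosing the criteria and the weights - is exactly the human judgment the matrix can't do for you.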

What is this dystopian world where human qualities are devalued, critical thinking is discarded and surveillance capitalism is ignored at the altar of AI worship?

If they can be loud and proud, maybe I can be too... but in the opposite direction.

This weekend I will start MIT's Missing Semester class (the 2020 lectures, so pre-AI) because in this brave new world hyping up techno-fascist LLMs, knowing the basics of code is essential IMHO.

So my March "project" will be a deep dive into MIT's Missing Semester, and my April project will be off-grid mesh radio communication.

What about you, what are you doing to resist?

Special props to @emilymbender @cwebber and @tante for being outspoken on these issues... you're my beacons of hope

#NoAI

@elena @emilymbender @cwebber @tante

I completely understand your frustration. It’s true that AI has made the sheer act of "writing code" incredibly easy, but those claiming coding is dead are missing the entire point of software engineering. Writing code is merely the most basic layer of any development project.

If you look at how top-tier dev teams (like at Google) operate, the very first thing they do isn't writing code. It's System Architecture Design. How will the system run? How do the components interact to achieve the ultimate goal? They spend massive amounts of time deeply conceptualizing the core workflows.

Once that overarching architecture is established, everything else—regardless of what programming language, algorithm, or AI tool you use—is just flesh serving the skeleton. Code simply executes the workflow that has already been defined.

This is exactly why AI cannot replace human engineers right now. AI generates code based on your instructions. And those instructions are actually the distillation of your core architecture and thought process. The AI is just an executor; it cannot invent a complex workflow from thin air.

The scarcest and most valuable skill right now isn't writing syntax; it's the ability to conceptualize a complete project and its entire logical workflow from scratch. That is where the true human value lies, and that is the real core intellectual property (IP).

Learning the foundational layers like the MIT Missing Semester is exactly what builds this architectural mindset. Keep pushing!

@LucasAegis @elena @emilymbender @cwebber @tante One of the things top-tier devs at Google are doing *today* is using GenAI to reverse-engineer the design and architecture of legacy systems. Big systems, 100k+ LoC, where the original authors have long since moved to other teams and the expertise on the internals is gone. GenAI is very helpful in reviewing and synthesizing that legacy code, giving reasonable answers about system components and data flows. It needs to be an iterative process, of course, with a senior engineer really thinking about the hypotheses the LLM generates, validating them against the code, and steering appropriately. But damn if it isn't effective at helping create real human understanding.