Recently, I mused over what it would take, from my perspective, to significantly change my view that the tech industry's infatuation with non-intelligent "intelligence" is a net-negative for society.

https://pythonbynight.com/blog/what-does-it-take

Below are a few choice quotes from my post (a sort of TL;DR).

1/

#LLM #AI


The leaders of this technology are categorically unethical and detached from society, and I believe their leadership is taking us into a xenophobic future only fit for technocrats subsisting off of slave labor.

2/

Deceptive designs that profit off of anthropomorphism, and dark patterns used to gather private data should be outlawed. (This would have the added benefit of also crippling the predatory ad-tech industry.)

3/

I would need to see a transparent attempt to compensate the "humans in the loop" with salaries commensurate to the tasks that they are asked to perform, as well as benefits for any mental health strain or other risks associated with these tasks.

4/

Explicit regulation should prevent for-profit companies from proliferating their tools into the educational sector without any form of oversight.

5/

And one would hope that the tech companies that facilitate the generation of CSAM would be extremely eager to discourage, prevent, or disallow the creation of said content.

In the case of some companies, they are not merely passive about this; they are actively encouraging it.

6/

...existing in a world where tech companies have access to my content (or any content) and can swallow it up wholesale without explicit consent is utterly demoralizing.

7/

For those who think LLM tools will allow people to be more creative, given the extra power and resources at their disposal: I'd say just take a look at how they're being used now.

Create bland essays. Answer bland emails. Write bland README docs. Produce bland code.

There are no sharp edges.

8/

@pythonbynight Very much this, I've been referring to our current society as a cyberbland dystopia. I think the blandness of it all is also why there's less resistance to it than I feel there should be, because it doesn't feel offensive enough, or rather, the offensiveness is covered up by just how bland and boring it all is.


@ainmosni Yeah. I have a sense (though no proof) that these systems are very appealing both to individuals who fit within a certain "average" (culturally or demographically) and to those who feel outside the status quo and would like to blend in as much as possible.

In both cases, the user gets the sense that their usage elevates their usefulness/utility, and perhaps they are drawn into a somewhat warped hero complex ("I can now do so much that I could never do before...")

That's obviously a generalization, and there's likely lots of room for nuance there...

But these systems certainly do not celebrate the jagged edges... the things that make us unique/outsiders.

It will smooth away your bad grammar, your colloquialisms, your tics, your differentiators. And if you try to coerce it to be "more like you," what you get back is just an amalgamation of something else...