I am encountering more and more "helpful" people that AI-splain answers.
"I heard that you asked about xyz. I don't know the answer, but I asked chatgpt for you, and it said...."
WTF? Just keep quiet, if I wanted a slop answer I could ask the slop-o-mat myself!
@tante
I realize that the only way I could respond to that would be: "cool, but I didn't ask chatgpt. What's your response?"
Then I could watch their brain short-circuit having to internalize and paraphrase chatgpt. Or at least, they'd be leery to mention that ever again.
@tante In the movie "A Soldier's Story," the black Captain Davenport visits the white camp commandant's home. The commandant's wife brings the commandant a drink on the hot day, and solicitously offers to "have someone bring you a drink" to Davenport.
Yes, she maintains propriety towards an officer reporting to her husband, but no, she absolutely will not personally hand a drink to a black man.
Same energy.
For me, the absolute worst context is a work discussion where someone says this. To me, it communicates "I don't know this subject, and now it's your job to figure out if this tracks with reality."
I've made it a personal policy to ignore whatever follows those words _and_ to tell the speaker that I'm doing so.
Admitting to self-lobotomizing your life.
When the colleague whose expertise you value answers a very specific question from his own field with nothing but a screenful of copy-and-paste from Copilot... It's the same as when someone asks a question here and someone else plasters an AI answer underneath it. Is there a word for that yet? Especially for the feeling of disappointment?
@tante Agreed
I explicitly note in emails that there was no use of AI/LLMs in any way, just to make it clear
The tech industry spent years and billions to create real world products to help businesses get ahead.
https://futurism.com/future-society/data-center-trump-stargate
Now the entire industry has been subsumed into furthering fossil fuel funded fascism, scams, & state surveillance data harvesting.
https://www.theguardian.com/technology/2026/mar/23/mps-urge-uk-government-halt-palantir-contract-fca
https://www.sfchronicle.com/tech/article/project-2025-oracle-19654875.php
1/
2/
These organizations are ignoring the tech needs of the real world economy in favor of creating a global financial collapse instead.
They act as if corrupt government contracts, Goebbels-style propaganda, Wall Street fraud, & oil oligarchy are the only niches available for the future.
https://moderndiplomacy.eu/2025/12/18/how-globalized-fraud-made-its-way-into-silicon-valley/
https://www.nytimes.com/2023/04/15/business/silicon-valley-fraud.html
https://www.cnn.com/2025/07/22/tech/openai-sam-altman-fraud-crisis
Their end-times head-space is profoundly harmful.
https://wlockett.medium.com/peter-thiel-just-revealed-how-utterly-screwed-the-entire-ai-industry-is-df7a6e4d5d60
https://www.wired.com/story/the-real-stakes-real-story-peter-thiels-antichrist-obsession/
3/
Improved features for profitable mainstream applications are being sacrificed for spyware features that are loathed.
https://futurism.com/artificial-intelligence/microsoft-screwed-up-windows-11-copilot
They're even willing to sink their own stock to get AI implemented.
https://247wallst.com/investing/2026/03/23/melius-analyst-microsofts-copilot-reorganization-is-a-red-flag/
The people who developed those features were laid off & not replaced to make more room on the balance sheet for a flood of AI money from the fossil fuel industry.
Koch supports AI solely for partisan gain.
https://www.forbes.com/sites/mattdurot/2025/07/17/bill-gates-charles-koch-and-three-other-billionaires-are-giving-1-billion-to-enhance-economic-mobility-in-the-us/
@tante Why? Because it suggests you are not smart enough to do something? Or some other reason?
I guess it is transparent. It provides help (probably the best that person can give), the source, and instructions for getting it yourself in the future. If you don't like it, you can just ignore the answer. Or the person.