Since I haven't posted it in a few years... Behold! The greatest video art piece ever made: Björn Melhus' No Sunshine (1997).

I make music when I can. I own too many synthesizers. I program C++ for money, but it's usually reasonably fun.
he/him or they/them, as you prefer.
| My music on Bandcamp | https://arcseconds.bandcamp.com/ |
| Discography | https://arcseconds.net |
| Photography | https://visible.pictures/ |
I think something that's worth highlighting is that both communities are concerned with empowerment and disempowerment. And I tend to think these tools *appear* empowering, but are actually disempowering, in their current configuration.
I don't believe LLMs are fundamentally disempowering; they could be part of an empowering future. But the present *industrial deployment* of AI tech within our *socio-economic environment* is net-disempowering. And I worry that there is a big rush to adopt with so little settled about the legal implications on the one side, and with *well-known* problems for AI-generated code on the other.
Not all AI coding usage is necessarily doomed to be a problem: using local models to "lint" or to discover vulnerabilities and bugs is probably very good, in the way that having fuzzers is good. But there is so much pressure to adopt beyond the space of what's good, and to dismiss real concerns, that I worry it is going to take a long time to undo the damage.
To make coding LLMs work better, it often helps to provide more information about the code being worked on. Since LLMs operate on the human languages they were trained on, the best thing to do is to provide plain-English text describing the code.
Ultimately the software industry has invented the most fucked up method to force software engineers to document their code. All this text intended for LLMs is often useful to humans, too.
Depressing fact I just learned: when pumping oil from a well, the oil industry flares gas (burns it off into the atmosphere) when they don't think it's economically viable to put it in a pipeline for later use.
In 1990 the Alberta, Canada oil industry flared enough gas to heat the entire city of Calgary for that year.
Cool cool.