If Codeberg is trying to "compete" against GitHub and GitLab, why does it refuse to take a look at AI assistants? Apart from infringing on authors' rights and questionable output quality, we think that the current hype wave led by major companies will leave a climate disaster in its wake: https://disconnect.blog/generative-ai-is-a-climate-disaster/

Other _sustainable_ (and cheaper!) ways to increase efficiency in software development exist: in-project communication, powerful automation pipelines, and reducing boilerplate.


@Codeberg
> Other _sustainable_ (and cheaper!) ways for increasing efficiency in software development exist

This is the big thing for me even if I ignore everything else that's an issue. I've tried to use AI code assistants multiple times and I can't shake the feeling that getting good with them would be a lot of work for not nearly as much payoff, when I know there are infrastructure things I could work on that would make me WAY more pawductive and that wouldn't have those downsides >w<

@foxyoreos @Codeberg i tried copilot during the free technical preview, and some time more recently tried a copilot-like that was self hosted and ran on my own GPU (copilot would have been a lot more effective but i wasn't gonna give them money just to try it)

and quite honestly, i just found it more distracting than useful, it just made me less effective

when i had it i'd often end up waiting for it, when more often than not it honestly would have been faster to just write it myself, instead of waiting for it to generate, and then reading it to make sure it's actually what i want, and potentially tabbing through alternatives or trying to prompt it if it's not what i want...

a lot of the time i already knew what i was trying to do and it was just a distraction, sometimes i'm trying to do something new (eg the reason i'm writing it myself in the first place is that i'm doing something existing solutions don't) and it would fail to grasp that and be actively unhelpful (by clinging to assumptions from the more common method that i was trying to deviate from), and in the times i actually don't know what i'm doing (the situation where "help" in theory could be most applicable) my problem is usually more big picture than anything a code autocomplete AI is going to be any help with (and even if it could, usually my struggle is more with understanding the problem and wrapping my brain around it than actually implementing it; even if i wanted to make the magic box barf out an implementation i'd still need to understand the situation well enough to explain it to the magic box, and if i understand enough for that i understand enough to start doing it myself)

despite being someone with quite bad motivation who you'd think could benefit from the "help" i found it just wasn't helpful, i think i'd benefit more from improved code search/navigation and documentation

sure maybe it's just that i'm not using the AI "correctly", but for the effort of learning to use AI more effectively i could already be well on the way to just doing it myself, and honestly if i'm programming, i'd rather be programming, not trying to understand and review code generated by a magic box (it has a much lower energy consumption to be doing it without the magic box too)

@delta @Codeberg

> i think i'd benefit more from improved code search/navigation and documentation

This, I have this thought almost every time I try to use one. Maybe I could get better, but...

> sure maybe its just that i'm not using the AI "correctly" but for the effort of learning to use AI more effectively...

Yep, that's the catch. I'm supposed to invest a lot of time into learning something that's constantly changing. There's other stuff to learn.

@delta @Codeberg I feel weird saying that because I know it's a trope to say "the thing I don't like also isn't as good as everypony says", but literally I don't think I've had a coding session where an LLM was more than neutral. Could just be me, I'm not saying nopony at all could find them helpful, but like...

There are a lot of things I could do to improve my code, I don't see why I should give LLMs a special investment.

@delta @Codeberg Even ignoring all the other stuff, idk

It just feels especially weird to look at the energy cost and be left thinking, wait a sec, we're going to destroy the planet for *this*?

@delta @Codeberg

> despite being someone with quite bad motivation who you'd think could benefit from the "help" i found it just wasn't helpful

Also lol I get what you mean and that's a really good point. I could benefit a lot from somepony just getting me to write code, bouncing ideas off of them. Even if it wasn't good at writing code, at least it could do that!

And it's surprisingly bad at doing that in a lot of cases X3