RE: https://mstdn.social/@hkrn/116284264915152671

lol oh my god i feel **so fucking smug** right now, it's incredible. my whole body is tingling.

i was using this package in one of my projects. i found it had a bug, and when i went to maybe try to make a contribution to the open source repository, i found it to be a huge shitpile of vibe-coded mess. methods that were thousands of lines long with **hundreds** of arguments. it was impossible to review, and **very** alarming. it was clear to me that no one was watching the shop, so i immediately set about removing it from my project. and now, this. 🤗
there are **tons** of AI-related projects that use LiteLLM. it is a key part of the basic infrastructure of LLM-based development. if you use an LLM-based project, there is a good chance it uses LiteLLM.
(if you're curious, it does this very useful thing of standardizing LLM APIs into a single format. makes it easy for your app to switch between Anthropic, OpenAI, Google, z.ai, etc.)
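(to make that concrete, here's a rough sketch of the pattern. per LiteLLM's docs, `litellm.completion` takes an OpenAI-style model string and messages list, and only the model string changes per provider. the actual call is commented out so this runs without API keys; the model names are illustrative, not guaranteed current.)

```python
# One request shape, many providers: only the model string changes.
messages = [{"role": "user", "content": "Hello"}]

# Illustrative model identifiers -- check LiteLLM's provider docs for
# the current strings, these may be outdated:
models = [
    "gpt-4o",                      # OpenAI
    "claude-3-5-sonnet-20240620",  # Anthropic
    "gemini/gemini-1.5-pro",       # Google
]

# Sketch of the actual usage (needs API keys, so commented out here):
# from litellm import completion
# for model in models:
#     response = completion(model=model, messages=messages)
#     print(response.choices[0].message.content)
```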
this is actually a huge reason i have decided not to jump into LLM and AI agent-related development. the ecosystem is (as you would expect) run and maintained by people who are all-in on vibe coding, so a package you might like and include in your project could easily become a dangerous, unmaintainable mess within months. i don't know if people understand how brittle the whole thing is. everything is constantly, **constantly** changing.
like, it's moving **way** too fast for anyone to be able to tell if things are going to break or get injected with some malware. the whole thing is a house of cards built on top of a bomb.
oh my fucking god.
@peter is wrapping a vibe-coded mess into a package so it looks reasonable the new sub-prime mortgage?
@NaN @peter You put it in a container and it’s a collateralized technical debt obligation.
@mathew @peter I dread to think what we call it once it's wrapped in a container orchestrator
@NaN @peter An Automated Insecurity Generator, or AIG for short?