OpenAI's move to allow generating "Ghibli style" images isn't just a cute PR stunt. It is an expression of dominance and the will to reject democratic values. It is a display of power.

https://feddit.org/post/9946957


I say this as someone who frequently uses generative ai, and actively chooses to pay for the service.

Fuck openai.

This company has utterly failed to fulfill its mission statement, and it will be unable to make things right by humanity until ALL software it has created is available to the public as FOSS (free and open source software). OpenAI claimed that this is exactly what it was going to do, and then it just didn't. So fuckem.

Have you heard of ollama? You can run deepseek and stuff locally super easy. I know it’s not a complete replacement, but it feels nice to use an LLM guilt free. I’ve compared the 14b distilled model from deepseek vs the paid version of ChatGPT and it made me cancel my account.
What do you use to run it locally? If there were something that could do speech to text reliably, I'd consider switching to an open source option.

FWIW speech to text works really well on Apple stuff.

I’m not exactly sure what info you’re looking for, but: my gaming PC is headless and sits in a closet. I run ollama on that and I connect to it using a client called “ChatBox”. It’s got an RTX 3060 which fits the whole model, so it’s reasonably fast. I’ve tried the 32b model and it does work, but slowly.

Honestly, ollama was so easy to setup, if you have any experience with computers I recommend giving it a shot.
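For the curious, here's a minimal sketch of what a client like ChatBox is doing under the hood: ollama exposes an HTTP API (by default on port 11434), and a client just POSTs JSON to it. The model name and the IP address below are placeholders, not from the thread.

```python
import json
import urllib.request

# ollama's default generate endpoint; swap localhost for the headless
# machine's LAN IP (e.g. "http://192.168.1.50:11434/...") to use it remotely.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to a local or remote ollama server, return the reply."""
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

So "running it in a closet" really is just a web server you talk to from any machine on the network.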


Yeah, I think the Apple speech to text is pretty decent, but ChatGPT uses the Whisper API to return the text, and it just seems a lot more reliable, especially when it comes to understanding random words in context.

How much VRAM do you have on the 3060 to be able to fit the whole thing on the GPU?

True. Honestly, Apple's software is just getting worse by the day. It's sad.

It’s the version with 12 GB of VRAM. I use it to game though. If you want a real GPU for this, I hear the Tesla P40 is the best.
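As a rough rule of thumb (my numbers, not from the thread): a model's weight footprint is parameter count times bytes per weight, plus extra for the KV cache and runtime. Ollama's default quantizations land around 4 bits per weight, which is why a 14B model squeezes into 12 GB but a 32B one spills out of VRAM and runs slowly.

```python
def model_vram_gb(params_billion: float, bits_per_weight: float = 4.0) -> float:
    """Rough weight-only VRAM estimate in GB (ignores KV cache and overhead)."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# 14B at ~4-bit: ~7 GB of weights -> fits a 12 GB RTX 3060 with headroom
print(round(model_vram_gb(14), 1))  # 7.0
# 32B at ~4-bit: ~16 GB of weights -> too big for 12 GB, hence the slowdown
print(round(model_vram_gb(32), 1))  # 16.0
```

This is weight-only arithmetic; real usage is higher once context length and the KV cache are factored in.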