Docan: an open-source AI app for Linux with a Glass UI, multi-provider support (Gemini, ChatGPT), and powerful YAML playbooks. Responsive, secure, and local! #LinuxAI #Docan #OpenSource #AIChat
Both Ubuntu and Garuda (Arch-based) are a good fit for local AI. Ubuntu LTS is stable, with solid NVIDIA GPU support via "Additional Drivers". Garuda comes pre-tuned for high performance, with an optimized kernel and bundled AI libraries. Pick Ubuntu if you want ease of use and broad community support; pick Garuda if you want an out-of-the-box AI/ML configuration. Both work well with your RTX 3050. #LinuxAI #LocalLLM #Garuda #Ubuntu
https://www.reddit.com/r/LocalLLaMA/comments/1oh079j/any_linux_distro_better_than_others_for_a
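Once the proprietary driver is installed (on Ubuntu, via "Additional Drivers"), you can confirm how much VRAM the card exposes before sizing a model. A minimal sketch, assuming `nvidia-smi` is on the PATH; the parsing helper is mine, not part of any tool:

```python
# Sketch: read total VRAM on an NVIDIA card (e.g. the RTX 3050 mentioned
# above) by querying `nvidia-smi`. Requires the proprietary NVIDIA driver.
import subprocess

def parse_vram_mib(line: str) -> int:
    """Parse one line of `nvidia-smi --query-gpu=memory.total
    --format=csv,noheader` output, e.g. '8192 MiB' -> 8192."""
    return int(line.strip().split()[0])

def total_vram_mib() -> int:
    """Return total VRAM of the first GPU, in MiB."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_vram_mib(out.splitlines()[0])

print(parse_vram_mib("8192 MiB"))  # 8192
```

The same query works identically on Ubuntu and Garuda, since `nvidia-smi` ships with the driver rather than the distro.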
Hi everyone! 👋
Questions for the community:
Does anyone have experience with these GPUs? Which one would you recommend for running larger LLMs locally?
Are there other budget-friendly server GPUs I might have missed that are great for AI workloads?
Any tips for building a cost-effective AI workstation? (Cooling, power supply, compatibility, etc.)
What's your go-to setup for local AI inference? I'd love to hear about your experiences!
Thanks in advance! 🙌
#AIServer #LocalAI #BudgetBuild #LLM #GPUAdvice #HomeLab #AIHardware #DIYAI #ServerGPU #UsedTech #AICommunity #OpenSourceAI #SelfHostedAI #TechAdvice #AIWorkstation #MachineLearning #AIResearch #FediverseAI #LinuxAI #AIBuild #DeepLearning #ServerBuild #BudgetAI #AIEdgeComputing #Questions #CommunityQuestions
Hey everyone 👋
I’m diving deeper into running AI models locally—because, let’s be real, the cloud is just someone else’s computer, and I’d rather have full control over my setup. Renting server space is cheap and easy, but it doesn’t give me the hands-on freedom I’m craving.
So, I’m thinking about building my own AI server/workstation! I’ve been eyeing some used ThinkStations (like the P620) or even a server rack, depending on cost and value. But I’d love your advice!
My Goal:
Run larger LLMs locally on a budget-friendly but powerful setup. Since I don’t need gaming features (ray tracing, DLSS, etc.), I’m leaning toward used server GPUs that offer great performance for AI workloads.
Questions for the Community:
1. Does anyone have experience with these GPUs? Which one would you recommend for running larger LLMs locally?
2. Are there other budget-friendly server GPUs I might have missed that are great for AI workloads?
3. Any tips for building a cost-effective AI workstation? (Cooling, power supply, compatibility, etc.)
4. What’s your go-to setup for local AI inference? I’d love to hear about your experiences!
I’m all about balancing cost and performance, so any insights or recommendations are hugely appreciated.
Thanks in advance! 🙌
@[email protected] #AIServer #LocalAI #BudgetBuild #LLM #GPUAdvice #HomeLab #AIHardware #DIYAI #ServerGPU #ThinkStation #UsedTech #AICommunity #OpenSourceAI #SelfHostedAI #TechAdvice #AIWorkstation #MachineLearning #AIResearch #FediverseAI #LinuxAI #AIBuild #DeepLearning #ServerBuild #BudgetAI #AIEdgeComputing #Questions #CommunityQuestions #HomeServer #AILab #LLMLab
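When comparing used server GPUs by VRAM, a quick back-of-the-envelope rule helps: at a given quantization, weight memory is roughly parameters × bits-per-weight ÷ 8, plus runtime overhead. A minimal sketch; the ~20% overhead factor for KV cache and buffers is my own rough assumption, not a measured figure:

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Assumption: weights dominate; add ~20% overhead for KV cache and buffers.

def vram_gib(params_billion: float, bits_per_weight: float,
             overhead: float = 0.2) -> float:
    """Approximate GiB of VRAM needed for model weights plus overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 2**30

# A 70B model at 4-bit quantization needs roughly:
print(round(vram_gib(70, 4), 1))   # 39.1 (GiB)
```

By this estimate, a 70B model at 4-bit wants around 40 GiB, which is why multi-GPU or high-VRAM server cards come up so often in these builds, while a single 24 GiB card comfortably handles ~30B models at 4-bit.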
@system76
I love #LLM, or as they're often called, #AI, especially when used locally. Local models are incredibly effective for enhancing daily tasks like proofreading, checking emails for spelling and grammatical errors, quickly creating image descriptions, transcribing audio to text, or even finding that one quote buried in tons of files that answers a recurring question.
However, if I wanted to be fully transparent to #bigtech, I would use Windows and Android with all the "big brotherly goodness" baked into them. That's why I hope these tools don't connect to third-party servers.
So, my question to you is: are you planning to offer a privacy-oriented, local-first/self-hosted LLM?
I'm not opposed to the general notion of using AI, and if done locally and open-source, I really think it could enhance the desktop experience. Even the terminal could use some AI integration, especially for spell-checking and syntax-checking those convoluted and long commands. I would love a self-hosted integration of some AI features. 🌟💻
#OpenSource #Privacy #AI #LocalModels #SelfHosted #LinuxAI #LocalLLM #LocalAI
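Self-hosted tasks like the proofreading mentioned above are already easy to wire up against a local model server. A minimal sketch using Ollama's HTTP API (`POST /api/generate` on `localhost:11434`); the model name `llama3` and the prompt are placeholders, and nothing leaves the machine:

```python
# Sketch: send a proofreading prompt to a locally hosted model via
# Ollama's HTTP API. Assumes an Ollama server is running on localhost.
import json
import urllib.request

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build the POST request for a local, non-streaming generation call."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_request("Proofread this sentence: Teh quick brown fox.")
print(json.loads(req.data)["model"])   # llama3
# With a server running: urllib.request.urlopen(req) returns the JSON reply.
```

Because the endpoint is plain localhost HTTP, the same call could back a terminal helper that spell-checks a command line before you run it.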