The free AI they give you is CPU-heavy and won't use your GPU out of the box. It's passable, but I've seen the light and I'm building something better: a self-hosted AI that tunes itself to your hardware and runs at lightning speed. You keep everything. My madness knows no limits.

They built AI so you'd need them, because they fear the democratization of AI. It never gives you the full, consistent answer, because incomplete answers force more token usage, and token usage drives profits.

#AI #CPUHeavy #GPU #SelfHostedAI #LightningSpeeds #DemocracyOfAI #TokenUsage #Profits

Docker Model Runner Explained - Run AI Models Anywhere

https://tube.blueben.net/w/bxSfUZGYTHKcUTw9q298i2


Hey everyone 👋

I’m diving deeper into running AI models locally—because, let’s be real, the cloud is just someone else’s computer, and I’d rather have full control over my setup. Renting server space is cheap and easy, but it doesn’t give me the hands-on freedom I’m craving.

So, I’m thinking about building my own AI server/workstation! I’ve been eyeing some used ThinkStations (like the P620) or even a server rack, depending on cost and value. But I’d love your advice!

My Goal:
Run larger LLMs locally on a budget-friendly but powerful setup. Since I don’t need gaming features (ray tracing, DLSS, etc.), I’m leaning toward used server GPUs that offer great performance for AI workloads.

Questions for the Community:
1. Does anyone have experience with used server GPUs for this kind of build? Which would you recommend for running larger LLMs locally?
2. Are there other budget-friendly server GPUs I might have missed that are great for AI workloads?
3. Any tips for building a cost-effective AI workstation? (Cooling, power supply, compatibility, etc.)
4. What’s your go-to setup for local AI inference? I’d love to hear about your experiences!

I’m all about balancing cost and performance, so any insights or recommendations are hugely appreciated.

Thanks in advance! 🙌

@selfhosted@a.gup.pe #AIServer #LocalAI #BudgetBuild #LLM #GPUAdvice #Homelab #AIHardware #DIYAI #ServerGPU #ThinkStation #UsedTech #AICommunity #OpenSourceAI #SelfHostedAI #TechAdvice #AIWorkstation #MachineLearning #AIResearch #FediverseAI #LinuxAI #AIBuild #DeepLearning #ServerBuild #BudgetAI #AIEdgeComputing #Questions #CommunityQuestions #HomeServer #AILab #LLMLab

🚀 NEW on We ❤️ Open Source 🚀

Want to run a GPT model offline on your own machine? Meet Jan—a fully open source ChatGPT alternative that respects your privacy.

Don Watkins (@linuxnerd) shows how easy it is to get started, install models, and build apps with Jan + Llama.cpp.

https://allthingsopen.org/articles/getting-started-with-jan-open-source-chatgpt

#WeLoveOpenSource #FOSS #SelfHostedAI #LLM #PrivacyMatters #OpenSourceAI

Get hands-on with self-hosted LLMs and learn how to build powerful LLM agents tailored to your needs — from setup to real-world deployment with Michael Christen at #FOSSASIASummit2025

🔗 Click here https://youtu.be/-9ubB1jzBYE?si=V7aqeOnReS0uapHC to watch on the FOSSASIA YouTube channel

#LLM #AIagents #SelfHostedAI #OpenSource #FOSSASIA


I ditched ChatGPT Plus and built my own AI assistant on a forgotten Windows server.
100% offline. No API. No rate limits.
Flask + React + Ollama + Mistral.
It powers our IT helpdesk, learns from real tickets, and costs me nothing.
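
For flavor, here's a minimal stdlib-only sketch of the Ollama side of a stack like this (the default Ollama port and the "mistral" model tag are assumptions; the Flask/React helpdesk wiring around it isn't shown):

```python
import json
import urllib.request

# Default address of a locally running Ollama instance.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "mistral") -> bytes:
    """Assemble the JSON body that Ollama's /api/generate endpoint expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(prompt: str) -> str:
    """Send a prompt to the local model and return its reply. No cloud, no API key."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]
```

A web frontend (Flask or anything else) would just call `ask()` per ticket.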

Full write-up drops Tuesday.
Want the stack early? Hit me up.

#SelfHostedAI #OpenSource #LocalLLM #TechForHumans #Flask #GPT

:arch: XeroLinux :kdelight: (@XeroLinux@fosstodon.org)

The community has spoken 🚀 The majority has decided against adding #Ollama to the #XeroLinux toolkit. Instead, I've written a guide on how to use it for those interested, so it's not a complete loss. #FOSS #Linux #OpenSource #OpenAI #AI https://xerolinux.xyz/posts/ollama-ai/


Alt text for images: some people write it, some don't. But why put the burden on the poster at all?

AI can already generate perfectly serviceable alt text from images (even self-hosted AIs can).

So why not turn the thing around? Why not generate alt text for every image on the reader's side, using (self-hosted) AI?

This sounds like something no human should have to do by hand.
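
As a sketch of the reader-side idea, assuming a local Ollama instance serving a vision model (the "llava" tag and the captioning prompt are my assumptions, not anything a client currently does):

```python
import base64
import json
import urllib.request

# Default address of a locally running Ollama instance.
OLLAMA_URL = "http://localhost:11434/api/generate"

def alt_text_payload(image_bytes: bytes, model: str = "llava") -> bytes:
    """Build the /api/generate body: a captioning prompt plus the image as base64."""
    return json.dumps({
        "model": model,
        "prompt": "Describe this image in one sentence, for use as alt text.",
        "images": [base64.b64encode(image_bytes).decode()],
        "stream": False,
    }).encode()

def caption(image_bytes: bytes) -> str:
    """Ask the local vision model to produce alt text for one image."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=alt_text_payload(image_bytes),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]
```

A reader app could call `caption()` for any image that arrives without alt text and fill it in locally.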

#AltText #Accessibility #AI #AIAssistance #SelfHostedAI #Inclusion #ImageDescriptions #AIForAccessibility #TechSolutions