I've been working more on my local homelab LLM project. It runs gpt-oss:20b as its base model in conjunction with a RAG system (BM25+embeddings) accessed through a custom web UI. Nothing groundbreaking there. What's exciting for the geek in me is how the system uses what I call policy-driven RAG. Spoiler: no LLM is truly rule-based. What my system does is use a file called other_topics.txt to determine whether model output comes solely from RAG data (the default), from general gpt-oss:20b model data (aka other topics), or from a hybrid of the two. Does it scale to larger deployments? I have no idea. But it's very homelab geek cool. #opensource #localllm #pangeahillsai #openai #gpt-oss #homelab
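
For the fellow geeks who want to see the idea rather than the code: here's a minimal sketch of how an other_topics.txt policy check could drive the three modes. The file format, function names, and keyword matching below are illustrative assumptions, not the actual PangeaHills implementation.

```python
# Minimal sketch of the other_topics.txt policy check. The file format,
# names, and matching rules are assumptions for illustration only.
from enum import Enum
from pathlib import Path


class AnswerMode(Enum):
    RAG_ONLY = "rag"      # default: answer only from the doc corpus
    MODEL_ONLY = "model"  # "other topics": fall back to gpt-oss:20b's own knowledge
    HYBRID = "hybrid"     # blend retrieved passages with general model knowledge


def load_other_topics(path: str = "other_topics.txt") -> set[str]:
    """One topic keyword per line; blank lines and # comments are ignored."""
    topics = set()
    for line in Path(path).read_text().splitlines():
        line = line.strip().lower()
        if line and not line.startswith("#"):
            topics.add(line)
    return topics


def pick_mode(question: str, other_topics: set[str]) -> AnswerMode:
    """Rule-based routing: RAG is the default unless the policy file says otherwise."""
    q = question.lower()
    if not any(topic in q for topic in other_topics):
        return AnswerMode.RAG_ONLY
    # A listed topic mixed with homelab terms gets the hybrid treatment.
    if any(word in q for word in ("homelab", "bitfrost", "pangeahills")):
        return AnswerMode.HYBRID
    return AnswerMode.MODEL_ONLY
```

The important bit: the default is RAG-only, and the model only gets to roam when the policy file says a topic is fair game.
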
I call my homelab bitfrost.lan. It's a play on Bifrost, the rainbow bridge in Norse mythology. So when I started my local LLM project, I naturally called it bitfrost.AI. Unfortunately, that is a real site, not related to me at all. I'm now calling my project PangeaHills. I'm retiring soon to the North Georgia foothills. The Appalachian Mountains are the ancient remnants of ranges created when Pangea was formed. Thus, PangeaHills. Happy Geeking. #homelab #geek #pangeahillsAI

Another RAG-to-Riches story.
…ok, not really. I'm just an average homelab goblin who accidentally reinvented a thing the real AI folks probably solved 3 years ago. 😂

But I call it policy-driven RAG:
where rules, not model weights, decide when my local LLM answers from the doc corpus and when it's allowed to use general knowledge.

Basically:
"Stay in your lane unless I say otherwise."
RAG for homelab stuff
LLM for the wider world
Never phones home
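
In practice that boils down to building a different prompt for each lane and only ever talking to a model on localhost. Rough sketch below, assuming the model is served locally by something like Ollama; the endpoint, model tag, and prompt wording are illustrative, not the exact stack.

```python
# Sketch of the "stay in your lane" prompt assembly. Assumes a local Ollama
# instance serving gpt-oss:20b on localhost; nothing ever leaves the LAN.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # assumption: local Ollama endpoint

RAG_SYSTEM = (
    "Answer using ONLY the provided context from the homelab document corpus. "
    "If the answer is not in the context, say you don't know."
)
OPEN_SYSTEM = "Answer from your general knowledge."


def ask(question: str, context: str | None, rag_only: bool) -> str:
    """RAG prompt for homelab stuff, plain prompt for the wider world."""
    system = RAG_SYSTEM if rag_only else OPEN_SYSTEM
    user = f"Context:\n{context}\n\nQuestion: {question}" if context else question
    payload = {
        "model": "gpt-oss:20b",
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
        "stream": False,
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # localhost only; never phones home
        return json.loads(resp.read())["message"]["content"]
```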

Might be old hat to the pros, but it absolutely blew my geek mind that it actually works.

Just a homelab nerd living his best offline-AI life.

#homelab #LocalAI #RAG #SelfHosted #PangeaHillsAI #LLM #NerdLife #ServerGoblin #ComputersAreFun

Built something fun in the lab: PangeaHills.ai, my own locally-hosted, policy-driven RAG + LLM stack.
Completely offline, totally self-contained, powered by a bunch of noisy equipment pretending to be a cloud. 😄

The neat part? It's rule-driven, not weight-driven:
• Homelab questions must stay inside the RAG universe.
• General topics only switch to model knowledge when my routing rules explicitly allow it.
• The LLM never "guesses" when to leave RAG; it follows policies, not vibes.
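
And the "RAG universe" in that first bullet is just the BM25 + embeddings retrieval mentioned earlier. A toy sketch of the hybrid scoring, using assumed libraries (rank_bm25, sentence-transformers) and made-up sample docs rather than the real index:

```python
# Toy sketch of BM25 + embedding hybrid retrieval. Library choices, the
# 50/50 blend, and the sample docs are illustrative assumptions.
import numpy as np
from rank_bm25 import BM25Okapi
from sentence_transformers import SentenceTransformer

docs = [
    "bitfrost.lan notes: the reverse proxy terminates TLS for the internal web UI.",
    "The RAG corpus is rebuilt nightly from the homelab wiki export.",
    "Weekly backups replicate to the second storage node.",
]

# Lexical side: classic BM25 over whitespace-tokenized docs.
bm25 = BM25Okapi([d.lower().split() for d in docs])

# Semantic side: sentence embeddings, scored by cosine similarity.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = encoder.encode(docs, normalize_embeddings=True)


def hybrid_search(query: str, k: int = 2, alpha: float = 0.5) -> list[str]:
    """Blend min-max-normalized BM25 with cosine scores; alpha weights the lexical side."""
    lex = np.array(bm25.get_scores(query.lower().split()))
    if lex.max() > lex.min():
        lex = (lex - lex.min()) / (lex.max() - lex.min())
    q_vec = encoder.encode([query], normalize_embeddings=True)[0]
    sem = doc_vecs @ q_vec  # cosine similarity, since vectors are normalized
    blended = alpha * lex + (1 - alpha) * sem
    return [docs[i] for i in np.argsort(blended)[::-1][:k]]


print(hybrid_search("where does the RAG corpus come from?"))
```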

Feels like having an AI that actually stays in its lane because you built the lane lines yourself. 🚧🤖

#homelab #selfhosted #LLM #RAG #PolicyDrivenAI #SelfHostedAI #HomeLabLife #BSD #Linux #PangeaHillsAI #nerdlife