Clanker Adjacent (my blog)

https://lemmy.world/post/44455344


Ola. Elsewhere [https://codeberg.org/BobbyLLM/llama-conductor], I’ve been building a behaviour-shaping harness for local LLMs. In the process, I thought “well, why not share what the voices inside your head are saying”. I hope that’s ok to do.

With that energy in mind, may I present Clanker Adjacent (because apparently I sound like a clanker, thanks lemmy! [https://lemmy.world/post/43503268/22321124]). There’s not much there yet, but what there is may bring a wry smile. If reading long-form stuff floats your boat, take a look. I promise the next post will be “Show me your 80085”. Share it if you like it.

Clanker Adjacent [https://bobbyllm.github.io/llama-conductor/]

PS: Not a drive-by. I lurk here and get the shit kicked out of me over on /c/technology

That looks interesting. Any guides for running this in a Docker Compose stack with Ollama and Open WebUI? I want to experiment on an i5 6th-gen mini PC.

noob here.

Sorry, not really my bailiwick. You’d be best asking the Ollama or OWUI communities about that.

Done

I’ll give you the noob-safe walkthrough, assuming you’re starting from zero:

  • Install Docker Desktop (or Docker Engine + Compose plugin).
  • Clone the repo: git clone https://codeberg.org/BobbyLLM/llama-conductor.git
  • Enter the folder and copy the env template: cp docker.env.example .env (Windows: copy it manually)
  • Start core stack: docker compose up -d
  • If you also want Open WebUI: docker compose --profile webui up -d
  • Included files:

    • docker-compose.yml
    • docker.env.example
    • docker/router_config.docker.yaml

Noob-safe notes for older hardware:

  • Use smaller models first (I’ve given you the exact ones I use).
  • You can point multiple roles to one model initially.
  • Add bigger/specialized models later once stable.

Docs:

  • README has a Docker Compose quickstart
  • FAQ has a Docker + Docker Compose section with command examples