@ecurtin very nice! I'm going to spend some time trying to hook it up to Open WebUI to serve my local models instead of Ollama. I've got a bit of inertia with Open WebUI - it's already hooked up to various other APIs… I see there's a bit of friction currently, though 😜
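
In case it helps anyone trying the same thing, here's roughly my plan as a sketch - assuming the server exposes an OpenAI-compatible endpoint (the host/port and model name below are placeholders, not from this thread): smoke-test the endpoint directly first, then point Open WebUI at the same base URL.

```python
# Quick smoke test before wiring the server into Open WebUI:
# talk to its OpenAI-compatible endpoint directly.
# Assumptions (adjust for your setup): the server listens on
# localhost:8080 and serves a model named "my-local-model".
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # local server, not api.openai.com
    api_key="none",  # local servers usually ignore the key, but the client requires one
)

resp = client.chat.completions.create(
    model="my-local-model",  # hypothetical name; use whatever /v1/models reports
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```

If that works, adding the same base URL as an OpenAI-style API connection in Open WebUI's settings should (fingers crossed) be all that's left.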