Just uploaded an experimental patch for the llama.cpp WebUI.
I needed more control over the model's reasoning, so I added a toggle in the WebUI to manage it. You can disable it entirely or set it to different levels (Low, Medium, High).
It's still very early/experimental, but I'm liking the results so far. #llamacpp
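For a rough idea of what the toggle could translate to under the hood, here is a minimal sketch of mapping the UI levels onto an OpenAI-style chat request to llama.cpp's `/v1/chat/completions` endpoint. The `reasoning_effort` field name and the "off" handling are assumptions for illustration, not the patch's actual wire format.

```python
# Sketch: map a hypothetical WebUI reasoning toggle onto a chat payload.
# "reasoning_effort" and the "off" handling are illustrative assumptions.

def build_chat_payload(prompt: str, reasoning: str = "medium") -> dict:
    """Build an OpenAI-style chat payload.

    reasoning: one of "off", "low", "medium", "high", mirroring the
    hypothetical WebUI toggle levels.
    """
    payload = {
        "messages": [{"role": "user", "content": prompt}],
    }
    if reasoning == "off":
        # Hypothetical: signal that reasoning should be disabled entirely.
        payload["reasoning_effort"] = None
    elif reasoning in ("low", "medium", "high"):
        payload["reasoning_effort"] = reasoning
    else:
        raise ValueError(f"unknown reasoning level: {reasoning}")
    return payload

# Example: request high reasoning effort.
print(build_chat_payload("Why is the sky blue?", reasoning="high"))
```

The nice part of keeping this mapping in one place is that the WebUI only has to track a single enum, and everything else falls out of the request builder.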






