Yay, #Perplexica worked with #Ollama even on very weak test #hardware!
✨ 💫
I used the huihui_ai/fara-abliterated:latest model in Ollama for that. It's still slow AF, but it works!
The proof of concept holds up ✅ I do have stronger hardware; I just wanted to see if it runs at all. That said, there seems to be an Ollama or startup bug inside Perplexica. It takes a little bit of #reconfiguration wiring to get that stuff working.
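For anyone hitting the same wiring issue: in my case it came down to pointing Perplexica's config at the local Ollama API. A rough sketch only; the exact section and key names depend on your Perplexica version, so compare against the sample.config.toml that ships with it:

```toml
# Hypothetical sketch of Perplexica's config.toml for a local Ollama.
# Section/key names vary between Perplexica versions; check the bundled
# sample.config.toml for your release before copying this.
[MODELS.OLLAMA]
API_URL = "http://localhost:11434"  # default Ollama endpoint
```

Then make sure the model is actually present (`ollama pull huihui_ai/fara-abliterated:latest`) and restart Perplexica so it picks up the config.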
(1/2)