The more I play with #ollama and #openwebui the more I think: This is what the future of AI should be.

A local personal assistant that just works out of the box, and that you can customize in a trivial way.

No sordid manipulations. No billionaire oligarchs. Just a convenient tool you can use.

Hey @zeioth, you should try using #openclaw with #ollama. Not for the agent stuff, but just as a swap-in for #openwebui. I am finding the memory component of #openclaw to be really perspective-changing and totally in line with your thoughts about the future. I don't need something to respond to my email, but I think I do need a more private, more local #llm that remembers and iterates on work across sessions and days.

@hnnn I believe you, and thank you for telling me. It's true openwebui doesn't have a very good community memory management function at the moment. I use Adaptative Memory v4. It rarely adds anything to memory, but when it does add something, it's high quality.

Anyway, for optimal results I think it's best to manually review the memories and make sure they're correct, because they become extra context in all your queries. So less is more.