I finally found the right hardware and excuse to set up openclaw at home, and it’s indeed really fun and useful. But I would never under any circumstance run it with a cloud LLM provider. Say hi to r2claw! 🦀
@pancake what’s your setup? Sounds like the right approach anyway!
@Cali it’s the Jetson Thor with 128GB of VRAM, running ollama (openclaw can be installed from the ollama CLI). I just created a separate user for it and it runs GLM 4.7 Flash. The model takes about 60GB of VRAM, so it can run sub-agents nicely.
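For anyone wanting to replicate this, here’s a rough sketch of the isolated-user setup described above. The user name and the exact ollama model tag for “glm 4.7 flash” are assumptions, not verified identifiers; check `ollama` for the actual tag.

```shell
#!/bin/sh
# Hedged sketch of the setup above. "openclaw" as user name and
# "glm-4.7-flash" as a model tag are guesses, not confirmed identifiers.

# 1. Create a dedicated user so the agent runs isolated from your main account
sudo useradd -m -s /bin/bash openclaw

# 2. Pull the local model with ollama (replace the tag with the real one)
ollama pull glm-4.7-flash

# 3. Run the agent's model as the dedicated user; nothing leaves the machine
sudo -u openclaw ollama run glm-4.7-flash
```

The separate-user trick is a cheap sandbox: the agent can only touch that user’s home directory, which matters when it spawns sub-agents that execute commands.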