I was sent this by @seth and thought it was such a great tool it was worth sharing.

Want to know which LLMs you can run on your hardware? Use this as a quick reference to find out. It's awesome.

#AI #Ollama #SelfHosted #HomeLab

https://www.canirun.ai/

CanIRun.ai — Can your machine run AI models?

Detect your hardware and find out which AI models you can run locally. GPU, CPU, and RAM analysis in your browser.

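For the curious, the core check a tool like this performs boils down to simple memory math: weights take roughly parameter count times bytes per weight, plus runtime overhead. A minimal sketch (my own assumption of the approach, not CanIRun.ai's actual code — the function names and the ~20% overhead figure are illustrative):

```python
def est_memory_gb(params_billion: float, bits_per_weight: int,
                  overhead: float = 1.2) -> float:
    """Approximate RAM/VRAM (GB) to load the weights, with ~20% extra
    for KV cache and runtime buffers (a common rule of thumb)."""
    bytes_needed = params_billion * 1e9 * (bits_per_weight / 8) * overhead
    return bytes_needed / 1e9

def can_run(params_billion: float, bits_per_weight: int,
            available_gb: float) -> bool:
    """True if the estimated footprint fits in available memory."""
    return est_memory_gb(params_billion, bits_per_weight) <= available_gb

# A 7B model at 4-bit quantization needs about 4.2 GB,
# so it fits comfortably on a 16 GB machine:
print(can_run(7, 4, 16))  # True
```

So a 7B model quantized to 4 bits runs fine on most laptops, while a 70B model at 16-bit precision needs well over 100 GB, which is the kind of answer the site gives you at a glance.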