| Website | https://lstaples.sdf.org |
| GitHub | https://github.com/lstaples3 |
Last month, an article comparing the Apple M4 and M5 processors referenced an app called ‘Locally AI’. It lets users “Run Llama, Gemma, Qwen, DeepSeek, and more locally on your iPhone, iPad, and Mac. Offline. Private. No login.”
It’s free, has a small footprint, and collects no user information. In use, it works much like other AI chat apps. (1/3)