Increasingly, my number one feature hope for OS 27 is a larger context window for Apple's on-device model. There are so many features I am trying to build that I would love to run 100% locally, but I can't.

As a basic example, you literally can't use it as a meeting note generator, because any meeting longer than 5-10 minutes is going to exceed the window.

@matt_birchler https://apps.apple.com/us/app/locally-ai-local-ai-chat/id6741426692 will give you a massive window for the on-device model. Unfortunately, it will inevitably reveal how inept the model is too. 😅
@macmanx Yeah, it does that through third-party models that require a few extra GB to download and use. It could work, but man, Apple's model is already there and is fine; it should work for more use cases 😭
@matt_birchler It supports Apple’s foundational model too, no extra download required, only if you want more.
@macmanx For sure, I guess for my use cases (podcast and meeting recordings) you literally always need more 😅
@macmanx I take it back… I installed the app and it fails at everything I'm trying to do as well, even when I pick the best models it offers. Bummer for sure.