| Time Chunks | https://www.timechunks.app |
| Astrophotography | https://www.astrobin.com/users/nickkohrn/ |
| Pronouns | he/him/his |
Liquid Glass: How to double the taps necessary for any task.
Addendum: there’s bad SwiftUI code written in ignorance of modern APIs, bashing HStacks and GeometryReaders together until something works. That takes training to overcome.
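A sketch of what that looks like, assuming a simple "size a view to half its container" task. The commented-out version is the old GeometryReader habit; the modern equivalent uses `containerRelativeFrame`, which requires iOS 17 / macOS 14:

```swift
import SwiftUI

struct HalfWidthView: View {
    var body: some View {
        // Old habit: wrap the hierarchy in a GeometryReader just to
        // read the container's width, distorting layout in the process.
        //
        // GeometryReader { proxy in
        //     Rectangle().frame(width: proxy.size.width / 2)
        // }

        // Modern API: ask the container for its length along an axis
        // and return the size you want. No wrapper view required.
        Rectangle()
            .containerRelativeFrame(.horizontal) { length, _ in
                length / 2
            }
    }
}
```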
Then there are failures of the framework. Like how you can attach an alert to a button, but not if that button is inside a menu. Or how you can add toolbar items to a bare window on macOS but not on iOS.
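The alert-in-a-menu gotcha looks like this in practice (the API names are real SwiftUI; the failure mode is the claim above, and the fix is to hoist the `.alert` modifier outside the menu content):

```swift
import SwiftUI

struct AlertGotchaView: View {
    @State private var showingAlert = false

    var body: some View {
        Menu("Actions") {
            // Attaching .alert to this Button does nothing useful:
            // the menu dismisses before the alert can present.
            Button("Delete", role: .destructive) {
                showingAlert = true
            }
        }
        // Workaround: attach the alert to a view *outside* the
        // menu's content, driven by the same state.
        .alert("Delete item?", isPresented: $showingAlert) {
            Button("Delete", role: .destructive) { }
            Button("Cancel", role: .cancel) { }
        }
    }
}
```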
Those are failures of “learn once, write everywhere” and Apple needs to be better about them. Too many caveats, too many gotchas.
I’ve complained LLMs are bad at SwiftUI, and that’s not entirely accurate.
They have bad default approaches, because they’re trained on stale StackOverflow answers. They can be steered in the right direction if you know what you’re doing, but that’s kind of the problem.
I’ve said it before and I’ll say it again: Apple needs to massively improve its documentation and sample code to overcome this. Thousands of sample projects, actively maintained.
(Hi WWDR! Hire me to do this!)
Apple development in 2026 is all about the "Liquid Glass" redesign and the democratization of on-device AI. While we wait for Siri to reach its full potential via Gemini integration, agentic LLMs are already changing how we write and refactor SwiftUI. The future of the Apple ecosystem is here. 🚀
https://martiancraft.com/blog/2026/02/what-excites-us-about-2026/