Mobile Developer formerly at Automattic, on the Day One app for iOS and Mac. Likes to dabble in game dev. Amateur actor.
🏳️‍🌈 He / him.
Blog: https://frosty.blog
GitHub: https://github.com/frosty
I installed the iOS 26 public beta on my main phone a couple of weeks ago. I took such an instant dislike to Liquid Glass that I'm writing this on a Samsung Galaxy Z Flip 7, and I also have a Pixel 9a on the way to try out.
Is this thing still on? I took a huge break from Mastodon, and I don't know if more people hang out here or on Bluesky these days. I have literally 1 follower over there though, so I guess I'll carry on posting here for now?
Finally installed iOS 26 on my primary device. I normally don’t have a problem with UI updates, but I instantly dislike Liquid Glass so much that I’m honestly about to order an Android phone to try out.

I can't believe I just now learned this in Xcode

You can select some text, cut it with ⌘X, select a folder in the navigator, and paste with ⌘V, and Xcode creates a new file containing that text

And all this time I've kept all my code in one giant file

don't make us into cartoons! please!

I was grossed out when Apple got to image generation in their WWDC keynote, and I was grossed out when I read that Applebot scraped "the open web" to train their AI model, with publishers only being able to opt out after the fact. Disappointing to say the least.

https://www.macstories.net/linked/apple-details-its-ai-foundation-models-and-applebot-web-scraping/

Apple Details Its AI Foundation Models and Applebot Web Scraping

From Apple’s Machine Learning Research blog: Our foundation models are trained on Apple’s AXLearn framework, an open-source project we released in 2023. It builds on top of JAX and XLA, and allows us to train the models with high efficiency and scalability on various training hardware and cloud platforms, including TPUs and both cloud and…

Stolen from threads:

I want AI to help me figure things out. Not make things up for me.

Apple is clearly looking at the figuring out part - a lot of useful features were presented (albeit with a lot of future tense in the statements). A big plus that it's being done in a private context.

But it's the made up art where everyone (rightly) focuses their attention. Given the huge number of creative people using Apple products, taking away our collective imagination feels like a huge misstep.

First three top-of-mind thoughts about Apple Intelligence:

• I really wish they had addressed the environmental story. That would help me personally accept these features more. But I know they can’t… because the story isn’t good.

• I actually think a lot of it looks quite nice, feels Apple-esque, and the UI they’re doing is really cool.

• I remain viscerally repulsed by any AI-generated art lol (In their Notes demo the sketch looked better!)

True absolute star of the keynote: Math Notes