I made a Mac productivity app!

✨ SUBSTAGE ✨

…puts a command bar underneath your Finder windows and lets you use natural language to convert media, manage files, perform calculations and more!

🌐 👉 https://substage.app

#Swift #SwiftUI

Although Substage translates natural language into Terminal commands, I’m hoping it finds a broader audience beyond developers. My number one use case is converting media — quickly resizing images, re-encoding videos, and more.
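To make that concrete, here's a hypothetical sketch of the kind of translation involved. The prompts and generated commands below are illustrative only, not taken from the app; `sips` and `ffmpeg` are the usual macOS tools such commands might invoke.

```shell
# Illustrative only: a toy mapping from a natural-language prompt to a
# plausible shell command (Substage uses an LLM, not a lookup table).
translate() {
  case "$1" in
    "resize these images to 800px wide")
      echo 'sips --resampleWidth 800 *.jpg' ;;
    "convert this video to H.265")
      echo 'ffmpeg -i input.mov -c:v libx265 output.mp4' ;;
    "how many words in this doc?")
      echo 'wc -w draft.txt' ;;
  esac
}

translate "resize these images to 800px wide"
```

The point is just the shape of the interaction: a plain-English request in, a single reviewable command out.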

Would love to hear what you think! 🚀

In case you’re wondering, my iOS time planning app Hour by Hour (selkie.design/hour-by-hour) is also still in development!

You’re welcome to join the TestFlight if you want to give Hour by Hour a try: https://testflight.apple.com/join/zcDWv7WN


As always, quality is guaranteed by this very good girl, who claps her flippers and hoots with joy whenever I put something new out.

(She really needs a name, any suggestions? 🦭)

Ooh thanks so much for the boost @marcoarment, I’m a huge fan of @atpfm! ❤️
@joethephish The seal in Puffin Rock is Silky, which kinda works with Selkie? (Yes, I have a 3-year-old, which is why I know this haha)
@axxl oh I kinda love that, that’s cute
@joethephish This looks awesome! I instantly joined the TestFlight 👏
@joethephish This is brilliant! Where’s the best place for feedback, if you’re wanting any?
@JoeR Oh thanks!! Feedback definitely appreciated 🙏 [email protected]
@joethephish @marcoarment This is great.
Can you please add some accessibility labels for some of the icons, though?
Also, it would be great if, when pressing the hotkey, VoiceOver focus moved to the prompt text field.
@johann interesting! apologies for the shoddy accessibility support, I’ll take a look. Good catch on the prompt field not getting focus too!
@joethephish this is actually a useful way to integrate AI into macOS! Unfortunately that means you’ll be Sherlocked at WWDC 2026

@nickoneill indeed, this is one of the reasons I’m prioritising this over my iOS app!

But maybe… maybe Apple will wait until 2036 to Sherlock me 😜

@joethephish Started testing this last night – love the idea! Wish that I could provide my own API keys as well as save "presets" for frequent commands.

@viticci ❤️ thanks Federico!!

Literally when listening to recent episodes of Connected and AppStories I was thinking you would probably want that... and so it's err... no coincidence that it's what I'm working on right now 🙂

(I was going to message you when I was done but I guess you found me first! 😅)

@viticci Also 100% agreed on the presets ideas - it's on my list! I have two ideas in mind:

1) Auto-completion based on recent history
2) Probably what you're suggesting: caching the AI's response so that commands can be executed again immediately on different files

@joethephish @viticci Hello! I've been checking this out today too. One thought on presets would be defining certain phrases to translate into certain parameters. For example, defining “Resize image for website”, using “for website” to mean a proportionally resized image that is 1400 pixels wide.

@johnvoorhees @viticci awesome, really happy to hear you’ve been trying it too John!

And yeah! I was thinking of allowing you to define a personal “system prompt” with this type of stuff, or possibly even per-folder/project settings.

@joethephish @viticci Excellent! I need to spend more time with Substage, but I like what I've seen so far a lot.
@johnvoorhees @viticci thank you! ☺️ Just let me know if you run into any issues or have any more ideas, it’s still early days and I’m open to feedback, especially from you lot!
@joethephish And yes, both of these sound great too!
@joethephish Ha! Love it. This is great, and down the road I guess it'd be nice to set rules mapping files to models, like running a specific model based on file size, filename, file type, etc.
@viticci ooh interesting, nice idea, hadn’t thought about that one. Love it
@joethephish very cool! I think I would use this. It seems like it takes up a lot of space, though. Possible to reduce the size of the top and bottom margins?
@joethephish Hi! The app looks sleek and useful! But can you share where this wallpaper/background is from? 😍
@regularju it’s just a standard macOS Sequoia wallpaper! https://basicappleguy.com/haberdashery/sequoiawallpaper
@joethephish Oh, ok! I did not even check the new wallpapers when I upgraded. Thanks a lot!
@regularju Yeah I've done that in the past too! This one is definitely a gorgeous one.
@joethephish I was literally just musing to myself this morning that this is what Apple’s AI should be! Natural language file conversion and cleaning! Way to go.
@belovedmelody thanks! Let’s see how long till I get Sherlocked 😀
@joethephish Where do I send bug reports? Two issues:
- Does not open automatically; I need to “hide/show” it each time.
- Shows the user folder by default; does not show on any other folders.
Lastly, please add an option for running Ollama locally.

@florian hi! Sorry to hear you’re having issues. Right now the best place is here: https://substage.featurebase.app

What version of macOS are you running? I’m not sure why you’re having those 2 issues I’m afraid!

On the Ollama support you can upvote the existing request! (It’s coming anyway though)

@florian Ollama support is out now! I’ve also fixed some issues with visibility toggling not quite working right, maybe it’ll fix your case. Here’s a post about the update: https://selkie.design/blog/bring-your-own-ai/

@joethephish This looks interesting except for the privacy issues. You say that you don’t send the contents of files, but one of your examples at the top is “word count”. Wouldn’t that inherently send the contents?

@paulehoffman Nope, word count runs “wc”, which executes locally and just outputs a number.

Don’t get me wrong, though: I’m sure that if you use Substage enough times for various things, some commands will eventually output sensitive content that gets sent for summarisation.

I’m working on local LLM support as an option, but the latency isn’t great (on my M1)
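As a quick illustration of the word-count point: `wc` reads the file entirely on-machine and emits only a number, so only that number would ever leave the tool. (This sketch just demonstrates the underlying utility, not Substage itself; the file path is made up.)

```shell
# `wc -w` counts words locally; the file contents never leave the machine.
printf 'the quick brown fox\n' > /tmp/wc_demo.txt

# Reading via stdin redirection prints only the count, no filename.
# `tr -d ' '` strips the leading padding some platforms add.
wc -w < /tmp/wc_demo.txt | tr -d ' '   # prints: 4
```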

@joethephish Very cool. Beautiful website and onboarding as well. I made myself a similar thing for the command line, but selecting the files in a GUI plus better feedback and risk assessment is a lot nicer, well done.

Quite different from text-to-command-line, but something like this would be neat for the Settings app, to at least find stuff with natural language, since it's a mess and the search doesn't work. Maybe one could get to the contents with the Accessibility API.

@combatwombat thank you, glad you like it! 😀

And yeah, something like this for Settings would be great! I suspect you wouldn’t even need the Accessibility API, just a very richly described database that maps to deep-link URLs that could open the Settings app in the right place?
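A minimal sketch of that database idea, assuming the `x-apple.systempreferences:` URL scheme: the lookup function is made up, and the pane identifier shown is the long-standing Privacy &amp; Security one; a real index would need an entry per pane, richly described for matching.

```shell
# Hypothetical lookup: natural-language query -> System Settings deep link.
# Only one illustrative pane identifier is included here.
settings_link() {
  case "$1" in
    *privacy*|*security*)
      echo 'x-apple.systempreferences:com.apple.preference.security' ;;
    *)
      echo '' ;;
  esac
}

settings_link "privacy settings"
# On a Mac you could then run: open "$(settings_link "privacy settings")"
```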

@joethephish nice idea and a well-executed app. If you offer a one-time purchase option, I will buy it. A subscription model for this kind of app doesn’t make sense to me.

@joethephish great! Just bought it 😍. Thanks!
@joethephish it seems cool, any plans to port it to Windows / Linux / the web?
@WeAreMuesli thank you! Nope no plans I’m afraid.
@joethephish This is an awesome app; thank you very much! Do you have a recommendation for a local model (for LM Studio)?
@esamecar Glad you like it! It kinda depends how beefy your Mac is, and I don't use local models myself so I'm not really an expert, but I have found that Qwen 2.5 Coder works pretty well! I wouldn't attempt to use anything with fewer than 7-8B parameters; the results get pretty hit-and-miss.
@joethephish Thank you very much again! Last question: if I use an online model, does it upload file names or content, or does it only use that model to generate CLI instructions?
@esamecar @joethephish check out my blog post for some tips on this, happy to hear feedback or ideas of topics to cover in the future: https://log.alets.ch/119/
@loleg Thanks, interesting read. Unfortunately, the Apertus model returned only non-working terminal commands in Substage for me, but I'll try it in other contexts for sure.
@esamecar thanks for testing - agentic support in the next release should fix this, and I can keep you posted. What about SERA-14B or Devstral 2?