Procedural water to the horizon, sky, island generation
Not me offloading raytracing workloads to my iPad because it's so much faster than my Mac…
I spent untold hours in Bryce3D as a kid making silly scenes just like this. It did this stuff with machines a thousand times less powerful than my Mac mini. Certainly puts things into perspective…
It's real nice having a complex app like this on iPad, that's for sure. This is by far the most complex and impressive thing I've vibecoded with Codex 5.3. I haven't seen or touched a single line of code, I didn't have a detailed plan to work from — this isn't at all like the structured ports I detailed previously
I thought my water should have some translucency and falloff, so I can put things underneath it
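The standard way to get that depth-based translucency is exponential (Beer–Lambert) absorption: light surviving a path through water falls off as e^(-k·depth) per colour channel. A minimal sketch, with made-up absorption coefficients (not the app's actual values):

```python
import math

def water_transmittance(depth, absorb=(0.45, 0.15, 0.05)):
    """Beer-Lambert falloff: fraction of light surviving `depth`
    metres of water, per RGB channel. Red is absorbed fastest,
    blue slowest; the coefficients here are illustrative."""
    return tuple(math.exp(-k * depth) for k in absorb)

def shade_water(bottom_rgb, water_rgb, depth):
    """Blend the submerged surface with the water colour, using
    the per-channel transmittance as the mix factor."""
    t = water_transmittance(depth)
    return tuple(b * ti + w * (1.0 - ti)
                 for b, w, ti in zip(bottom_rgb, water_rgb, t))
```

Shallow water shows the bottom almost unchanged; deep water converges to the water colour, which is exactly the falloff effect described above.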
Procedural clouds seemed like a good idea

Recap: I vibecoded (code unseen, no plan) a 3D editor/renderer that has a scene graph, editing controls, primitives and gizmos, materials, procedural terrain and water, and hardware-accelerated Metal raytracing with soft shadows, clouds and bounce lighting, that runs on Mac and iPad.

Tool: Codex 5.3 Medium
Time: About a day's worth of work has gone into it

I figured I needed some kind of glass shader
The thing about material shaders is I don't quite know the right questions to ask — the unknown unknowns. Am I taking into account the right refraction, internal reflection, attenuated shadows, etc.? Why is this too bright, why is that too dark? You can tell from these examples where things are obviously wrong, and it takes quite a bit of iteration
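One of the knobs that governs "too bright / too dark" in glass is the Fresnel term, which sets how reflectance splits between reflection and refraction as the viewing angle changes. Schlick's approximation is the standard cheap version; a sketch, assuming a typical refractive index of 1.5 for glass:

```python
def schlick_fresnel(cos_theta, n1=1.0, n2=1.5):
    """Schlick's approximation of Fresnel reflectance.
    n1/n2 are the refractive indices on either side of the
    surface (1.0 for air, ~1.5 for glass); cos_theta is the
    cosine of the angle between the ray and the surface normal."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5
```

At normal incidence glass reflects only about 4% of light (most refracts through), while at grazing angles reflectance approaches 1 — mishandle this term and glass reads as uniformly too bright or too dark.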

'Attenuated shadows'

A little better

Maybe I should have bought a faster Mac before trying to write a raytracer…
Trying not to fry my GPU with caustics, but Metal isn't happy
I was sitting through hour-long renders (!) on my iPad yesterday, so I did an optimization pass on the hardware acceleration and it's much, much improved for simpler scenes, even on an M1
While this raytracer may never become a finished app, there are certainly elements from it I intend to yoink for future projects — like the really neat toolbars that go around all the screen edges, they would fit into a complex pro app very nicely
Just casually building and raytracing a scene on an iPad mini 6, nbd
Of course it runs on iPhone, what do you take me for?
Liquid, Glass
So, like, what do I even do with this app?
I made my control groups collapsible, with a priority system. Honestly they're my favorite part of this prototype
The old viewport gizmo was faked in 2D, so I had Codex rewrite it in Metal with a different projection, and now it's much better. I also added exponential decay to the orbit gesture so you can fling the camera around

Ha, cute, you can even fling the raytracer around 🤣

Also I added an expanded progress indicator

Just a normal teapot.

Hadn't tried the visionOS build, but it works too.

With a caveat.

visionOS is far more fragile than the other platforms when it comes to heavy 3D rendering. Saturating the GPU like this slows the compositor to a slideshow, and at one point Metal leaked out of the window into the OS: I was seeing squares of corrupted video memory floating in front of me until I hit the equivalent of a SpringBoard crash.

Functionally, this could be a visionOS app.

Practically, no.

You can see here that the moment I invoke the raytracer, everything goes to shit on visionOS. The userspace went down right at the end of the video, where it cut
What if you could step into your Bryce scenes?

Pretty much everything I've worked on with Codex up to now has been stuff I could have built myself, within my area of expertise (or learnable), it just would have taken weeks or months.

This 3D scene app is something I never would have been able to build myself. I would have needed a team of rendering experts with domain-specific knowledge and human-years of research

I love how visionOS, uniquely, *explodes* when rendering goes wrong.

Hello [MacBook] Neo.

So now that visionOS 26 lets you spawn immersive scenes from UIKit apps, I had Codex implement me an immersive scene using Metal and CompositorServices that mirrors the in-window viewport and lets you live in your scene 😁

It's real frickin cool.

The raytracer might be off limits for visionOS, but there's a lot of interesting stuff to do in other areas

I figured why not use RealityKit for the material previews, so now they are actual spheres.

Miraculously, it all still works: the Metal viewport, the Metal immersive scene, and the RealityKit UI elements. But it's very clear the Vision Pro (M2) doesn't have much headroom to build an actual app around this stuff

The raytracer is, for now, a no-go on visionOS. It's possible I could throttle it and stay within visionOS' systemwide render budget. But it's probably worth improving the RT performance a bunch on its own first before I come back and try it here. I might run out of steam on this prototype before then.

This entire app project is still in my 'Temp' folder, where throwaway projects live 😅

This project, which runs on iPhone, iPad, Mac, and Vision Pro (with Immersive Space), is now 16.5K lines of code
I thought it was finally time to add vertex editing and subdivision. Now it's a 3D modeling tool and not just a raytracer

Some more things to show off here on this iPad mini 6!

• Longpress band gesture
• Multi-select
• Vertex editing
• Subdividing
• My 'generate a Cornell Box' button
• (And the raytracer, of course)

There is a lot of really neat stuff in this app. Still using Codex 5.3 Medium, still haven't touched a line of code myself

All of this still works great on iPhone too
The touch gestures all work on visionOS too, but on all platforms it has keyboard and mouse support for all your precise selection and modifier key needs
Boolean operations seem pretty complex, but I made a start at it
Playing a bit of musical chairs with the floating controls in the toolbars now that I'm starting to run out of space for new UI
Late night modeling on my iPad 🤪
I made sure all my interactions work right with the Logitech Muse (i.e. stylus won't orbit the viewport, will modally lock to highlighted gizmo axes, etc), so now I can do a bit of vertex editing on the Apple Vision Pro, channeling @Dreamwieber

I never really thought about it before, but multitouch is actually legit for 3D modeling tools, maybe even better than a desktop. On a Mac, you need to hold modifier keys (or buy a multi-button mouse) to do everything you want with the viewport, but on touch you can orbit, pan, zoom, and multi-select very easily. If you special-case the stylus too, like I am, it feels very powerful.

Almost all of this extends to spatial computing, though visionOS struggles a bit with two-hand gestures

The raytrace operation will now be dispatched into the background on iOS, allowing for long-running background tasks

iPadOS has never been better for rich, complex, desktop-class apps. Almost all of the old barriers and blockers are gone.

Sadly, Apple waited until most developers had run out of patience with the platform.

If this thing had Xcode, a real Xcode, it would be effectively complete

(Maybe I should vibe code Xcode, next.)
Complex scenes are kinda fun 😎

Even my iPhone 12 Pro Max can raytrace!

Which I guess is not all that surprising, considering the A14 chip is the same generation as the M1

That’s a whole lot of raytracing from a little iPad. Biggest render to date, at 5120x2880 — took about an hour to get to the final pass.

Crashed right at the end, so I didn’t get to take a picture of the final output 🥲

The raytracer still needs a bunch of work, but it’s more and more capable day by day

I was tired of running into raytracer resource limits, so now, after a 4K+-line rewrite, it uses wavefronts and a scheduler. It seems a lot better for complex scenes and lots of glass. All the work spent optimizing the previous renderer's performance ramp and failure recovery pays off now that the new renderer is so much more robust
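For context on why wavefronts help with resource limits: instead of one megakernel tracing each path to completion, a wavefront renderer keeps a queue of live rays and processes it in fixed-size batches, one bounce stage at a time, so each dispatch touches a bounded amount of memory. A toy sketch of just the control flow, with `intersect` and `shade` standing in for the real GPU stages:

```python
def wavefront_trace(camera_rays, intersect, shade,
                    max_bounces=8, wave_size=1 << 16):
    """Toy wavefront scheduler. `shade` accumulates radiance for
    a hit and returns a continuation ray for the next bounce, or
    None when the path terminates. Returns how many ray segments
    were traced in total."""
    queue = list(camera_rays)
    traced = 0
    for _ in range(max_bounces):
        next_queue = []
        # Process the queue in bounded chunks ("waves") so each
        # dispatch has a predictable resource footprint.
        for i in range(0, len(queue), wave_size):
            wave = queue[i:i + wave_size]
            traced += len(wave)
            for hit in (intersect(r) for r in wave):
                cont = shade(hit)
                if cont is not None:
                    next_queue.append(cont)
        if not next_queue:
            break
        queue = next_queue
    return traced
```

Glass is exactly the case that benefits: refractive paths spawn long bounce chains, and the queue naturally compacts away terminated paths between stages instead of holding every path's worst-case state at once.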
Also, I've switched my Codex model to GPT 5.4. OpenAI says it outperforms gpt-5.3-codex, and it has over double the context window, so I figured a renderer rewrite was the right time to step up a level
I feel like I need a bunch of Apple's glass spheres on my desk to try to fine-tune my raytracer 😂
Working on light and glass
The raytracer rewrite bears fruit — it can now make it through a long-running raytrace on visionOS without bringing the userspace down. It still chokes on more-complex shaders like glass, but it's a big improvement

Retro.

@alexr

Slowly improving my glass

This 3D modeling app and the switch to the more-expensive GPT 5.4 model might finally be the thing that burns through my weekly usage budget. It's now 25K lines of code; the more-complex renderer ballooned things a bit, but it's worth it.

I finally moved it out of my temp folder and into my projects folder, and handed it off to a fresh GPT 5.4 session with a bunch of documented context about the intricacies of the project. Hopefully that's enough to jump-start it

I gave it an easy task to start with: a 'Frame Selection'/'center camera' button. Select an object or a group of objects and reposition the camera to fit
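'Frame Selection' reduces to a small bit of geometry: wrap the selection in a bounding sphere, then back the camera off far enough that the sphere fits the field of view. A sketch of the distance calculation, with an illustrative margin factor:

```python
import math

def frame_selection_distance(radius, fov_deg=60.0, margin=1.1):
    """Distance from the bounding sphere's centre at which the
    camera frames a sphere of the given radius, for a vertical
    field of view in degrees. `margin` adds breathing room so
    the selection doesn't touch the viewport edges."""
    return margin * radius / math.tan(math.radians(fov_deg) / 2.0)
```

The camera then moves to `center - forward * distance` along its current view direction, so framing doesn't disturb the orbit orientation the user has set up.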

I know how to use Photoshop, and I wish 3D tools were as easy to use as a Photoshop. So I think that's the kind of app I want to create*. Not knowing what I don't know, I'm sure to stumble across features that no self-respecting modeling tool would add, but are great for people like me. That's the beauty of cross-domain knowledge, I guess.

Anyway here's a pen tool.

(*you know, if this ever actually becomes something I would want to ship and put my name to)

Hello
Keeping all three platforms in sync (Mac, iOS, and visionOS) is important to me. If I ever were to ship something like this, it would be cross-platform from day one
I was expecting a couple-day forced vacation, but it seems like OpenAI reset everybody's usage limits to apologize for service issues yesterday so I'm back to full 😂
Little bit of Apple Pencil shape-drawing, and support for cutouts

The Logitech Muse hardware does feel like a piece of junk. Connecting it to the headset is really flaky — you pretty much have to unpair and re-pair it every time you want to use it before visionOS will recognize it and account for it in passthrough, though I suspect that part is Apple's bug. Together, it all makes the Muse feel pretty lame to use.

Anyway, my stylus-drawing code works on visionOS too

@stroughtonsmith You know you’ll have to ship this, right?
@stroughtonsmith wasn’t AI supposed to work instead of us??

@stroughtonsmith Photoshop is your example of an easy to use tool?

I feel like you’re making Pixelmator to Blender’s Photoshop.

@TheEjj Photoshop is a mid-complexity pro tool. You don't need a degree in it like you do for 3D modeling. That's why it's much easier to clone, too
@stroughtonsmith The state of actual desktop class apps on iPadOS is pretty sad. This would be an immediate improvement already.

@stroughtonsmith next thing to tackle: USD.

Just spitballing here, but while on one hand you could just treat USD as an import format, you could also go all the way: use USD Stage as the core data model, refactor the renderer out, make it a HydraRenderDelegate, and have your property manipulation drive sparse edits on target layers. 👀

@stroughtonsmith 5.4 is more token efficient, so yes it’s more expensive per token but not clear how much more usage on the $20 plan it really uses. Looks to be about the same for me. Setting to high or xhigh really burns through the quota though. I use https://codexbar.app/ to keep an eye on that.
CodexBar

Tiny macOS menu bar app that keeps your Codex, Claude Code, Cursor, Gemini, Antigravity, Droid (Factory), Copilot, and z.ai limits visible.

@stroughtonsmith you know it's funny but I can't help but ponder at the global combined number of developer hours and GPU hours wasted making glass 🤪
@stroughtonsmith @alexr for true retro you need a disco ball!
@stroughtonsmith Chef’s kiss (and why can’t Genmoji do the obvious here)
@stroughtonsmith App Store screenshot
@stroughtonsmith the guy from the design team that left cleaned his desk and took all that glass crap away.
@stroughtonsmith Like these vintage 1987 mirrored ones?
@stroughtonsmith do you think codex is better than Claude Code?
@eierund I've never used Claude and I haven't found a reason to yet
@stroughtonsmith context starts to fall off at around 260k iirc, it’s very easy to pollute so be mindful of that if you do see some issues.
5.4 by all accounts is a huge step up
@stroughtonsmith Have you peeked under the hood at much of the code for this project? I saw you mention that you were completely hands off with the code for this one, but I'd definitely be curious to get your take on the overall code quality you're seeing from these models.

@brndnsh people keep asking me that. It writes code to the standard you ask it to write to. It's been following my style guide, so all of its code looks like mine.

Except for the Metal renderer, that's voodoo.

@stroughtonsmith It looks like you're quickly progressing towards an eventual release. I think you're doing more than playing around at this point. haha
@MacObservatory the project is still in my temp folder 😶

@stroughtonsmith Petition to rename all A Series SoC as M Series "Mini".
A14 -> M1 Mini
A18 Pro -> M4 Mini

The benchmarks don't lie.

@stroughtonsmith The moment you vibe code it and it doesn’t have a modal pop up whenever it connects to the watch, we will know we’ve achieved the singularity
@stroughtonsmith yes please! Would be interesting how much of the toolchain can be included out of the box (of course not AppStore compatible)
Frida (@cr4zyengineer) on X

Ever wanted Xcode on iOS it self? Entirely offline without even a Mac or cloud compiling… then I created the solution.. Nyxian.. it runs on jailed iOS.. and is open source, you can compile and research it your self.. it even has increment and threading. https://t.co/NP8jZ1Nj2P

@stroughtonsmith Much like with the windowed UI, I feel like Apple is putting in more work avoiding Xcode while previously making Swift Playgrounds nearly capable enough. It seems simpler to just have Xcode at this point.

@stroughtonsmith @viticci

An Xcode you can develop on with only an iPad, or an Xcode you can develop for the iPad only, on a laptop?

@stroughtonsmith only available on M3+ iPads, though 😩
@gregggreg oh no this runs on devices before that. I think it needs an M1/A14-class device or higher
@stroughtonsmith using GPU in background tasks requires M3+ iPad I thought?
@gregggreg do you have a source for that? I haven’t seen that documented

@stroughtonsmith https://developer.apple.com/forums/thread/797538?answerId=854825022#854825022

"...which is that the iPhone 16 Pro does not support background GPU. I don't know of anywhere we formally state exactly which devices support it, but I believe it's only support on iPad's with an M3 or better (and not supported on any iPhone)."

[iOS 26 Beta] BGTaskScheduler.supp… | Apple Developer Forums