Back at it
Tile-based multi-pass Metal raytracer

Operating well outside of my level of expertise here with Codex, but it's doing a pretty credible job at a lot of complex Metal rendering code I would never be able to wrap my head around

(Ignore the UV issues)

"We have Valence at home!"
You can just do stuff, now. It's wild.
I'm not yet sure what the end goal is here, but I'm making a lot of progress regardless
Y'all know I'm using UIKit/Catalyst, right?

I started with a blank project template 5 hours ago. Now I've got a little 3D scene graph editor with gizmos, wireframe and shaded view modes, texturing, drag and drop OBJ file importing, and tile-based raytraced rendering, that runs on Mac and iPad.

Thanks Codex!

Some more glamour shots of this 3D app in the iPad Simulator
OBJ drag and drop
Multi-select and grouping
6,720 LOC

You know what I really could use at WWDC?

Teach a generative model to build high-quality vector SF Symbols so we can make custom ones, like say a full set for 3D modeling suites or PencilKit drawing apps, on demand 👀

…asking for a friend…

@stern

Upgraded my raytraced renderer with bounce lighting and true reflections. Now it looks pretty legit!
The floor also needed some bounce lighting

"Could somebody with no programming experience recreate Photoshop with an LLM?"

I have absolutely zero Metal and near-zero 3D modeling experience. I know the basics of how to use a scene editor, and the names of rendering terms.

And I effectively vibecoded all of this in less than half a day with Codex 5.3 Medium, based on screenshots of a cool-looking app I've never used (Valence3D) and just a general sense of what a 3D app should do

Raytracing uses every last ounce of my poor M1 Mac mini's GPU. It has never seen such a workout
Since I am doing this on the GPU, not CPU, finding a way to render complex meshes without blowing the budget is a headache. There's a hard cap on how much work you can do before the Metal driver crashes
I think I should probably add a denoiser to this…
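For context on the budget problem above: I never saw the app's actual code, but the usual shape of tile-based multi-pass scheduling is to carve the framebuffer into tiles and batch them so no single GPU submission exceeds a work budget (avoiding the driver watchdog). A minimal sketch of that idea in plain Swift, with all names and numbers hypothetical:

```swift
import Foundation

// Hypothetical sketch: split the framebuffer into fixed-size tiles, then
// group tiles into batches so each command buffer stays under a sample
// budget - keeping every GPU submission short enough that the Metal
// driver's watchdog never fires.
struct Tile { let x: Int, y: Int, width: Int, height: Int }

func makeTiles(imageWidth: Int, imageHeight: Int, tileSize: Int) -> [Tile] {
    var tiles: [Tile] = []
    var y = 0
    while y < imageHeight {
        let h = min(tileSize, imageHeight - y)
        var x = 0
        while x < imageWidth {
            let w = min(tileSize, imageWidth - x)
            tiles.append(Tile(x: x, y: y, width: w, height: h))
            x += tileSize
        }
        y += tileSize
    }
    return tiles
}

// Batch tiles so (pixels per batch) * (samples per pixel) never exceeds
// a per-submission ray budget.
func batchTiles(_ tiles: [Tile], samplesPerPixel: Int, rayBudget: Int) -> [[Tile]] {
    var batches: [[Tile]] = []
    var current: [Tile] = []
    var cost = 0
    for tile in tiles {
        let tileCost = tile.width * tile.height * samplesPerPixel
        if !current.isEmpty && cost + tileCost > rayBudget {
            batches.append(current)
            current = []
            cost = 0
        }
        current.append(tile)
        cost += tileCost
    }
    if !current.isEmpty { batches.append(current) }
    return batches
}
```

Each batch would then be one command buffer; the renderer accumulates results across passes instead of tracing everything in one dispatch.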
Added some targeted calls to the hardware raytracing APIs, and a toggle switch so I can flip between non-accelerated RT and accelerated RT. There are plenty of bottlenecks in my renderer to stop it taking /full/ advantage of acceleration, but holy heck it's much faster regardless. This is on the M5 iPad Pro

Turns out the M1 supports enough of the hardware accelerated raytracing APIs that I get a massive speedup here, too?

Trying to parse Apple's Metal feature support tables is hurting my brain, so I'll just accept it and move on

Mandatory texture option
I can almost render this dragon mesh now before the GPU gives up 🐉
Submeshes and MTL texture loading
We're doing water now.
I think I'm just making Bryce now? That's a thing
Procedural water to the horizon, sky, island generation
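I haven't seen how the app actually generates its islands, but the classic recipe for this kind of procedural terrain is a few octaves of value noise (fBm) multiplied by a radial falloff, so land rises near the center and sinks below sea level toward the horizon. A hypothetical sketch in plain Swift:

```swift
import Foundation

// Hypothetical sketch of procedural island generation: fractal value
// noise for terrain height, shaped by a radial falloff so everything
// far from the island center sits at sea level (height 0).
func hashNoise(_ x: Int, _ y: Int) -> Double {
    // Cheap deterministic hash -> pseudo-random value in [0, 1).
    var h = UInt64(bitPattern: Int64(x)) &* 0x9E3779B97F4A7C15
    h ^= UInt64(bitPattern: Int64(y)) &* 0xBF58476D1CE4E5B9
    h ^= h >> 31
    return Double(h % 10_000) / 10_000.0
}

func valueNoise(_ x: Double, _ y: Double) -> Double {
    let xi = Int(floor(x)), yi = Int(floor(y))
    let tx = x - floor(x), ty = y - floor(y)
    // Smoothstep interpolation between the four lattice corners.
    let sx = tx * tx * (3 - 2 * tx), sy = ty * ty * (3 - 2 * ty)
    let a = hashNoise(xi, yi),     b = hashNoise(xi + 1, yi)
    let c = hashNoise(xi, yi + 1), d = hashNoise(xi + 1, yi + 1)
    let top = a + (b - a) * sx, bottom = c + (d - c) * sx
    return top + (bottom - top) * sy
}

// Island height at (x, y) in [-1, 1]^2: fBm minus a radial falloff,
// clamped so open water stays at height 0.
func islandHeight(_ x: Double, _ y: Double, octaves: Int = 4) -> Double {
    var height = 0.0, amplitude = 0.5, frequency = 2.0
    for _ in 0..<octaves {
        height += amplitude * valueNoise(x * frequency + 7, y * frequency + 3)
        amplitude *= 0.5
        frequency *= 2
    }
    let falloff = sqrt(x * x + y * y)   // 0 at center, ~1.4 at the corners
    return max(0, height - falloff * 0.7)
}
```

The same heightfield doubles as a water mask: wherever the clamped height is zero, the procedural water takes over.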
Not me offloading raytracing workloads to my iPad because it's so much faster than my Mac…
I spent untold hours in Bryce3D as a kid making silly scenes just like this. It did this stuff with machines a thousand times less powerful than my Mac mini. Certainly puts things into perspective…
It's real nice having a complex app like this on iPad, that's for sure. This is by far the most complex and impressive thing I've vibecoded with Codex 5.3. I haven't seen or touched a single line of code, and I didn't have a detailed plan to work from; this isn't at all like the structured ports I detailed previously
I thought my water should have some translucency and falloff, so I can put things underneath it
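The standard model for that translucency-with-falloff look is Beer-Lambert attenuation: light passing through a water column of depth d is scaled by exp(-absorption * d) per channel, so shallow water shows the bottom and deep water fades toward the water color. A minimal sketch (assumed technique; the app's actual shader is unseen):

```swift
import Foundation

// Hypothetical sketch of depth-based water translucency via
// Beer-Lambert attenuation. Red is absorbed fastest and blue slowest,
// which is why deep water reads as blue.
struct RGB { var r, g, b: Double }

func waterColor(bottom: RGB, water: RGB, depth: Double,
                absorption: RGB = RGB(r: 1.2, g: 0.6, b: 0.3)) -> RGB {
    // Per-channel transmittance through `depth` units of water.
    let tr = exp(-absorption.r * depth)
    let tg = exp(-absorption.g * depth)
    let tb = exp(-absorption.b * depth)
    // Blend the seen-through bottom color toward the water color.
    return RGB(r: bottom.r * tr + water.r * (1 - tr),
               g: bottom.g * tg + water.g * (1 - tg),
               b: bottom.b * tb + water.b * (1 - tb))
}
```

At depth 0 the function returns the bottom color unchanged; as depth grows it converges to the water color, which is exactly the falloff effect described above.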
Procedural clouds seemed like a good idea

Recap: I vibecoded (code unseen, no plan) a 3D editor/renderer that has a scene graph, editing controls, primitives and gizmos, materials, procedural terrain and water, and hardware-accelerated Metal raytracing with soft shadows, clouds and bounce lighting, that runs on Mac and iPad.

Tool: Codex 5.3 Medium
Time: About a day's worth of work has gone into it

I figured I needed some kind of glass shader
The thing about material shaders is I don't quite know the right questions to ask: the unknown unknowns. Am I taking into account the right refraction, internal reflection, attenuated shadows, etc.? Why is this too bright, why is this too dark? You can tell from these examples where things are obviously wrong, and it takes quite a bit of iteration
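For anyone mapping those unknown unknowns: two of the standard ingredients a glass shader needs are Schlick's approximation for Fresnel reflectance (how much light reflects vs. refracts at a given angle) and Snell's law, including the total-internal-reflection case. A hedged sketch of both, not the app's actual code:

```swift
import Foundation

// Schlick's approximation to the Fresnel reflectance at an interface.
// cosTheta is the cosine of the angle between ray and surface normal;
// ior values of 1.0 (air) and 1.5 (glass) are typical defaults.
func schlickFresnel(cosTheta: Double, iorOutside: Double = 1.0,
                    iorInside: Double = 1.5) -> Double {
    let r0 = pow((iorOutside - iorInside) / (iorOutside + iorInside), 2)
    return r0 + (1 - r0) * pow(1 - cosTheta, 5)
}

// Snell's law: sin(t) = sin(i) * n1 / n2. Returns nil when the ray
// undergoes total internal reflection (no refracted direction exists),
// which happens going from dense glass to air at grazing angles.
func refractedSinTheta(sinThetaIncident: Double, iorFrom: Double,
                       iorTo: Double) -> Double? {
    let sinT = sinThetaIncident * iorFrom / iorTo
    return sinT <= 1.0 ? sinT : nil
}
```

Head-on (cosTheta = 1) glass reflects about 4% of light; at grazing angles (cosTheta near 0) reflectance climbs toward 100%, which is why over-bright or over-dark glass usually means one of these two terms is off.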

"Attenuated shadows"

A little better

Maybe I should have bought a faster Mac before trying to write a raytracer…
Trying not to fry my GPU with caustics, but Metal isn't happy
I was sitting through hour-long renders (!) on my iPad yesterday, so I did an optimization pass on the hardware acceleration and it's much, much improved for simpler scenes, even on an M1
While this raytracer may never become a finished app, there are certainly elements from it I intend to yoink for future projects, like the really neat toolbars that go around all the screen edges; they would fit into a complex pro app very nicely
Just casually building and raytracing a scene on an iPad mini 6, nbd
Of course it runs on iPhone, what do you take me for?
Liquid, Glass
So, like, what do I even do with this app?
I made my control groups collapsible, with a priority system. Honestly they're my favorite part of this prototype
The old viewport gizmo was faked in 2D, so I had it rewrite it in Metal and with a different projection, and now it's much better. I also added exponential decay to the orbit gesture so you can fling the camera around
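The exponential-decay fling mentioned above is a small, well-known trick: when the orbit gesture ends, carry its angular velocity forward and decay it by exp(-k * dt) each frame. A hypothetical sketch of how that might look (names invented; the real implementation is unseen):

```swift
import Foundation

// Hypothetical sketch of a camera fling: after the orbit gesture ends,
// the gesture's angular velocity decays exponentially, so the camera
// coasts and smoothly slows to a stop.
struct OrbitFling {
    var angularVelocity: Double   // radians/second at gesture end
    let decayRate: Double         // larger = stops sooner (per second)

    // Advance by dt seconds; returns the orbit-angle delta to apply
    // this frame. Uses the closed-form integral of v(t) = v0 * e^(-kt)
    // over [0, dt], so the result is frame-rate independent.
    mutating func step(dt: Double) -> Double {
        let decay = exp(-decayRate * dt)
        let delta = angularVelocity * (1 - decay) / decayRate
        angularVelocity *= decay
        return delta
    }
}
```

A nice property of the closed form: the total coast distance is exactly v0 / k regardless of frame timing, so the fling feels identical at 60 Hz and 120 Hz.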

Ha, cute, you can even fling the raytracer around 🤣

Also I added an expanded progress indicator

Just a normal teapot.

Hadn't tried the visionOS build, but it works too.

With a caveat.

visionOS is far more fragile to anything like 3D rendering. Saturating the GPU like this slows the compositor to a slideshow, and it even got to a point where Metal was leaking out of the window into the OS and I was seeing squares of corrupted video memory in front of me until I got the equivalent of a SpringBoard crash.

Functionally, this could be a visionOS app.

Practically, no.

You can see here that the moment I invoke the raytracer, everything goes to shit on visionOS. The userspace went down right at the end of the video, where it cut
What if you could step into your Bryce scenes?

Pretty much everything I've worked on with Codex up to now has been stuff I could have built myself, within my area of expertise (or learnable), it just would have taken weeks or months.

This 3D scene app is something I never would have been able to build myself. I would have needed a team of rendering experts with domain-specific knowledge and human-years of research

I love how visionOS, uniquely, *explodes* when rendering goes wrong.

Hello [MacBook] Neo.

@stroughtonsmith your results are amazing, but got me curious about Codex writing Metal, so I asked it to make a solar system simulation

After two hours of back-and-forth it still can't seem to texture a sphere in a way that doesn't cause the lighting and texture to rotate with the view
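For what it's worth, "lighting and texture rotate with the view" is usually a coordinate-space bug: UVs or normals being computed in view space (which moves with the camera) instead of object/world space. A hedged sketch of the camera-independent version, in plain Swift rather than actual shader code:

```swift
import Foundation

// Hypothetical sketch: sphere UVs from the OBJECT-space position and
// lighting from the WORLD-space normal. Neither input includes the
// view transform, so nothing rotates when the camera orbits.
struct Vec3 { var x, y, z: Double }

func normalize(_ v: Vec3) -> Vec3 {
    let len = sqrt(v.x * v.x + v.y * v.y + v.z * v.z)
    return Vec3(x: v.x / len, y: v.y / len, z: v.z / len)
}

func dot(_ a: Vec3, _ b: Vec3) -> Double { a.x * b.x + a.y * b.y + a.z * b.z }

// Standard spherical mapping: longitude -> u, latitude -> v.
func sphereUV(objectSpacePoint p: Vec3) -> (u: Double, v: Double) {
    let n = normalize(p)
    let u = 0.5 + atan2(n.z, n.x) / (2 * Double.pi)
    let v = 0.5 - asin(n.y) / Double.pi
    return (u, v)
}

// Lambert diffuse term from world-space normal and light direction;
// again independent of the camera.
func lambert(worldNormal: Vec3, worldLightDir: Vec3) -> Double {
    max(0, dot(normalize(worldNormal), normalize(worldLightDir)))
}
```

In a Metal shader the equivalent fix is transforming normals by the model matrix's inverse transpose (not the model-view matrix) before lighting.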

I gave it six goes at trying to get the bloom effect correct, but in the end had to paste an entire blog post into its context about how to render bloom in Metal

@stroughtonsmith watching you build these things over the past few weeks has been amazing. Do you think there is a chance you will create a blog post or a thread with more details about how you are prompting the AI agents?
@stroughtonsmith for my use, "vibe coding" is most helpful for edge cases where it would be a huge rabbit hole to implement a relatively small feature, so we're aligned there. Thank you for sharing your work online

@stroughtonsmith interestingly for me it is the reverse: a basic modeler with a basic built in ray tracer would take me maybe a day or two to build on my own with an imgui based UI.

But all of the native iOS/iPadOS/visionOS stuff would have required me to have dedicated Apple frameworks experts help out on. My experience with writing Apple native apps is super limited. But with Codex, I've now made several nice custom, just-for-me native apps to replace things that were just scripts before.

@stroughtonsmith Erm, is it real 3D? :)
@gklka it's Metal, it's not RealityKit
@stroughtonsmith Too bad. These 3D editors are perfect fit for Vision
@stroughtonsmith have you tried closing your eyes to make it render faster? 😂
@edmn the OS is doing it for me
@stroughtonsmith ooooh and a Suzanne, my favorite!
@stroughtonsmith that's how ya do it!
@stroughtonsmith at the start I thought you were going for something like a desktop Tinkercad or Shapr 3D, I would use that 😊
@stroughtonsmith You might have to get used to the idea of making stuff just for fun again :)

@stroughtonsmith It could be a scene blocking app. Attaching a material tag to each color which can be passed along to an AI generator with adherence instructions.

Currently > Reference Photograph > FSpy > Blender > Workbench Render Image with separated colours > Gemini is a bit fiddly. But an AI friendly simple modeller could make this a lot easier.

@stroughtonsmith you have the makings of a fine screensaver.
@stroughtonsmith make it a polished Storyboarder. It was a great tool but hasn't been supported in a while https://wonderunit.com/storyboarder
@stroughtonsmith Thank goodness this project has been years in the making, or else I would be mad at the efficiency and rapid development.
@stroughtonsmith have you considered rendering at a smaller scale and using MetalFX to upscale? they even have a denoising upscaler now, seems like it was pretty much designed for exactly this
@finnvoorhees I'm already using the MetalFX denoiser after every pass, but I want to render out high-res images, not just upsampled ones
@stroughtonsmith Say, could you make a benchmarking mode in it? I'd love to be able to speed test ML and graphics performance between my M2 Pro Mac mini and the MacBook Neo next week 👀
@dgriffinjones I don't think it's stable enough for me to be able to do that. I have to baby the renderer a bit to get good results. Also I don't know yet if my performance is gated on me rather than the hardware!
@stroughtonsmith Did you see Glaze from Raycast? https://www.glazeapp.com/

@stroughtonsmith I guess everything is ultimately converging towards a Salvador Dali painting? I want to see liquid clocks now! 😂
@stroughtonsmith I wrote a vastly less capable 3D editor/renderer for my final year project at university. Year being the operative word there as that's roughly how long it took me.
@ankinson @stroughtonsmith And you learned something doing it.
@tsturm @stroughtonsmith Yes, of course. We had radically different goals though. Producing what Steve has in a day, even if itโ€™s just a throw-away bit of fun, remains striking to me.
Makes you wonder whose code the LLM learned from. Is there a full Metal project out there already? Does it have similar default menu elements? How much of it is novel here? How much more optimized would it be hand-crafted? Is it negligible?
@stroughtonsmith With your long-time experience and know-how about Appleโ€™s SDKs, do you think the code it generated was high quality or did you need to make a lot of manual adjustments?
@Arcticulate the code is effectively indistinguishable from my own. I have made zero adjustments
@stroughtonsmith Ah! Thanks, thatโ€™s genuinely interesting.