Back at it
Tile-based multi-pass Metal raytracer

Operating well outside my area of expertise here with Codex, but it's doing a pretty credible job with a lot of complex Metal rendering code I would never be able to wrap my head around

(Ignore the UV issues)

'We have Valence at home!'
You can just do stuff, now. It's wild.
I'm not yet sure what the end goal is here, but I'm making a lot of progress regardless
Y'all know I'm using UIKit/Catalyst, right?

I started with a blank project template 5 hours ago. Now I've got a little 3D scene graph editor with gizmos, wireframe and shaded view modes, texturing, drag and drop OBJ file importing, and tile-based raytraced rendering, that runs on Mac and iPad.

Thanks Codex!

Some more glamour shots of this 3D app in the iPad Simulator
OBJ drag and drop
Multi-select and grouping
6720 loc

You know what I really could use at WWDC?

Teach a generative model to build high-quality vector SF Symbols so we can make custom ones, like say a full set for 3D modeling suites or PencilKit drawing apps, on demand 👀

…asking for a friend…


Upgraded my raytraced renderer with bounce lighting and true reflections. Now it looks pretty legit!
The floor also needed some bounce lighting
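None of the renderer's code appears in the thread, so as a hedged illustration only: the "true reflections" step comes down to mirror-reflecting the incoming ray direction about the surface normal and tracing the result. A minimal Python sketch (the `reflect` helper and its vectors are mine, not from the app):

```python
def reflect(d, n):
    """Mirror-reflect direction d about unit normal n: r = d - 2(d.n)n.
    d and n are 3-tuples; n must be unit length."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A ray heading down-and-right hits a floor with an up-facing normal
# and bounces up-and-right:
r = reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0))  # -> (1.0, 1.0, 0.0)
```

Bounce lighting is the same idea taken one step further: instead of only following the mirror direction, you scatter secondary rays over the hemisphere and accumulate the light they gather.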

'Could somebody with no programming experience recreate Photoshop with an LLM?'

I have absolutely zero Metal and near-zero 3D modeling experience. I know the basics of how to use a scene editor, and the names of rendering terms.

And I effectively vibecoded all of this in less than half a day with Codex 5.3 Medium, based on screenshots of a cool-looking app I've never used (Valence3D) and just a general sense of what a 3D app should do

Raytracing uses every last ounce of my poor M1 Mac mini's GPU. It has never seen such a workout
Since I am doing this on the GPU, not CPU, finding a way to render complex meshes without blowing the budget is a headache. There's a hard cap on how much work you can do before the Metal driver crashes
I think I should probably add a denoiser to this…
Added some targeted calls to the hardware raytracing APIs, and a toggle switch so I can flip between non-accelerated RT and accelerated RT. There are plenty of bottlenecks in my renderer that stop it from taking /full/ advantage of acceleration, but holy heck, it’s much faster regardless. This is on the M5 iPad Pro

Turns out the M1 supports enough of the hardware accelerated raytracing APIs that I get a massive speedup here, too?

Trying to parse Apple's Metal feature support tables is hurting my brain, so I'll just accept it and move on

Mandatory texture option
I can almost render this dragon mesh now before the GPU gives up 🐉
Submeshes and MTL texture loading
We're doing water now.
I think I'm just making Bryce now? That's a thing
Procedural water to the horizon, sky, island generation
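The thread doesn't show how the water works, but a common procedural-ocean trick is summing a few directional sine waves into a height field and displacing the surface with it. A Python sketch under that assumption (every amplitude, direction, frequency, and speed below is made up for illustration):

```python
import math

def water_height(x, z, t, waves):
    """Height of a simple procedural ocean at (x, z) and time t:
    a sum of directional sine waves. Each wave is
    (amplitude, (dir_x, dir_z), frequency, speed)."""
    h = 0.0
    for amp, (dx, dz), freq, speed in waves:
        phase = (x * dx + z * dz) * freq + t * speed
        h += amp * math.sin(phase)
    return h

# Two illustrative waves: a big slow swell plus smaller cross-chop.
waves = [(0.5, (1.0, 0.0), 0.8, 1.2),
         (0.2, (0.6, 0.8), 2.0, 2.5)]
```

The total height is always bounded by the sum of the amplitudes, which makes it easy to budget how far the surface can displace.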
Not me offloading raytracing workloads to my iPad because it's so much faster than my Mac…
I spent untold hours in Bryce3D as a kid making silly scenes just like this. It did this stuff with machines a thousand times less powerful than my Mac mini. Certainly puts things into perspective…
It's real nice having a complex app like this on iPad, that's for sure. This is by far the most complex and impressive thing I've vibecoded with Codex 5.3. I haven't seen or touched a single line of code, I didn't have a detailed plan to work from — this isn't at all like the structured ports I detailed previously
I thought my water should have some translucency and falloff, so I can put things underneath it
Procedural clouds seemed like a good idea

Recap: I vibecoded (code unseen, no plan) a 3D editor/renderer that has a scene graph, editing controls, primitives and gizmos, materials, procedural terrain and water, and hardware-accelerated Metal raytracing with soft shadows, clouds and bounce lighting, that runs on Mac and iPad.

Tool: Codex 5.3 Medium
Time: About a day's worth of work has gone into it

I figured I needed some kind of glass shader
The thing about material shaders is I don't quite know the right questions to ask — the unknown unknowns. Am I accounting for refraction, internal reflection, attenuated shadows, and so on correctly? Why is this too bright, why is this too dark? You can tell from these examples where things are obviously wrong, and it takes quite a bit of iteration
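For reference, the two pieces of math a glass shader usually leans on are Snell's law (which also tells you when total internal reflection kicks in) and Schlick's approximation to the Fresnel reflectance. A Python sketch of both — illustrative only, not the app's shader code:

```python
import math

def refract(cos_i, n1, n2):
    """Snell's law: given the cosine of the incident angle and the two
    refractive indices, return the cosine of the refracted angle, or
    None on total internal reflection (e.g. glass-to-air past the
    critical angle)."""
    sin_t2 = (n1 / n2) ** 2 * (1.0 - cos_i ** 2)
    if sin_t2 > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.sqrt(1.0 - sin_t2)

def schlick(cos_i, n1, n2):
    """Schlick's approximation to the Fresnel reflectance: how much
    light reflects (vs. refracts) at the interface."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_i) ** 5
```

At head-on incidence, air-to-glass (n = 1.5) reflects about 4% of the light; at grazing angles the Fresnel term climbs toward 1, which is why glass edges look mirror-like. Getting either term wrong is a classic source of "too bright / too dark" glass.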

'Attenuated shadows'

A little better

Maybe I should have bought a faster Mac before trying to write a raytracer…
Trying not to fry my GPU with caustics, but Metal isn't happy
I was sitting through hour-long renders (!) on my iPad yesterday, so I did an optimization pass on the hardware acceleration and it's much, much improved for simpler scenes, even on an M1
While this raytracer may never become a finished app, there are certainly elements from it I intend to yoink for future projects — like the really neat toolbars that go around all the screen edges; they would fit into a complex pro app very nicely
Just casually building and raytracing a scene on an iPad mini 6, nbd
@stroughtonsmith can one vibecode Cyberpunk 2077 with pathtracing? i wonder
@stroughtonsmith Thank goodness this project has been years in the making, or else I would be mad at the efficiency and rapid development.
@stroughtonsmith have you considered rendering at a smaller scale and using MetalFX to upscale? they even have a denoising upscaler now, seems like it was pretty much designed for exactly this
@finnvoorhees I'm already using the metalfx denoiser after every pass, but I want to render out high res images, not just upsampled ones
@stroughtonsmith Say, could you make a benchmarking mode in it? I’d love to be able to speed-test ML and graphics performance of my M2 Pro Mac mini against the MacBook Neo next week 👀
@dgriffinjones I don't think it's stable enough for me to be able to do that. I have to baby the renderer a bit to get good results. Also I don't know yet if my performance is gated on me rather than the hardware!
@stroughtonsmith Did you see Glaze from Raycast? https://www.glazeapp.com/

@stroughtonsmith Add a feature to use all the macs in the network as a render farm 😁
@stroughtonsmith a few things to look into that have to do with light interacting with objects: Rayleigh scattering, Mie scattering, the Kubelka-Munk scattering and absorption model, and the Saunderson correction to said K-M model. I have no clue if these are ever used in a typical ray tracing algorithm, but they come up when discussing the color development of coatings. My understanding of typical ray tracing is that it follows Snell's law, but I don't know beyond that.
@stroughtonsmith I guess everything is ultimately converging towards a Salvador Dali painting? I want to see liquid clocks now! 😂
@stroughtonsmith Water… then glass… oh I see now, all that work to recreate Liquid Glass 😁.
@stroughtonsmith but is it Liquid Glass 🙃
@stroughtonsmith I wrote a vastly less capable 3D editor/renderer for my final year project at university. Year being the operative word there as that’s roughly how long it took me.
@ankinson @stroughtonsmith And you learned something doing it.
@tsturm @stroughtonsmith Yes, of course. We had radically different goals though. Producing what Steve has in a day, even if it’s just a throw-away bit of fun, remains striking to me.
Makes you wonder whose code the LLM learned from. Is there a full Metal project out there already? Does it have similar default menu elements? How much of it is novel here? How much more optimized would it be hand-crafted? Is the difference negligible?
@stroughtonsmith With your long-time experience and know-how about Apple’s SDKs, do you think the code it generated was high quality or did you need to make a lot of manual adjustments?
@Arcticulate the code is effectively indistinguishable from my own. I have made zero adjustments
@stroughtonsmith Ah! Thanks, that’s genuinely interesting.
@stroughtonsmith I was always impressed by your skills to go from idea to PoC to shipping, but this is next level. I need my own project now to try this tech to its fullest. I work in a large enterprise where AI means Microsoft Copilot. I tried Claude on a smaller personal project and was impressed but never thought you could vibe-code a working 3D app in less than 24 hrs.
@stroughtonsmith so you did nothing and deserve no credit for it
@nicolas17 I've done a ton, learned a ton, and spent two 16-hour days working on it. That's like saying managers do nothing and deserve no credit. If you have a 3D modeling tool in your back pocket, feel free to present it 😉
@stroughtonsmith so it's not even helping you work less hours?
@nicolas17 this is a bigger project than the others I've tried

@stroughtonsmith is this ever coming to TestFlight? It's making me miss Bryce so much

Sunsets next?

@simsaens this is still a ‘throwaway prototype’ 😅 I wasn’t planning on productizing it
@stroughtonsmith @simsaens sure, but give it another week. 😉
@TheEjj @stroughtonsmith in another week it will be Blender, I want Bryce
@stroughtonsmith you are slowly building the Trapper Keeper I had in the 90s. 💗
@stroughtonsmith Looks like the excitement made you stay up all night …

@stroughtonsmith is this going to be broadly available? Any chance for an iPhone version?

Seeing these updates has made me incredibly nostalgic for iTracer, which was a basic 3D rendering app for the original iPhone, but also the first app that made me think, “woah, this is literally just a small computer.”

@stroughtonsmith One area that 3D software is only touching the edges of is using primitives to block out the scene and creating adhered AI renders using reference images. Currently a multi-step process or a kludgy Blender add-on, but so useful.
@stroughtonsmith loved loved loved Bryce and Strata. was so fortunate to have access to Power Macs in summer school. @ljharb
@stroughtonsmith That sounds like a handy handoff feature for those with older Macs but newer iPad/iPhone.
@stroughtonsmith Watching the progress of this app over the past day has been incredible.
@edmn I'm just as much along for the ride as everybody else
@stroughtonsmith love the Amiga reference :)
@stroughtonsmith funny thing is, they say our jobs will be over with vibe-coding. But I don't see everyone creating apps now. Just because there are good tools doesn't make everyone a carpenter.