I made GitHub Copilot write a raindrop sound synthesizer in Swift (plus some SwiftUI and Charts) and wrote about how it feels, as an experienced developer, to wrangle an LLM-based coding assistant for Swift/Mac development.

https://www.cocoawithlove.com/blog/copilot-raindrop-generator.html

@cocoawithlove Thanks for writing this up. I’ve never used an LLM, and it was really interesting to hear about your experiences.
@mattiem @cocoawithlove this actually inspired me to give ChatGPT a try at implementing a simple Metal shader, an area I know almost nothing about. And it wasn’t terrible! Not great, but it gave me enough to get started. And that was with only about 10 minutes of playing around.
@calicoding @cocoawithlove Something I always wonder about is what you miss out on learning by skipping to (more or less) an answer. You never know what you’ll stumble across along the way, but sometimes there’s nothing other than frustration…

@mattiem for sure. Especially if something is wrong and I need to fix it. For example, I grabbed a random shader from Shadertoy, had ChatGPT convert it to Metal, dumped it in a SwiftUI view, and performance was terrible. No idea how to fix that.

At the same time, there’s a limit to what I can learn, and I’m just trying to get something quick. If I really wanted to learn about Metal shaders I would do the homework.
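(For anyone curious what the "dumped it in a SwiftUI view" step can look like: here's a minimal sketch using SwiftUI's Shader API from iOS 17/macOS 14. The poster doesn't say how they actually wired it up, and the `ripple` function name and its single `time` parameter are assumptions for illustration; it presumes a matching `[[stitchable]]` Metal fragment function compiled into the app's default shader library.)

```swift
import SwiftUI

// Assumes a file like Ripple.metal containing something shaped like:
//   [[stitchable]] half4 ripple(float2 position, half4 color, float time) { ... }
struct ShaderDemoView: View {
    let start = Date()

    var body: some View {
        // TimelineView redraws every frame so the shader gets a moving clock.
        TimelineView(.animation) { context in
            Rectangle()
                .colorEffect(ShaderLibrary.ripple(
                    .float(context.date.timeIntervalSince(start))
                ))
        }
    }
}
```

A per-frame `TimelineView(.animation)` like this is also one of the first places to look when performance is bad: it forces continuous redraws whether or not the shader needs them.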

@calicoding TODO: learn how this works one day 😉

@mattiem It turns out that good ol’ CAEmitterLayer is good enough for this silly little app.
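(A rough sketch of what that CAEmitterLayer setup can look like. This is a guess at the approach, not the app's actual code: `dropImage` is assumed to be a small streak-shaped CGImage you supply, and all the numbers are placeholders to tune by eye.)

```swift
import QuartzCore

func makeRainLayer(bounds: CGRect, dropImage: CGImage) -> CAEmitterLayer {
    let rain = CAEmitterLayer()
    rain.frame = bounds
    rain.emitterShape = .line                 // emit along a horizontal line at the top
    rain.emitterPosition = CGPoint(x: bounds.midX, y: 0)
    rain.emitterSize = CGSize(width: bounds.width, height: 1)

    let drop = CAEmitterCell()
    drop.contents = dropImage
    drop.birthRate = 60                       // drops per second
    drop.lifetime = 4                         // seconds before a drop is removed
    drop.velocity = 500                       // points per second
    drop.velocityRange = 100                  // vary the speed a little per drop
    drop.emissionLongitude = .pi / 2          // aim toward +y; flip if your layer's geometry is flipped
    drop.scale = 0.1
    drop.scaleRange = 0.05
    rain.emitterCells = [drop]
    return rain
}
```

The appeal of CAEmitterLayer for a "silly little app" is exactly this: a few declarative properties and Core Animation runs the whole particle simulation off the main thread, no shader code required.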

Now let's see if I can actually stay up late enough to see this live LOL