In this blog post, I demonstrate a hands-on example of using AI tools with a legacy technology system to build the foundation for a modern software solution.

This is the SpecOps method in practice.

https://spec-ops.ai/blog/posts/reverse-engineering-legacy-app/

Using AI to Reverse-Engineer a Legacy Application into a Modern Software Specification

How we used AI to extract a complete specification from a 25-year-old Microsoft Access application, producing 13 modular documents in roughly 4–5 hours of collaborative work.

@mheadd I'm interested in whether you'll be able to demonstrate multiple test-passing versions based on different stacks from the same spec. And from there, I'll be very interested in the usability of the resulting implementations, and how/if that can also be tackled with AI and incorporated into the spec, with the downstream versions regenerated.

Also: Has it occurred to you yet that this is "what if waterfall, but the cost of development/change was zero"? 😉

@mogul Why wouldn't you be able to? I'm a bit confused by the waterfall reference.

I'm advocating for human-led, iterative work with an AI coding agent, and I tried to emphasize that in the post. I know some folks are talking about AI-only workflows, but that's not how I work with these tools. I also talk at length (and ad nauseam) about the critical role that human oversight, review, and verification play in the process. In fact, once you have a decent spec to work from, there's no requirement to use AI at all.

@mheadd Right, I understand the concept/ambition, just want to see the fully worked proof that the spec generated this way is "enough" to then generate and iterate in downstream implementations.

(Re: waterfall, it's the "document all your specs and verification up-front" aspect, not the lack of iteration. I'm just yanking your chain.)