Here's a leading question: would you choose an LLM-based coding solution if you had an alternative that you knew was 100% accurate and never hallucinated? Yeah, that's what I thought. We can make that. Code generation doesn't have to "vibe".
@headius That’s what agents are, though. You give the AI tools so it catches its own hallucinations, and they never actually reach the final output. It can obviously still produce bad or broken code, but this is the whole premise of what I call harness engineering. Or is that what you’re talking about?