If using LLMs for development is so great, why are people building so many tools to sandbox them? "It works great, but can be catastrophically wrong" does not sound great at all.
@tymwol prompt injection is a hell of a drug.