This study from Stanford shows that people who use GitHub Copilot produce code with more security flaws than people who don't; it's roughly the same size as the study GitHub keeps citing to claim Copilot makes developers faster. https://www.theregister.com/2022/12/21/ai_assistants_bad_code/
Study finds AI assistants help developers produce code that's more likely to be buggy

At the same time, tools like GitHub Copilot and Facebook InCoder make developers believe their code is sound

The Register
@seldo The real question is: how much of this stuff ends up in prod? And how many of the security issues stem from undefined behavior that's impossible in more secure languages?
@HoloPengin @seldo you can write insecure code in any language. How it’s insecure may change.
@laffer1 @seldo True, but memory safety issues are a pretty big error class that is rather difficult to cause in Rust
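For illustration (not part of the thread): a minimal sketch of the error class being discussed. A dangling reference escaping its scope is undefined behavior in C, but Rust's borrow checker rejects it at compile time; the hypothetical `escape_scope` function below shows the rejected pattern in a comment alongside the safe alternative the compiler accepts.

```rust
// Sketch of the use-after-free / dangling-reference error class
// that Rust's borrow checker rules out at compile time.
fn escape_scope() -> String {
    let result;
    {
        let value = String::from("hello");
        // result = &value;  // rejected: `value` does not live long enough
        result = value;      // ownership moves out instead, which is safe
    } // a borrow of `value` could not outlive this scope
    result
}

fn main() {
    println!("{}", escape_scope());
}
```

Uncommenting the borrow makes the program fail to compile, which is the point of the thread: the whole error class is caught before the code ever runs.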