This is one of the things that frustrates me about these LLM-based coding tools: too much wrongheaded certainty. I've been using these classes and their ancestors for going on 30 years now, and I sure as hell don't know the answer off the top of my head. Saying "I don't know" is far better than hallucinating an incorrect answer.
@paul Well, I'm a human and a developer (though in other languages), and I don't even get the question 😅
But I'm not an LLM, so I can plainly say: "What is the question? What is the problem you'd like to solve?"
My experience so far with these LLMs: write more prose. Explain your task, describe the problem and what you have tried, then ask for a (better) solution or let the model "analyze" it. Short (trick) questions are not what LLMs are good at.