So, after 1-2 months of testing ChatGPT, I have to curb my enthusiasm a bit:
1) ChatGPT misses the "why": when you (a human) work on something, you think step by step, and you make mental notes of why you did something one way rather than another. ChatGPT skips that step, and when asked, it cannot reflect on why it generated something. Instead, it gives you a good-sounding answer based on how someone else might have explained such a choice. That makes it appear to be reflecting, but it isn't.