At first, the talk about AI "becoming" sentient while boasting that you have "no introspection" felt contradictory (and breathtakingly stupid).
Then I realised: it's much easier to get something "sentient" if sentience doesn't include the capability of introspection.
🤷🏻♂️