so it turns out that much (if not all?) of the issue with husband's possibly now-former long-term client is that they put some of his work into ChatGPT and it told them it had *loads* of errors. they then sent the list to him ... trouble is that of all the errors it found, only two were real. one was a single-letter misspelling of a single word (which has since been fixed), and the other was a mistake the client themselves had introduced after husband's involvement.
the rest — including dates and all — were entirely fabricated by AI. and the client didn't bother to read any of it or check it or ...
he's replied to them and explained everything point by point, but whether they'll actually read his email or just feed it to the AI again ...
🫠