I sometimes try to use the Microsoft #Copilot that comes bundled with #Office365 now. All the training for this feature warns you thoroughly to double-check answers and so on due to the hallucination problem. But it's still frustrating as hell when you give it a simple task and it fails miserably.

I told it to look in our company's cloud file storage for a document that had my name "Tim Farley" in it along with a particular CVE number (also in quotes). I was looking for an old report where I had written up a particular vulnerability. It very quickly showed me a link to a document that I recognized as one of my reports, and offered, "would you like to see the exact paragraph?" I said sure, show me the exact paragraph.

It then wrote me 257 WORDS explaining how it had screwed up, and that the CVE number I gave it is NOWHERE TO BE FOUND in that document. Included was some mumbo jumbo about how it uses parallel partial searches to do its work or some such. AND IT COMPLIMENTED ME on challenging its answer.

A ton has been written on how #AI might replace low-level jobs such as interns. But I swear to you, if I had an intern who behaved like this, I would put them on a performance improvement plan!

How is this acceptable performance for a product that people pay a bunch of money for? Would you buy Excel if on the bottom of every page it said "HEY YOU BETTER CHECK ALL MY MATH BECAUSE I MIGHT HAVE SCREWED UP SOMETHING"?

#AIFAIL #PutAIonaPIP