Would you return a hard drive with 1 uncorrectable error after 130 hours of work?
Seriously, do not use LLMs as a source of authority. They are stochastic machines predicting the next character they type; if what they say is true, it's pure chance.
Use them to draft outlines. Use them to summarize meeting notes (and review the summaries). But do not trust them to give you reliable information. You may as well go to a party, find the person who's taken the most acid, and ask them for an answer.
Because it's like a search box you can explain a problem to and get a bunch of words related to it, without having to wade through blogspam, 10-year-old Reddit posts, and snippy Stack Overflow replies. You don't have to post on Discord and wait a day or two hoping someone will maybe come and help. Sure, it's frequently wrong, but it's often a good first step.
And no, I'm not an AI bro at all; I frequently have coworkers dump AI slop in my inbox and ask me to take it seriously, and I fucking hate it.