pre-LLM google: enter query, check out the first couple links, compare accuracy
post-LLM google: enter query, get an AI summarised result that is just accurate enough, just frequently enough, that it lulls you into a false sense of security. after getting burned one too many times, you realise that you should read the summary, then check out the first couple links and compare accuracy
perplexity: enter query, get an AI summarised result that downplays the sources it used. get lucky a few times where it is (or seems) accurate, then realise it makes large mistakes, and crucially, there is no way to distinguish between accurate answers, hallucinations, and confident statements sourced from eleven-year-old sarcastic reddit comments. make a new habit: whenever you search perplexity for something whose accuracy you care about (i.e. anything), do a follow-up google search, read its AI summary, then check out the first couple links and compare accuracy
man this is so much better than the old way of doing things