An example of enshittification and how badly Google has broken its core reason for existence, search: Google is using an LLM to "hallucinate" postcodes.
I dunno about other countries, but in Australia, if you get someone's postcode correct plus a detail as simple as the recipient's name or nickname, a street name without the number, or the name of a home, then our postal service has a remarkably high success rate for delivering a letter or parcel to the correct destination. You can write the wrong suburb or state in the address, but if you use the right postcode, Australia Post delivers. Without the correct postcode, though, you're up the creek without a paddle. The wrong postcode can delay delivery, or can get your mail the "Return to sender" treatment.
Google Search results _could_ index the database from Australia Post, or pay a moderate commercial licensing fee to access their API. But instead, Google's got an LLM making up postcodes. The results are near enough to appear convincing at first glance, such as using '3' as the first digit of postcodes in Victoria. Then they present them in the top position of Search results and have the cheek to use the Australia Post logo. In my experience, there appears to be a correlation between the numerical similarity of Google's fictional postcodes and the geographical proximity of suburbs. Google's "AI Overview" presents misleading nonsense instead of using traditional indexing or a trivial lookup table.
(More details in the Alt-text of the screenshot.)
#Google #GoogleSearch #AI #AIOverview #LLMs #Enshittification #AustraliaPost #Australia #Postcode #Zipcode