@jimmylittle @outadoc @marcoarment ok so we are not addressing software, strictly speaking; we are addressing a cultural entity (however it may be bound up in/defined by software). that’s a distinction insofar as the bug is in *output* (as you yourself asserted). the output is faulty given the expectation.
llms are not search engines, or at least that’s not their primary function. the function is to aggregate, collate, and extrapolate through imitation. that function (the output) is fundamentally different from what google is oriented to deliver (historically speaking).
there are so many different products conflated into ai (and/or that claim to be enhanced/informed by ai) that it can be tricky to speak generally, but as far as objections go, a fundamental one concerns how these services are framed. they cannot reason.
for folks who believe they can, and/or that cognition/sentience is not a necessarily embodied phenomenon, the agi singularity is an article of faith. it is not rational.