Liquid Glass’s blurred content everywhere is especially cruel to those of us who use reading glasses or progressives.

The reflex when we see blurry text on our phones is to adjust our viewing angle or distance to sharpen it. But, of course, it’s not our eyes’ fault, the text doesn’t sharpen, and the effort just causes eyestrain.

Text on my phone should never be blurry.

You may ask, “How many people could this possibly affect?”

Well…

@marcoarment I'm gonna be that guy, but ChatGPT is not the right tool for this

@outadoc @marcoarment I think LLMs are the perfect tool for this. I'm curious why you don't agree.

LLMs are great at parsing text and aggregating it. Their entire existence is based on modeling languages. World-knowledge LLMs search the internet far better than plain-old-Google in my experience. Factual hallucinations are still an issue, but have been dramatically reduced in the last year.

After a few minutes of "regular" Googling, everything in this screenshot is accurate.

@jimmylittle @outadoc @marcoarment what's a 'factual' hallucination as distinct from... what?

@oscarjiminy @outadoc @marcoarment There are all kinds of hallucinations. LLMs have a current problem of seeing things that aren’t there as things they perceive as facts.

A guy on mushrooms seeing dancing pink elephants is a different kind of hallucinating.

There is an important distinction. 🍄

@jimmylittle @outadoc @marcoarment it's a specious distinction in that we are not addressing human subjects

so far as machine output goes, 'factual' hallucinations are identical to any other output you may term a hallucination

@oscarjiminy @outadoc @marcoarment To be clear: *I* don’t term them hallucinations; the industry does.

I consider them bugs in the output.

@jimmylittle @outadoc @marcoarment so what's a factual bug?

hallucination's a cultural adaptation to describe an emerging phenomenon

@jimmylittle @outadoc @marcoarment it's a perfectly cromulent term