The notion of a broken clock being sometimes right is based on a gross misunderstanding of what information is.

A clock that always shows the same time is never right, even in the moments of the day when the time happens to be what it shows, because you don't gain any information about what time it is by looking at the clock.
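One way to make "never right" precise is Shannon's mutual information: a stuck clock's display shares zero bits with the true time, while a working clock's display shares all of them. A minimal sketch of that, assuming a 12-hour clock for illustration:

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Mutual information I(X;Y) in bits, from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

hours = range(12)
working_clock = [(h, h) for h in hours]  # display tracks the true hour
stuck_clock   = [(h, 7) for h in hours]  # display always reads 7

print(mutual_information(working_clock))  # ≈ 3.58 bits, i.e. log2(12)
print(mutual_information(stuck_clock))    # 0.0 bits
```

Even in the hour when the stuck clock "happens to be right", reading it reduces your uncertainty by exactly nothing.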

This reasoning also applies to chatbots. If you can't tell whether what you have been given is useful information unless you already know the information, then you haven't been given useful information.

@riley That's a very useful angle on it. Where I think this gets interesting is that there's information which is, so to speak, self-certifying. Consider a proof, written in a form that's subject to a deterministic mechanised check. In many ways, it doesn't matter where you got it from: a Ouija board, a demon whispering, hard work, or an LLM. If the proof correctly typechecks, the theorem is true. Now if we consider that programs are proofs...
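A minimal sketch of that self-certification in Lean 4 (any trivial theorem would do as an example): once the kernel accepts the proof term, its provenance is irrelevant.

```lean
-- If Lean's kernel accepts this, the statement is true no matter
-- who (or what) produced the proof term.
theorem sum_comm (a b : Nat) : a + b = b + a := Nat.add_comm a b

-- Under propositions-as-types, the "proof" is just a program that
-- typechecks against the statement, viewed as a type.
#check (Nat.add_comm : ∀ (a b : Nat), a + b = b + a)
```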

@modulux A proof is not information in a strict sense, and largely for exactly this reason: it's self-contained (or, well, can be, with sufficient formalism available).

In a broad sense, there's some very interesting philosophy that can be done about the notion of the information content of The Book. But it's mostly the kind of philosophy that requires a larger mug of beer than would be conducive to my upcoming meetings[1], so, as the old Orcish saying goes, nar udautas.

As a general rule, I tend to prefer the interpretation that a proof is a series of "I'd now like to bring your attention to ..." kind of steps: they don't add anything (directly) to your mental map; they suggest where you should look to find interesting things that are already on the map.

[1] A children's book I once read included a character, a mathematics professor, who argued that it is pointless to ask questions, because there are two possibilities: the answer is either known or not known. If it's known, what's the point of asking it again? If it's not known, what's the point of asking when there won't be an answer?

And, well, while it's silly in an obvious way, this kind of reasoning actually comes up in the context of proofs-as-information.

@modulux (In case you're not familiar with Famous #ADHD People, The Book is a piece of Erdős Pál lore.)

@riley Yes, I've heard of it; the most elegant possible proof for a given theorem, roughly?

Rather, I was thinking of your claim that proofs aren't information. I see why you said it, but it doesn't seem intuitive when we compare it to other ways we use the notion.

For example, let's say we have a composite number pq. Generally speaking, we would say that getting p and q is additional information. But the proof that some particular p and some particular q multiply to pq would contain no information. It's rather odd to think about.
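A sketch of that asymmetry, using naive trial division as a stand-in for the hard direction:

```python
def verify(p, q, n):
    """Checking a claimed factorisation is one multiplication: the
    "self-certifying" direction discussed upthread."""
    return 1 < p and 1 < q and p * q == n

def factor(n):
    """Finding p and q in the first place: naive trial division here,
    but no known classical method is fast for large n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None

n = 9991  # = 97 * 103
p, q = factor(n)
print(p, q, verify(p, q, n))  # 97 103 True
```

Knowing p and q feels like new information; yet the verification step that certifies them is trivial and, on the view above, adds nothing.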

@modulux You know how numeric probabilities can vary depending on how equipossibility is defined, and how it can sometimes be left implicit, with multiple equally plausible "obvious" definitions?

Modelling the information flow of abstract mathematics runs into the same sort of problems. Nobody has axiomatised it; there's a bunch of common intuitive assumptions, but a lot of them are ... well, you can pry them loose and justify them if you want to, and sometimes get interesting results this way. But a lot of the time you don't get anything, or you'll have to nail down your own (quasi-)axioms first. These aren't like the axioms of modern geometry; they're really more like what Eukleides wrote at the beginning of The Elements, and then never did anything with, because it didn't make any sense.[1]

So you see why I suggested a huge mug of beer for dealing with this stuff.

[1] Caveat: if you go searching, a lot of sources offer modern axiomatic geometry instead of Eukleides' original work, still because his vague notion of foundations didn't make sense, and now we actually have the axioms that could have been used for the conclusions he went on to, pardon the pun, draw. Most of the rigorisation work was done in 1600s Italy; the lingering hairy problem of the Parallel Axiom was eventually solved by Lobachevskiy in the early 1800s, who demonstrated that it can be reversed without breaking anything else, and Euklidean geometry as understood by modern mathematics generally rests on Hilbert's[2] work from the pinnacle of the 19th century, as in, it was published in 1899. But it can be great fun to read translations of the original Elements, including the crappy parts.

[2] You might have heard of his hotel, which has a countably infinite number of rooms. Ijon Tichy was a repeat customer.

@modulux Oh, btw: Turing's Machines are this way, in part, because people genuinely used to try to go with the notion that information flows in mathematics are like frictionless spherical cows in a vacuum. For some things, it's a great simplification; for others, well, it didn't work out, and we ended up with Complexity Theory.