Have you noticed that whenever you get one of those 6- or 8-digit authenticator codes there's an easy mental way to chunk it? E.g. consecutive numbers, palindromes, repeats, familiar sequences (e.g. 911, 1066, etc). Presumably this is all just randomness, but could you make a least-memorizable code?
Like one where you just literally have to memorize the eight digit sequence because there are no natural chunkings available.

@zachweinersmith.bsky.social

This is the idea of compressibility. Fix an algorithm; the compressibility of a string of data with respect to that algorithm is the length of the shortest input that produces that string as output. Incompressible (i.e. random) strings are those where the shortest input that produces the string is no shorter than the string itself.
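The true shortest-input measure (Kolmogorov complexity) is uncomputable, but any real compressor gives an upper bound on it: the compressed output is one "input" that reproduces the string. A minimal sketch using zlib as that stand-in (the example strings are my own):

```python
import os
import zlib

# zlib's compressed length is an upper bound on description length:
# decompressing the output reproduces the original string exactly.
def compressed_len(s: bytes) -> int:
    return len(zlib.compress(s, 9))

structured = b"1234" * 8      # 32 bytes with an obvious repeating pattern
random_ish = os.urandom(32)   # 32 random bytes, incompressible w.h.p.

# The patterned string compresses well below its raw length; the random
# bytes typically come out *longer* once zlib's format overhead is added.
print(compressed_len(structured), compressed_len(random_ish))
```

That gap is the whole chunking intuition in miniature: "12341234..." has a short description, random bytes don't.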

1/?

@zachweinersmith.bsky.social

The thing is, you can always change your algorithm so that any specific output is very compressible (e.g. something akin to "if input == 0: return 7568301222338375"). But you can never do this for *every* output at the same time, because you simply run out of short input strings.
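The running-out is just counting: there are 2**n binary strings of length n, but only 2**n - 1 inputs that are strictly shorter, so no algorithm can hand a short input to every length-n string. A quick check of the arithmetic:

```python
# Inputs strictly shorter than n: one empty string, 2 of length 1,
# 4 of length 2, ..., 2**(n-1) of length n-1. The sum is 2**n - 1,
# which is one fewer than the 2**n strings of length n.
def shorter_inputs(n: int) -> int:
    return sum(2**k for k in range(n))

n = 8
print(2**n, shorter_inputs(n))  # 256 strings, only 255 shorter inputs
```

So at least one string of every length must be incompressible, no matter how the algorithm is rigged.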

@zachweinersmith.bsky.social

Thus you can prove that for any fixed algorithm, the vast majority of strings will be incompressible/random.

The really interesting thing is that this formal concept of randomness is quite new, despite the idea of randomness/chaos/chance being positively ancient.