@proedie No, that's not how information works. Information is about reducing your uncertainty space. Every time you can exclude half of the uncertainty space, you will have gained one bit of information. If you exclude less than half of the uncertainty space, you will have gained less than a bit of information. Just ask Claude[1].
Looking at a broken clock[2] does not reduce your uncertainty space at all, so you gain zero bits of information. The classic formula Claude Shannon is famous for divides the volume of the uncertainty space after gaining information by the volume of the uncertainty space before gaining information, then takes the base-2 logarithm of the ratio and negates it. If you don't care a minus one bit about negative amounts of information, you can flip the ratio upside down; then the negation isn't necessary. But there are didactic reasons for presenting it the classic way.
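To make the formula concrete, here's a tiny sketch (the function name and the example volumes are just illustrative):

```python
from math import log2

def bits_gained(vol_before: float, vol_after: float) -> float:
    """Information gained when the uncertainty space shrinks
    from vol_before to vol_after, in bits (base-2 log)."""
    # Classic form: ratio after/before, log2, then negate...
    return -log2(vol_after / vol_before)
    # ...equivalent to log2(vol_before / vol_after) with the ratio flipped.

print(bits_gained(12, 6))   # excluded half the space: 1.0 bit
print(bits_gained(12, 9))   # excluded only a quarter: ~0.415 bits
print(bits_gained(12, 12))  # broken clock, nothing excluded: 0.0 bits
```

The broken-clock case falls out of the math directly: the ratio is 1, and log2(1) = 0.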
[1] Claude Shannon, an overall smart human who measured the entropy of information. Who were you thinking of?
[2] Well, there's the minor issue of knowing that the clock is broken, lest you erroneously throw out parts of your uncertainty space that might actually be valid. The problem of information-resembling text applies to chatbots too.