If I understand information theory right, then, given "Socrates is human" and "all humans are mortal", there is no information in "Socrates is mortal", as it can be derived from the given statements. But what about the computational cost of deriving conclusions from statements? Does information theory not capture or consider that at all?

Aquinas said that angels immediately know all the consequences of their knowledge. But humans do not.

1/2

Just to give an obvious example: in cryptography, the public key is known. The private key is derivable from the public key, but the computational cost is so high that it seems obvious that the private key is information, and in fact quite valuable information.
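A toy illustration of that point (textbook RSA numbers, not real cryptography; the function name is mine): the private exponent d is fully determined by the public key (n, e), so it carries "no new information" in the Shannon sense, yet recovering it means factoring n, which is cheap here only because n is tiny.

```python
def smallest_factor(n):
    """Brute-force trial division: exponential in the bit length of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

n, e = 61 * 53, 17              # toy public key: n = 3233
p = smallest_factor(n)          # feasible only because n is tiny
q = n // p
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)             # private exponent, derived from public data alone

ciphertext = pow(42, e, n)
assert pow(ciphertext, d, n) == 42   # round trip with the derived key
print(d)
```

(The three-argument `pow(e, -1, phi)` modular inverse needs Python 3.8+. At real key sizes, ~2048-bit n, the factoring step is infeasible, which is the whole point.)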

So it seems that between information and compute there should be some theory that connects them, but I can't remember having come across anything like that. What am I missing?

2/2

@vrandecic Hi 👋 Aren't the theories you are missing the hardness assumptions?

I think that each process to "derive information" (or maybe, to "make information explicit") requires computational effort. While in some examples that effort may be negligible, in others (OWL?) it becomes tangible, and for others still there are whole applications built on top of that notion of infeasible computational effort (e.g., your cryptography example).

@uvdsl agreed!

And my question is whether there is some unified view of information that acknowledges that (and, implicitly, whether such a view would even be useful).

@vrandecic @uvdsl are you looking for Kolmogorov complexity?

@nichtich @uvdsl no. I can write a function that does prime factorization with very small Kolmogorov complexity. That wouldn't capture the computational cost of actually getting the factors.

It's closer to a combination of complexity theory and information theory.
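The factorization point can be made concrete. A sketch (function name is mine): the program below has a tiny description, so by the Kolmogorov measure its output adds almost nothing, yet its running time grows roughly like √n, which is exactly the gap the thread is pointing at.

```python
def factorize(n):
    """Return the prime factorization of n by trial division.

    The *description* of this function is a handful of lines (low
    Kolmogorov complexity), but its *running time* is exponential
    in the bit length of n.
    """
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)   # leftover prime factor
    return factors

print(factorize(2 ** 4 * 3 * 97))   # → [2, 2, 2, 2, 3, 97]
```

The same short program, handed a 2048-bit semiprime, would run for an astronomically long time: the description length stays constant while the compute cost explodes.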