Does anyone know anything about 'entropy for integers'?
You define it like this:
\[
H(n) = \sum_i \alpha_i \log p_i
\]
where \(p_i\) runs over the primes and the \(\alpha_i\) are the exponents in the factorization \(n=\prod_{i \in \Bbb N} p_i^{\alpha_i}\) (all but finitely many zero).
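For what it's worth, here's a minimal Python sketch of that quantity (`prime_factorization` is just my own trial-division helper, nothing standard). One thing worth noticing: the sum telescopes, since \(\sum_i \alpha_i \log p_i = \log \prod_i p_i^{\alpha_i} = \log n\).

```python
import math

def prime_factorization(n):
    """Return {p: alpha} with n = prod(p**alpha), by trial division."""
    factors = {}
    p = 2
    while p * p <= n:
        while n % p == 0:
            factors[p] = factors.get(p, 0) + 1
            n //= p
        p += 1
    if n > 1:  # leftover factor is prime
        factors[n] = factors.get(n, 0) + 1
    return factors

def H(n):
    """H(n) = sum over i of alpha_i * log(p_i)."""
    return sum(a * math.log(p) for p, a in prime_factorization(n).items())

# As defined, H(n) collapses to log n:
#   sum_i alpha_i log p_i = log(prod_i p_i**alpha_i) = log n
print(H(360), math.log(360))
```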
I'm actually interested in a slightly different quantity:
\[
K(n) = \sum_{d \mid n} d \log (n/d)
\]
which I now notice is even more similar to the entropy of a distribution. In fact, writing \(\sigma(n) = \sum_{d \mid n} d\) for the sum of divisors:
\[
K(n) = \sigma(n) \log n - \sum_{d \mid n} d \log d
\]
🤯🤯🤯🤯🤯🤯🤯
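A quick numerical check of that rearrangement (careful: pulling \(\log n\) out of the sum picks up a factor of \(\sigma(n)=\sum_{d\mid n} d\), not 1). A throwaway sketch with naive divisor enumeration:

```python
import math

def divisors(n):
    """All positive divisors of n (naive scan)."""
    return [d for d in range(1, n + 1) if n % d == 0]

def K(n):
    """K(n) = sum over d|n of d * log(n/d)."""
    return sum(d * math.log(n / d) for d in divisors(n))

def K_alt(n):
    """sigma(n) * log(n) - sum over d|n of d * log(d)."""
    sigma = sum(divisors(n))
    return sigma * math.log(n) - sum(d * math.log(d) for d in divisors(n))

for n in (2, 12, 30, 360):
    print(n, K(n), K_alt(n))  # the two columns agree
```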

@mc

The sequence of exponents (including the zeroes) in the #PrimeFactorization itself forms a kind of #VectorLogarithm (as I think I used to call it): the #PrimeWise (component-by-component) sum of the exponent vectors of two numbers is the exponent vector of their product, so the log of the product is the sum of the logs.
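That log-like behaviour is easy to demonstrate: representing the exponent vector as a `Counter` makes prime-wise addition literally `+` (`exponent_vector` is my own name for this, and zero exponents are left implicit):

```python
from collections import Counter

def exponent_vector(n):
    """Exponent 'vector' of n as a Counter {p: alpha}; zeros implicit."""
    v = Counter()
    p = 2
    while p * p <= n:
        while n % p == 0:
            v[p] += 1
            n //= p
        p += 1
    if n > 1:  # leftover factor is prime
        v[n] += 1
    return v

# Prime-wise addition of exponent vectors corresponds to
# multiplication of the integers, like log(ab) = log(a) + log(b):
print(exponent_vector(12) + exponent_vector(35) == exponent_vector(12 * 35))
```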