Before I kick the bucket, I want to figure out how these frameworks fit together:

classical mechanics
classical statistical mechanics
classical field theory
quantum mechanics
quantum statistical mechanics
quantum field theory
thermodynamics

and probably some more. For example, one famous weird thing is that if you take classical statistical mechanics and replace

1/(Boltzmann's constant × temperature)

with

i × time / Planck's constant

in all your equations, you get quantum mechanics - more or less. So if you ignore the constants, this is saying that "imaginary time" - whatever the hell that is! - acts like "inverse temperature".
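Written out in symbols (this is just the standard dictionary, nothing original here):

    exp(-E/(kT))   →   exp(-iEt/ħ)

via the substitution 1/(kT) → it/ħ, which pairs up

    it   ↔   ħ/(kT)

- imaginary time on one side, ħ × inverse temperature on the other.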

Physicists use this fact a lot, but remain divided on whether it's "just a trick". I don't think something this big can be just a trick!

But there are other ways to set up this analogy. I wrote a paper with Blake Pollard where instead we said inverse temperature is analogous to 1/(i × Planck's constant), so that energy becomes analogous to action. We pushed this other analogy to the point of figuring out what in quantum mechanics corresponds to entropy in classical statistical mechanics. We called it "quantropy", and worked out a nice chart of the analogy - sketched below.
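Here is a rough sketch of that chart, reconstructed from the paper's abstract (keeping Boltzmann's constant explicit here; check the paper for exact signs and conventions):

    classical statistical mechanics      quantum mechanics
    -------------------------------      -----------------
    states                               histories
    energy E                             action S
    kT                                   iħ
    probabilities p ∝ exp(-E/kT)         amplitudes a ∝ exp(-S/iħ)
    entropy                              quantropy
    free energy                          free action Φ
    maximize entropy                     stationary point of quantropy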

But now I'm wishing we hadn't set Boltzmann's constant equal to 1. And I want to compare our analogy to the usual one, and figure out what the hell is going on. When there are multiple mathematically rigorous analogies between frameworks, you should get serious and study them all, not just pick one and ignore the rest.

I'm also annoyed that we didn't notice that the thing analogous to free energy, which I called "free action" or Φ, is what physicists call the "effective action".
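Concretely, the parallel runs like this (my paraphrase using the dictionary above; the paper's sign conventions may differ):

    free energy:  F = -kT ln Z,   where Z = Σ over states of exp(-E/kT)
    free action:  Φ = -iħ ln Z,   where Z = ∫ over histories of exp(-S/iħ)

so Φ is to the path integral what F is to the partition function.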

Here's our paper:

https://arxiv.org/abs/1311.0813

Quantropy

There is a well-known analogy between statistical and quantum mechanics. In statistical mechanics, Boltzmann realized that the probability for a system in thermal equilibrium to occupy a given state is proportional to exp(-E/kT) where E is the energy of that state. In quantum mechanics, Feynman realized that the amplitude for a system to undergo a given history is proportional to exp(-S/i hbar) where S is the action of that history. In statistical mechanics we can recover Boltzmann's formula by maximizing entropy subject to a constraint on the expected energy. This raises the question: what is the quantum mechanical analogue of entropy? We give a formula for this quantity, which we call "quantropy". We recover Feynman's formula from assuming that histories have complex amplitudes, that these amplitudes sum to one, and that the amplitudes give a stationary point of quantropy subject to a constraint on the expected action. Alternatively, we can assume the amplitudes sum to one and that they give a stationary point of a quantity we call "free action", which is analogous to free energy in statistical mechanics. We compute the quantropy, expected action and free action for a free particle, and draw some conclusions from the results.
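To unpack the maximum entropy step in that abstract - this is the standard textbook calculation, not anything specific to the paper:

    Maximize   S = -k Σ_i p_i ln p_i
    subject to   Σ_i p_i = 1   and   Σ_i p_i E_i = ⟨E⟩.
    With Lagrange multipliers λ and μ this gives
        -k (ln p_i + 1) - λ - μ E_i = 0,
    so p_i ∝ exp(-(μ/k) E_i). Writing μ/k = 1/(kT) and normalizing:
        p_i = exp(-E_i/kT) / Z,   Z = Σ_j exp(-E_j/kT).

The quantropy story runs the same calculation with probabilities replaced by complex amplitudes, energy by action, and "maximum" replaced by "stationary point", since complex quantities can't be maximized.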

@johncarlosbaez - I have always been intrigued by Chapter 10 of Feynman and Hibbs, Quantum Mechanics and Path Integrals (1965), in which the authors analyze a representation of the statistical density matrix in quantum statistical mechanics by means of a real-valued path integral. This is basically the Feynman-Kac formula, which can be constructed rigorously using Wiener measure.
@ltmccarty - yeah, that stuff is GREAT! My paper with Blake is heavily based on their calculations. We try to squeeze out a few new insights.
@ltmccarty - actually I hadn't carefully read Chapter 10. I'm looking at it right now. Thanks!
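(For anyone following along: the Feynman-Kac formula mentioned here can be stated schematically, for a Hamiltonian H = -½∇² + V in units where ħ and the mass are 1 - a choice of normalization made here for brevity, not Feynman and Hibbs's - as

    (e^{-tH} ψ)(x) = E_x [ exp( -∫₀ᵗ V(B_s) ds ) ψ(B_t) ],

where B_s is Brownian motion starting at x and E_x is expectation with respect to Wiener measure. This makes rigorous the real-valued path integral for the density matrix e^{-βH}, with inverse temperature playing the role of imaginary time, just as in the analogy above.)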

@johncarlosbaez - I have been using some of this mathematics for machine learning.

Here are two papers:

https://doi.org/10.1007/s10472-024-09929-7

https://www.researchgate.net/publication/380461219_Differential_Similarity_in_Higher_Dimensional_Spaces_Theory_and_Applications_Version_40

In the second paper (a preprint, under review), you will see on pp. 6-7 that I discuss the connections to physics, and cite Chapter 10 of Feynman and Hibbs. I am also working on a third paper: "Manifold Logic and the Theory of Differential Similarity."

I wrote a thread on LinkedIn discussing these papers:

https://www.linkedin.com/feed/update/urn:li:activity:7197269535287058434/