L Thorne McCarty

71 Followers
111 Following
174 Posts

Professor of Computer Science and Law, Emeritus, Rutgers University.

I have been working on AI and Law for a long time. I am particularly interested in bridging the gap between machine learning and logical rules, which is an important issue today for legal technology. You can find most of my papers on ResearchGate: https://www.researchgate.net/profile/L-Thorne-Mccarty

@MissConstrue The California authorities should arrest these ICE agents and prosecute them under state law.
@ra6bit

I favor a strategy of massive tax resistance. File incomplete or erroneous tax returns, and then file petitions in Tax Court. The tax code is so complex and the people who administer it are so stupid that, if you challenge everything, you will succeed on something. Imagine what would happen if 100,000 taxpayers followed this strategy!
Boost this toot if you're planning on sticking around Mastodon & the Fediverse whether or not it's more popular than Bluesky.
@laurence Mastodon.

@johncarlosbaez

I have been using some of this mathematics for machine learning.

Here are two papers:

https://doi.org/10.1007/s10472-024-09929-7

https://www.researchgate.net/publication/380461219_Differential_Similarity_in_Higher_Dimensional_Spaces_Theory_and_Applications_Version_40

In the second paper (a preprint, under review), you will see on pp. 6-7 that I discuss the connections to physics, and cite Chapter 10 of Feynman and Hibbs. I am also working on a third paper: "Manifold Logic and the Theory of Differential Similarity."

I wrote a thread on LinkedIn discussing these papers:

https://www.linkedin.com/feed/update/urn:li:activity:7197269535287058434/

Before I kick the bucket, I want to figure out how these frameworks fit together:

classical mechanics
classical statistical mechanics
classical field theory
quantum mechanics
quantum statistical mechanics
quantum field theory
thermodynamics

and probably some more. For example, one famous weird thing is that if you take classical statistical mechanics and replace

1/(Boltzmann's constant × temperature)

with

i × time / Planck's constant

in all your equations, you get quantum mechanics - more or less. So if you ignore the constants, this is saying that "imaginary time" - whatever the hell that is! - acts like "inverse temperature".

Physicists use this fact a lot, but remain divided on whether it's "just a trick". I don't think something this big can be just a trick!
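The substitution can be checked directly on a toy example. Here is a minimal sketch with a two-level system; the energy values and the units ℏ = k = 1 are my own illustrative choices, not taken from the post:

```python
import cmath

# Toy two-level system.  The energies and the units hbar = k = 1 are
# illustrative choices, not taken from the post.
E = [0.0, 1.0]
hbar = 1.0

def Z(beta):
    """Classical partition function: Z(beta) = sum_n exp(-beta * E_n)."""
    return sum(cmath.exp(-beta * En) for En in E)

def trace_U(t):
    """Trace of the quantum time-evolution operator: sum_n exp(-i * E_n * t / hbar)."""
    return sum(cmath.exp(-1j * En * t / hbar) for En in E)

# Substituting 1/(kT) -> i * t / hbar in the partition function reproduces
# the trace of exp(-i H t / hbar), term by term:
t = 0.7
print(Z(1j * t / hbar))  # same complex value as trace_U(t)
print(trace_U(t))
```

The two printed values agree because each Boltzmann weight exp(−βE) turns into the quantum phase exp(−iEt/ℏ) under the substitution β → it/ℏ.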

But there are other ways to set up this analogy. I wrote a paper with Blake Pollard where instead we said inverse temperature is analogous to 1/(i × Planck's constant). We pushed this other analogy to the point of figuring out what in quantum mechanics corresponds to entropy in classical statistical mechanics. We called it "quantropy", and worked out this nice chart.

But now I'm wishing we hadn't set Boltzmann's constant equal to 1. And I want to compare our analogy to the usual one, and figure out what the hell is going on. When there are multiple mathematically rigorous analogies between frameworks, you should get serious and study them all, not just pick one and ignore the rest.

I'm also annoyed that we didn't notice that the thing analogous to free energy, which I called "free action" or Φ, is what physicists call the "effective action".

Here's our paper:

https://arxiv.org/abs/1311.0813

Quantropy

There is a well-known analogy between statistical and quantum mechanics. In statistical mechanics, Boltzmann realized that the probability for a system in thermal equilibrium to occupy a given state is proportional to exp(-E/kT) where E is the energy of that state. In quantum mechanics, Feynman realized that the amplitude for a system to undergo a given history is proportional to exp(-S/i hbar) where S is the action of that history. In statistical mechanics we can recover Boltzmann's formula by maximizing entropy subject to a constraint on the expected energy. This raises the question: what is the quantum mechanical analogue of entropy? We give a formula for this quantity, which we call "quantropy". We recover Feynman's formula from assuming that histories have complex amplitudes, that these amplitudes sum to one, and that the amplitudes give a stationary point of quantropy subject to a constraint on the expected action. Alternatively, we can assume the amplitudes sum to one and that they give a stationary point of a quantity we call "free action", which is analogous to free energy in statistical mechanics. We compute the quantropy, expected action and free action for a free particle, and draw some conclusions from the results.
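The entropy-maximization step mentioned in the abstract can be illustrated numerically. In this sketch (the three energy levels, the value of β, and the units k = 1 are my own illustrative choices, not from the paper), the Boltzmann distribution is perturbed along a direction that preserves both normalization and expected energy, and its entropy is verified to be a maximum:

```python
import math

# Toy spectrum: three energy levels.  Values are illustrative, not from the paper.
E = [0.0, 1.0, 2.0]
beta = 0.8  # inverse temperature 1/(kT), with k = 1

# Boltzmann distribution p_n = exp(-beta * E_n) / Z
weights = [math.exp(-beta * En) for En in E]
Z = sum(weights)
p = [w / Z for w in weights]

def entropy(q):
    """Shannon/Gibbs entropy -sum q_i ln q_i (with k = 1)."""
    return -sum(qi * math.log(qi) for qi in q)

mean_E = sum(pi * Ei for pi, Ei in zip(p, E))

# Perturb p along a direction v that preserves both constraints:
# sum(v) = 0 and sum(v * E) = 0.  With E = [0, 1, 2], v = (1, -2, 1) works.
v = [1.0, -2.0, 1.0]
for eps in [0.0, 0.01, 0.02]:
    q = [pi + eps * vi * 0.01 for pi, vi in zip(p, v)]
    print(eps, entropy(q))
# Entropy is largest at eps = 0: the Boltzmann distribution maximizes
# entropy subject to the expected-energy constraint.
```

Any constraint-preserving perturbation lowers the entropy (to second order), which is the stationarity property the abstract's quantropy construction mimics with complex amplitudes in place of probabilities.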

@johncarlosbaez

I have always been intrigued by Chapter 10 of Feynman and Hibbs, Quantum Mechanics and Path Integrals (1965), in which the authors analyze a representation of the statistical density matrix in quantum statistical mechanics by means of a real-valued path integral. This is basically the Feynman-Kac formula, which can be constructed rigorously using Wiener measure.
Potential challengers to Kamala Harris are uniting behind her

Less than a half-hour after posting his announcement that he would not continue his campaign for a second term, President Joe Biden put out a second statement, in which he fully endorsed Vice President Kamala Harris as the next Democratic nominee....

Daily Kos
Ah the photos of disappointed fascists holding empty wine glasses ready for the champagne are just 👌.