The #paperOfTheDay is "Exponents for the excluded volume problem as derived by the Wilson method" from 1972. This one-page letter makes one elementary but far-reaching observation: the O(N)-invariant scalar #quantumFieldTheory , which Wilson had treated a few months earlier in terms of dimensional regularization, has a special interpretation at N=0. Namely, one assigns to every #FeynmanGraph a "symmetry factor", which is a polynomial in N. The coefficient of N^k in this polynomial counts the ways to resolve the 4-valent vertices of the graph into pairs of strands such that exactly k closed cycles are formed. If one sets N=0, all that remains is the constant term: it counts the resolutions of the graph that form no cycle at all.
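For intuition, here is a minimal sketch of this cycle-counting polynomial (my own hypothetical half-edge encoding, not code from the paper): at each 4-valent vertex the four half-edges can be split into two strand pairs in 3 ways, and each global choice yields some number k of closed cycles, contributing N^k.

```python
from collections import Counter
from itertools import product

# Each 4-valent vertex owns 4 half-edges; `edges` pairs half-edges across
# the graph's edges. At each vertex there are 3 ways to split its 4
# half-edges into 2 strand pairs.
SPLITS = [((0, 1), (2, 3)), ((0, 2), (1, 3)), ((0, 3), (1, 2))]

def cycle_polynomial(vertices, edges):
    """Return {k: count}: the coefficient of N^k counts strand routings
    that produce exactly k closed cycles."""
    poly = Counter()
    for choice in product(SPLITS, repeat=len(vertices)):
        parent = {}  # union-find over half-edges

        def find(h):
            while parent.setdefault(h, h) != h:
                parent[h] = parent[parent[h]]  # path halving
                h = parent[h]
            return h

        def union(a, b):
            parent[find(a)] = find(b)

        halfedges = set()
        for a, b in edges:          # connections along graph edges
            union(a, b)
            halfedges |= {a, b}
        for v, split in zip(vertices, choice):  # strand pairs at vertices
            for i, j in split:
                union(v[i], v[j])
        cycles = len({find(h) for h in halfedges})
        poly[cycles] += 1
    return dict(poly)

# "Banana" graph: two 4-valent vertices joined by 4 parallel edges.
va, vb = ("a0", "a1", "a2", "a3"), ("b0", "b1", "b2", "b3")
edges = [("a0", "b0"), ("a1", "b1"), ("a2", "b2"), ("a3", "b3")]
print(cycle_polynomial([va, vb], edges))  # {2: 3, 1: 6}, i.e. 3N^2 + 6N
```

For a vacuum graph like this one, every routing closes into at least one cycle, so the constant term vanishes; the N=0 count becomes nontrivial for graphs with external legs, where the open strands running between legs do not count as cycles.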
One is interested in the statistical behaviour of non-self-intersecting paths on a lattice, called self-avoiding walks. These can be studied with the methods of statistical #physics . One introduces a Boltzmann-type weight exp(-n*p), where p is a parameter (analogous to the inverse temperature, or to the Planck constant in a path integral) and n is the length of a self-avoiding walk. Let N_n be the number of distinct such walks (for a fixed lattice size, or counted relative to the lattice size); then the sum of N_n*exp(-n*p) over n is analogous to a partition function, or path integral. Hence it can be analyzed perturbatively with Feynman integrals, namely those of the O(N) theory at N=0 mentioned above. This way one obtains, for example, the critical exponents.
https://www.sciencedirect.com/science/article/pii/0375960172901491
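The combinatorial side of this partition function is easy to make concrete. A minimal sketch (function names my own) that enumerates self-avoiding walks on the square lattice by brute force and forms the truncated sum of N_n*exp(-n*p):

```python
import math

def saw_counts(max_len):
    """counts[n] = number of n-step self-avoiding walks on Z^2
    starting from the origin (counts[0] = 1 for the empty walk)."""
    counts = [0] * (max_len + 1)

    def extend(tip, visited, n):
        counts[n] += 1
        if n == max_len:
            return
        x, y = tip
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt not in visited:       # self-avoidance constraint
                visited.add(nxt)
                extend(nxt, visited, n + 1)
                visited.remove(nxt)

    extend((0, 0), {(0, 0)}, 0)
    return counts

def partition_function(p, max_len):
    """Truncated analogue of the path integral: sum_n N_n * exp(-n*p)."""
    return sum(c * math.exp(-n * p) for n, c in enumerate(saw_counts(max_len)))

print(saw_counts(5)[1:])  # [4, 12, 36, 100, 284]
```

Such brute-force counts grow exponentially and are only feasible for short walks, which is exactly why the field-theoretic reformulation at N=0 is so valuable.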
My #paperOfTheDay is "Event generation with exponential scaling in multiplicity using AmpliCol" https://arxiv.org/abs/2601.19483 . Scattering experiments with elementary particles, e.g. at CERN, are very complicated even beyond the "core" physical process; for example, the detectors can only measure within certain angular ranges and above certain thresholds. To calibrate them, one needs to simulate the whole process numerically, which is done with Monte Carlo methods. Among other things, the behaviour of particles in quantum chromodynamics depends on their "color", a type of charge somewhat analogous to electric charge. Since that dependence is very complicated, the idea of the present paper is to use a simplified "leading" color dependence as a proxy for importance sampling in the Monte Carlo simulation. This is analogous, albeit for a totally different physical question, to the weighting of individual random #FeynmanGraph s that I did in my "Predicting Feynman Periods" paper a while ago. https://link.springer.com/article/10.1007/JHEP11(2024)038 #dailyPaperChallenge
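The two-step strategy can be illustrated schematically. In this toy sketch (1D "phase space" and made-up weight functions of my own, not actual QCD matrix elements), unweighted events are first generated from the cheap leading-colour proxy by rejection sampling, then each event carries the reweighting factor w_FC/w_LC, whose average recovers the full-colour result:

```python
import random

random.seed(1)

# Toy stand-ins for the matrix elements on phase space x in [0, 1]:
def w_lc(x):
    return 1.0 + x                 # cheap leading-colour proxy
def w_fc(x):
    return 1.0 + x + 0.2 * x * x   # "full-colour" weight

W_LC_MAX = 2.0  # upper bound on w_lc, needed for rejection sampling

def generate_lc_event():
    """Unweighted event drawn from the LC density via rejection sampling."""
    while True:
        x = random.random()
        if random.random() * W_LC_MAX < w_lc(x):
            return x

# Step 2: reweight each unweighted LC event by w_fc / w_lc.
events = [generate_lc_event() for _ in range(200_000)]
fc_over_lc = sum(w_fc(x) / w_lc(x) for x in events) / len(events)

# Cross-check: the exact ratio of the integrals,
# (1 + 1/2 + 0.2/3) / (1 + 1/2)
exact = (1 + 0.5 + 0.2 / 3) / 1.5
print(fc_over_lc, exact)
```

The point of the construction is that the reweighting factor is close to 1, so almost no statistical efficiency is lost while the expensive full-colour weight is evaluated only once per accepted event.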
Event generation with exponential scaling in multiplicity using AmpliCol

Efficient generation of LHC events is hindered by the rapidly rising cost of evaluating QCD matrix elements with increasing multiplicity. We build on a recently proposed two-step strategy in which unweighted events are first generated using the leading-colour (LC) approximation and then reweighted to full-colour (FC) accuracy, utilising the LC integration efficiency while recovering the exact FC prediction. In this work we extend the method to general Standard Model processes and present AmpliCol, a standalone implementation designed for LHC collisions. We benchmark multi-jet, $t\bar{t}$+jets, $ZZ$+jets, and Drell-Yan+jets production, measuring the time required to obtain a fixed number of unweighted events at FC accuracy. Across all processes, the runtime exhibits a stable exponential scaling with multiplicity, far milder than the factorial growth of conventional matrix-element generators. This demonstrates that the AmpliCol code enables efficient event generation at multiplicities that are otherwise computationally prohibitive.

arXiv.org

New theoretical #physics preprint https://arxiv.org/abs/2412.08617
We looked at the asymptotic growth rate of the beta function in #quantumFieldTheory , and the relative importance of subdivergence-free #Feynmangraph s. These graphs correspond to integrals, and the size of a graph is measured by its loop number, which also indicates how hard the integral is to solve. State-of-the-art computations in realistic theories reach anywhere between 1 and 6 loops. The asymptotics of the perturbation series is known from instanton calculations. We have now shown (in a model theory) that the leading asymptotics describes the true growth rate only above roughly 25 loops, way beyond anything that can realistically be computed.

This is good news: it tells us that asymptotic instanton calculations provide non-trivial additional information that cannot be trivially inferred from low-order perturbation theory.
In the plot, the red dots are numerical data points for the subdivergence-free graphs in phi^4 theory up to 18 loops, and the green lines are the leading instanton asymptotics.
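The late onset of the asymptotic regime can be seen in a much simpler toy example (the basic zero-dimensional integral, not the O(N) model of our paper): Z(g) = (2π)^(-1/2) ∫ exp(-x²/2 - g x⁴) dx has perturbative coefficients with |c_n| = (4n-1)!!/n!, and Stirling's formula gives the leading growth |c_n| ~ 16^n (n-1)!/(√2 π). The ratio of exact coefficient to leading asymptotics creeps toward 1 only slowly:

```python
from math import pi, sqrt
from fractions import Fraction

def coeff(n):
    """Exact |c_n| = (4n-1)!!/n! for Z(g) = (2pi)^(-1/2) * Int exp(-x^2/2 - g*x^4) dx."""
    num = 1
    for k in range(1, 4 * n, 2):   # double factorial (4n-1)!!
        num *= k
    den = 1
    for k in range(2, n + 1):      # n!
        den *= k
    return Fraction(num, den)

def leading_asymptotics(n):
    """Leading large-n growth 16^n * (n-1)! / (sqrt(2)*pi), from Stirling."""
    fact = 1
    for k in range(2, n):          # (n-1)!
        fact *= k
    return 16**n * fact / (sqrt(2) * pi)

for n in (2, 5, 10, 25, 50):
    print(n, float(coeff(n)) / leading_asymptotics(n))
```

Even in this trivial model the ratio is still a few percent away from 1 at 25 loops; with subleading corrections included, the onset of the asymptotic regime shifts around, which is the effect our paper quantifies for the primitive graphs.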

Primitive asymptotics in $ϕ^4$ vector theory

A longstanding conjecture in $ϕ^4_4$ theory is that primitive graphs dominate the beta function asymptotically at large loop order in the minimal-subtraction scheme. Here we investigate this issue by exploiting additional combinatorial structure coming from an extension to vectors with $O(N)$ symmetry. For the 0-dimensional case, we calculate the $N$-dependent generating function of primitive graphs and its asymptotics, including arbitrarily many subleading corrections. We find that the leading asymptotic growth rate becomes visible only above $\approx 25$ loops, while data at lower order is suggestive of a wrong asymptotics. Our results also yield the exact asymptotics of Martin invariants. In 4D, each graph comes with a nontrivial Feynman integral, its period. We give bounds on the degree in $N$ for primitive and non-primitive graphs, and construct the primitive graphs of highest degree explicitly. We calculate the 4D primitive beta function numerically up to 17 loops. The qualitative behaviour is similar to the 0D series, with a small but systematic tendency for the 4D data to grow faster with $N$, indicating a correlation between periods and $O(N)$-symmetry factors. The zeros of the 4D primitive beta function approach their asymptotic locations quickly, but, like in 0D, the growth rate of the 4D primitive beta function does not match its asymptotics even at 17 loops. Our results improve on the knowledge of asymptotics in QFT by providing concrete analytic and numerical values, and putting individual observables into a broader context of $ϕ^4_4$ theory in 0D and 4D. We demonstrate that even if certain quantities are in agreement with the asymptotics already below 10 loops, this must not be mistaken as evidence that overall an asymptotic regime has been reached.

My article together with Kimia Shaban has appeared in JHEP today. We have examined the #statistics of #Feynmangraph s in #QFT , and how they can be exploited to efficiently compute #amplitudes at high loop order.
The article is open access, and the dataset is freely available from my website if you want to explore statistics and correlations yourself. Predicting the values of these Feynman integrals could also be interesting as a test case for #machinelearning
https://link.springer.com/article/10.1007/JHEP11(2024)038
Predicting Feynman periods in ϕ4-theory - Journal of High Energy Physics

We present efficient data-driven approaches to predict the value of subdivergence-free Feynman integrals (Feynman periods) in ϕ4-theory from properties of the underlying Feynman graphs, based on a statistical examination of almost 2 million graphs. We find that the numbers of cuts and cycles determines the period to better than 2% relative accuracy. Hepp bound and Martin invariant allow for even more accurate predictions. In most cases, the period is a multi-linear function of the properties in question. Furthermore, we investigate the usefulness of machine-learning algorithms to predict the period. When sufficiently many properties of the graph are used, the period can be predicted with better than 0.05% relative accuracy. We use one of the constructed prediction models for weighted Monte-Carlo sampling of Feynman graphs, and compute the primitive contribution to the beta function of ϕ4-theory at L ∈ {13, … , 17} loops. Our results confirm the previously known numerical estimates of the primitive beta function and improve their accuracy. Compared to uniform random sampling of graphs, our new algorithm is 1000-times faster to reach a desired accuracy, or reaches 32-fold higher accuracy in fixed runtime. The dataset of all periods computed for this work, combined with a previous dataset, is made publicly available. Besides the physical application, it could serve as a benchmark for graph-based machine learning algorithms.

SpringerLink
I recently discovered an excellent #math article by Krajewski and Martinetti that I had overlooked so far: https://arxiv.org/abs/0806.4309
Basically, renormalization of #Feynmangraph s in #qft is organized in terms of rooted trees, and so are solutions to differential equations, concatenation of differential operators, numerical integration schemes, and the Hopf algebra of power series. It is intuitively clear that all these things must be closely related, but I wasn't aware of this article, where the relations are actually spelled out in detail. They also include a derivation of Wigner's semicircle law for Gaussian random matrices in the rooted-tree formalism, something I hadn't thought about at all. Learned another unexpected connection 😀 .
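The rooted trees underlying all these structures can at least be counted easily. A small sketch (just the counting, not the Hopf-algebra structure) using the classical recurrence for the number of unlabeled rooted trees, OEIS A000081:

```python
def rooted_tree_counts(nmax):
    """a(n) = number of unlabeled rooted trees with n nodes (OEIS A000081),
    via the classical recurrence
    (n-1)*a(n) = sum_{k=1}^{n-1} ( sum_{d|k} d*a(d) ) * a(n-k)."""
    a = [0] * (nmax + 1)
    a[1] = 1
    for n in range(2, nmax + 1):
        total = 0
        for k in range(1, n):
            s = sum(d * a[d] for d in range(1, k + 1) if k % d == 0)
            total += s * a[n - k]
        a[n] = total // (n - 1)   # division is always exact
    return a[1:]

print(rooted_tree_counts(8))  # [1, 1, 2, 4, 9, 20, 48, 115]
```

These counts grow only exponentially (~2.956^n), much slower than the factorial growth of Feynman graphs, which is part of why tree-indexed expansions like B-series are so manageable.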
Wilsonian renormalization, differential equations and Hopf algebras

In this paper, we present an algebraic formalism inspired by Butcher's B-series in numerical analysis and the Connes-Kreimer approach to perturbative renormalization. We first define power series of non linear operators and propose several applications, among which the perturbative solution of a fixed point equation using the non linear geometric series. Then, following Polchinski, we show how perturbative renormalization works for a non linear perturbation of a linear differential equation that governs the flow of effective actions. Then, we define a general Hopf algebra of Feynman diagrams adapted to iterations of background field effective action computations. As a simple combinatorial illustration, we show how these techniques can be used to recover the universality of the Tutte polynomial and its relation to the $q$-state Potts model. As a more sophisticated example, we use ordered diagrams with decorations and external structures to solve the Polchinski's exact renormalization group equation. Finally, we work out an analogous construction for the Schwinger-Dyson equations, which yields a bijection between planar $ϕ^{3}$ diagrams and a certain class of decorated rooted trees.

Here is a curious finding from our statistical analysis https://arxiv.org/abs/2403.16217 :
A #Feynmangraph is a graphical shorthand notation for a complicated integral that computes the probability of scattering processes in #quantum field theory.
An electrical circuit can also be described as a graph. What happens if we interpret the Feynman graph as an #electrical network, where each edge is a 1 Ohm resistor? We can then compute the resistance between any pair of vertices and collect all these values in a "resistance matrix", as shown below. The sum of all these resistances over pairs of vertices (up to normalization, their average) is called the Kirchhoff index. Now it turns out that this average resistance is correlated fairly strongly with the Feynman integral of that graph: a graph with a large contribution to quantum scattering amplitudes on average also has a large electrical resistance. Isn't that a nice connection between two seemingly distinct branches of theoretical #physics ?
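Computing such a resistance matrix is a few lines of numpy, via the standard formula R_ij = Γ_ii + Γ_jj − 2Γ_ij with Γ the Moore-Penrose pseudoinverse of the graph Laplacian (a generic sketch, not the code of our paper):

```python
import numpy as np

def resistance_matrix(adj):
    """Effective resistance between all vertex pairs, each edge = 1 Ohm,
    from the pseudoinverse of the graph Laplacian."""
    A = np.asarray(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A        # Laplacian: degrees minus adjacency
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

# 4-cycle: adjacent vertices see 1 Ohm in parallel with 3 Ohm = 0.75 Ohm,
# opposite vertices see two 2 Ohm paths in parallel = 1 Ohm.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
R = resistance_matrix(C4)
print(R[0, 1], R[0, 2])        # 0.75, 1.0

kirchhoff_index = R.sum() / 2  # sum over unordered vertex pairs
print(kirchhoff_index)         # 5.0 for the 4-cycle
```

The pseudoinverse is needed because the Laplacian is singular (constant vectors are in its kernel); physically this just reflects that only potential differences matter.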
Predicting Feynman periods in $ϕ^4$-theory

We present efficient data-driven approaches to predict Feynman periods in $ϕ^4$-theory from properties of the underlying Feynman graphs. We find that the numbers of cuts and cycles determines the period to approximately 2% accuracy. Hepp bound and Martin invariant allow to predict the period with accuracy much better than 1%. In most cases, the period is a multi-linear function of the parameters in question. Besides classical correlation analysis, we also investigate the usefulness of machine-learning algorithms to predict the period. When sufficiently many properties of the graph are used, the period can be predicted with better than 0.05% relative accuracy. We use one of the constructed prediction models for weighted Monte-Carlo sampling of Feynman graphs, and compute the primitive contribution to the beta function of $ϕ^4$-theory at $L\in \left \lbrace 13, 14, 15, 16 \right \rbrace $ loops. Our results confirm the previously known numerical estimates of the primitive beta function and improve their accuracy. Compared to uniform random sampling of graphs, our new algorithm reaches 35-fold higher accuracy in fixed runtime, or requires 1000-fold less runtime to reach a given accuracy. The data set of all periods computed for this work, combined with a previous data set, is made publicly available. Besides the physical application, it could serve as a benchmark for graph-based machine learning algorithms.

The #Feynmangraph s that contribute to scattering amplitudes in #quantum field theory come in all shapes and sizes. Can one guess how much their Feynman integral will contribute just from looking at them 🤔? It turns out one can! Let's consider subdivergence-free graphs at 12 loops. The pictures show the two largest and the two smallest contributors. All graphs have the same number of edges and vertices. The graphs that contribute strongly look "larger", and the small contributors look "more dense", but drawings are of course arbitrary. A closer examination shows that "symmetry", as measured by graph automorphisms, is not clearly related to the value of the Feynman integral.