We propose a new family of probability densities with closed-form normalising constants. Our densities are parameterised by two-layer neural networks and strictly generalise exponential families. We show that the squared 2-norm of the network output can be integrated in closed form, yielding the normalising constant. We call these densities Squared Neural Families (#SNEFY); they are closed under conditioning.
Accepted at #NeurIPS2023. #MachineLearning #Bayesian #GaussianProcess
Squared Neural Families: A New Class of Tractable Density Models
Flexible models for probability distributions are an essential ingredient in many machine learning tasks. We develop and investigate a new class of probability distributions, which we call a Squared Neural Family (SNEFY), formed by squaring the 2-norm of a neural network's output and normalising it with respect to a base measure. Following reasoning similar to the well-established connections between infinitely wide neural networks and Gaussian processes, we show that SNEFYs admit closed-form normalising constants in many cases of interest, thereby resulting in flexible yet fully tractable density models. SNEFYs strictly generalise classical exponential families, are closed under conditioning, and have tractable marginal distributions. Their utility is illustrated on a variety of density estimation, conditional density estimation, and density estimation with missing data tasks.
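To make the construction concrete, here is a minimal numerical sketch of the idea: take a two-layer network f, define an unnormalised density as the squared 2-norm of f(x) weighted by a Gaussian base measure, and normalise. All names, dimensions, and the cosine activation below are illustrative assumptions; the paper's contribution is that, for suitable activations and base measures, the normalising constant computed numerically here is available in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer network f: R -> R^K with cosine activation.
# (The closed-form constants in the paper depend on the choice of
# activation and base measure; this sketch only illustrates the shape
# of the construction.)
D, H, K = 1, 16, 4                       # input, hidden, output dims
W1 = rng.normal(size=(H, D))
b1 = rng.normal(size=H)
W2 = rng.normal(size=(K, H)) / np.sqrt(H)

def f(x):
    # x: (N, D) -> (N, K)
    return np.cos(x @ W1.T + b1) @ W2.T

def unnormalised(x):
    # Squared 2-norm of the network output, times a standard Gaussian
    # base measure, so the product is integrable and non-negative.
    base = np.exp(-0.5 * x[:, 0] ** 2) / np.sqrt(2 * np.pi)
    return (f(x) ** 2).sum(axis=1) * base

# Normalise numerically on a grid. The paper derives this constant in
# closed form; a Riemann sum stands in for it here.
grid = np.linspace(-8.0, 8.0, 4001).reshape(-1, 1)
dx = grid[1, 0] - grid[0, 0]
Z = unnormalised(grid).sum() * dx
density = unnormalised(grid) / Z   # integrates to ~1 over the grid
```

Because the integrand is a squared norm against a base measure, the density is non-negative by construction; normalisation is the only remaining step, which is exactly where the closed-form results apply.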