New paper "Flexible tails for normalizing flows" https://arxiv.org/abs/2406.16971 with Tennessee Hickling, exploring links between generative models and extreme value theory. Come and ask me about it at the #ISBA2024 Wednesday poster session (or any other time during the conference!)
Flexible Tails for Normalizing Flows

Normalizing flows are a flexible class of probability distributions, expressed as transformations of a simple base distribution. A limitation of standard normalizing flows is representing distributions with heavy tails, which arise in applications to both density estimation and variational inference. A popular current solution to this problem is to use a heavy-tailed base distribution. Examples include the tail adaptive flow (TAF) methods of Laszkiewicz et al. (2022). We argue this can lead to poor performance due to the difficulty of optimising neural networks, such as normalizing flows, under heavy-tailed input. This problem is demonstrated in our paper. We propose an alternative: use a Gaussian base distribution and a final transformation layer which can produce heavy tails. We call this approach tail transform flow (TTF). Experimental results show this approach outperforms current methods, especially when the target distribution has large dimension or tail weight.
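(Not the paper's actual TTF layer, just a rough sketch of the "Gaussian base plus heavy-tail-producing final transformation" idea: below, a Gaussian sample is pushed through the Gaussian CDF and then a Student-t quantile function, with the degrees-of-freedom parameter nu controlling tail weight. The specific transform and parameter names are my own illustrative choices, not taken from the paper.)

```python
import numpy as np
from scipy.stats import norm, t

def heavy_tail_layer(z, nu=2.0):
    """Map standard-normal samples z to heavy-tailed (Student-t) samples.

    Smaller nu gives heavier tails; the transform is monotone and invertible,
    as a final flow layer needs to be.
    """
    u = norm.cdf(z)          # map base samples to (0, 1)
    return t.ppf(u, df=nu)   # heavy-tailed output via Student-t quantiles

rng = np.random.default_rng(0)
z = rng.standard_normal(10_000)   # light-tailed Gaussian base samples
x = heavy_tail_layer(z, nu=1.5)   # transformed samples with heavy tails
print(np.abs(z).max(), np.abs(x).max())  # extremes of x far exceed those of z
```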

Contributed talk submissions for the #ISBA2024 conference are open here 👉 https://bayesian.org/2024-world-meeting-2/