'Entropic Gromov-Wasserstein Distances: Stability and Algorithms', by Gabriel Rioux, Ziv Goldfeld, Kengo Kato.

http://jmlr.org/papers/v25/24-0039.html

#regularization #wasserstein #variational


Bayesian Meta-Learning Is All You Need

— Why is the deterministic view of meta-learning not sufficient?

— What is variational inference?

— How can we design neural-based Bayesian meta-learning algorithms?

https://jameskle.com/writes/bayesian-meta-learning-is-all-you-need

#machinelearning #bayesian #metalearning #variational


This blog post is my attempt to demystify the probabilistic view of meta-learning and answer these key questions: Why is the deterministic view of meta-learning not sufficient? What is variational inference? How can we design neural-based Bayesian meta-learning algorithms?

James Le

'Structured Optimal Variational Inference for Dynamic Latent Space Models', by Peng Zhao, Anirban Bhattacharya, Debdeep Pati, Bani K. Mallick.

http://jmlr.org/papers/v25/22-0514.html

#variational #models #priors


'A Framework for Improving the Reliability of Black-box Variational Inference', by Manushi Welandawe, Michael Riis Andersen, Aki Vehtari, Jonathan H. Huggins.

http://jmlr.org/papers/v25/22-0327.html

#variational #adaptively #optimization


`Using the framework of utility-calibrated #variational inference, we unify Gaussian process approximation & data acquisition into a joint #optimization problem, thereby ensuring optimal decisions under a limited computational budget. Our approach can be used with any decision-theoretic acquisition function and is compatible with trust region methods like TuRBO... Our approach outperforms standard SVGPs on high-dimensional benchmark tasks in control and molecular design`

https://arxiv.org/abs/2406.04308

Approximation-Aware Bayesian Optimization

High-dimensional Bayesian optimization (BO) tasks such as molecular design often require 10,000 function evaluations before obtaining meaningful results. While methods like sparse variational Gaussian processes (SVGPs) reduce computational requirements in these settings, the underlying approximations result in suboptimal data acquisitions that slow the progress of optimization. In this paper we modify SVGPs to better align with the goals of BO: targeting informed data acquisition rather than global posterior fidelity. Using the framework of utility-calibrated variational inference, we unify GP approximation and data acquisition into a joint optimization problem, thereby ensuring optimal decisions under a limited computational budget. Our approach can be used with any decision-theoretic acquisition function and is compatible with trust region methods like TuRBO. We derive efficient joint objectives for the expected improvement and knowledge gradient acquisition functions in both the standard and batch BO settings. Our approach outperforms standard SVGPs on high-dimensional benchmark tasks in control and molecular design.
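The core idea in the abstract — training the variational surrogate to support good acquisitions rather than only posterior fidelity — can be illustrated with a toy sketch. This is a minimal, hypothetical illustration (names like `joint_objective` and the 1D Bayesian linear-regression surrogate are my own simplifications, not the paper's SVGP machinery): an ELBO-like fit term is combined with a decision-theoretic utility, here closed-form expected improvement, into a single objective over the variational parameters.

```python
import numpy as np
from math import erf, sqrt, pi, exp, log

def gaussian_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def gaussian_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(mu, sigma, best):
    # Closed-form EI for a Gaussian predictive N(mu, sigma^2),
    # minimization convention: improvement over the current best value.
    if sigma <= 0.0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    return (best - mu) * gaussian_cdf(z) + sigma * gaussian_pdf(z)

def elbo_1d(m, s, x, y, noise=0.1, prior_var=1.0):
    # ELBO for a toy model y = w*x + eps with variational q(w) = N(m, s^2)
    # and prior w ~ N(0, prior_var): expected log-likelihood minus KL(q || prior).
    expected_loglik = sum(
        -0.5 * log(2 * pi * noise**2)
        - ((yi - m * xi) ** 2 + (s * xi) ** 2) / (2 * noise**2)
        for xi, yi in zip(x, y)
    )
    kl = 0.5 * ((s**2 + m**2) / prior_var - 1.0 - log(s**2 / prior_var))
    return expected_loglik - kl

def joint_objective(m, s, x, y, x_cand, best, calib=1.0):
    # Utility-calibrated idea in miniature: one objective couples the
    # surrogate's fit (ELBO) with the acquisition value at a candidate,
    # so the approximate posterior is shaped toward good decisions.
    mu_cand = m * x_cand
    sigma_cand = abs(s * x_cand)
    return elbo_1d(m, s, x, y) + calib * expected_improvement(mu_cand, sigma_cand, best)

x = [0.5, 1.0, 1.5]
y = [0.6, 1.1, 1.4]
obj = joint_objective(m=1.0, s=0.3, x=x, y=y, x_cand=2.0, best=0.6)
```

In the paper's actual setting the fit term is an SVGP objective and the utility is derived from EI or knowledge gradient over batches; the sketch only shows the structural move of optimizing fit and acquisition jointly under one budget.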


'A Variational Approach to Bayesian Phylogenetic Inference', by Cheng Zhang, Frederick A. Matsen IV.

http://jmlr.org/papers/v25/22-0348.html

#phylogenetic #bayesian #variational


'Low-rank Variational Bayes correction to the Laplace method', by Janet van Niekerk, Haavard Rue.

http://jmlr.org/papers/v25/21-1405.html

#variational #hyperparameters #approximations


'Additive smoothing error in backward variational inference for general state-space models', by Mathis Chagneux, Elisabeth Gassiat, Pierre Gloaguen, Sylvain Le Corff.

http://jmlr.org/papers/v25/22-1392.html

#variational #smoothing #estimation


'Black Box Variational Inference with a Deterministic Objective: Faster, More Accurate, and Even More Black Box', by Ryan Giordano, Martin Ingram, Tamara Broderick.

http://jmlr.org/papers/v25/23-1015.html

#variational #optimizer #optimizing


2022.10 Variational autoencoders and Diffusion Models - Tim Salimans

YouTube