1/ PyVBMC 1.0 is out! 🎉

https://github.com/acerbilab/pyvbmc

A new Python package for efficient Bayesian inference.

Get a posterior distribution over model parameters + the model evidence with a small number of likelihood evaluations.

No AGI was created in the process!

2/ PyVBMC works similarly to Bayesian optimization, but with the goal of *inference* (getting the full posterior distribution) instead of *optimization* (getting a single point estimate).

Probabilistic numerics FTW.

3/ Using PyVBMC is super simple.

Define your model (prior, likelihood, a few options), give them to the VBMC optimization process, and go!
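A minimal sketch of that workflow (the toy target below is invented for illustration; the commented-out VBMC calls follow the constructor and method names in the PyVBMC docs, so double-check them there):

```python
def log_joint(theta):
    # Toy unnormalized log posterior (log prior + log likelihood folded
    # together): a 2D standard Gaussian, for illustration only.
    return -0.5 * sum(t * t for t in theta)

# Hand the target to VBMC and run (names assumed from the PyVBMC docs):
# from pyvbmc import VBMC
# vbmc = VBMC(
#     log_joint,
#     x0=[0.5, 0.5],                    # starting point
#     lower_bounds=[-10, -10],          # hard bounds
#     upper_bounds=[10, 10],
#     plausible_lower_bounds=[-2, -2],  # where the mass plausibly is
#     plausible_upper_bounds=[2, 2],
# )
# vp, results = vbmc.optimize()         # vp: variational posterior
# samples, _ = vp.sample(10000)         # draw posterior samples
```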

4/ PyVBMC is particularly effective when evaluating your model's likelihood:

- is at least somewhat expensive (e.g., ~1 second per evaluation),

OR

- is stochastic (e.g., estimated by Monte Carlo sampling, aka simulation).

(or both!)
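For the stochastic case, here is a hedged sketch of what a simulation-based log-likelihood estimator looks like (the target and noise model are invented for illustration; PyVBMC's noisy-target mode expects the estimate to come with a noise estimate, hence the pair returned below):

```python
import math
import random

def noisy_log_likelihood(theta, n_sims=100, seed=0):
    """Monte Carlo estimate of a log-likelihood, plus its standard error.

    A real model would run n_sims simulations; here we fake simulation
    noise around a known Gaussian log-density (illustration only).
    """
    rng = random.Random(seed)
    exact = -0.5 * sum(t * t for t in theta)
    draws = [exact + rng.gauss(0.0, 1.0) for _ in range(n_sims)]
    estimate = sum(draws) / n_sims
    # Standard error of the mean: the per-evaluation noise estimate that
    # a noisy-target setup would pass along with the log-density value.
    var = sum((d - estimate) ** 2 for d in draws) / (n_sims - 1)
    std_err = math.sqrt(var / n_sims)
    return estimate, std_err
```

With ~100 simulations the standard error is roughly a tenth of the per-draw noise: exactly the kind of noise level VBMC is designed to average over.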

5/ PyVBMC uses Gaussian processes and a variational approximation to efficiently compute a flexible (non-Gaussian) approximate posterior distribution of the model parameters, from which statistics and posterior samples can be easily extracted.
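In symbols (following the VBMC papers): the approximate posterior is a mixture of Gaussians, and the objective is the ELBO, whose intractable expectation is estimated by Bayesian quadrature on the GP surrogate:

```latex
% Variational posterior: mixture of K Gaussians with a shared diagonal shape
q(\theta) = \sum_{k=1}^{K} w_k \,\mathcal{N}\!\left(\theta;\ \mu_k,\ \sigma_k^2 \Lambda\right)

% ELBO: a lower bound on the log model evidence log p(D)
\mathrm{ELBO}(q) = \mathbb{E}_{q}\!\left[\log p(\mathcal{D}, \theta)\right]
                 + \mathcal{H}[q] \;\le\; \log p(\mathcal{D})
```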

6/ Like all methods, there are limitations!

PyVBMC:
- Works up to ~10 continuous model parameters
- Works with reasonably smooth target posteriors (no weird posteriors)
- Can deal with mild multi-modality, but (for now) may struggle with widely separated modes

7/ PyVBMC is available on both PyPI (pip install pyvbmc) and conda-forge, and provides a user-friendly Pythonic interface.

GitHub repo: https://github.com/acerbilab/pyvbmc

Detailed notebook tutorials and documentation make it accessible to new and experienced users:
https://acerbilab.github.io/pyvbmc/examples.html

8/ References:

Our original VBMC papers, detailing the algorithm:
- NeurIPS 2018: https://arxiv.org/abs/1810.05558
- NeurIPS 2020: https://arxiv.org/abs/2006.08655

And we have a tl;dr software preprint: https://arxiv.org/abs/2303.09519

9/ This package is a joint effort of many fantastic machine learning developers who have worked in my group over time: Bobby Huggins, Chengkun Li, Marlon Tobaben, Mikko Aarnos.

Thanks to @FCAI & UnivHelsinkiCS for funding and supporting the project!

10/ We are very interested in what kinds of problems you might apply PyVBMC to, and in connecting with the broader open-source and probabilistic programming crowd (e.g., @pymc, @mcmc_stan, @ArviZ).

Please get in touch if you have any questions or feedback for us!

11/ PS: We have other tools coming out soon, follow me or our spaces for more info: https://github.com/acerbilab