@charleemos I have found both the #PyMC tutorials (https://www.pymc.io/projects/docs/en/latest/guides/Gaussian_Processes.html) and the #Stan User's Guide (https://mc-stan.org/docs/stan-users-guide/gaussian-processes.html) on #GaussianProcesses good for getting your hands dirty. Seeing GPs in action and fiddling with hyperparameters helped me understand the mathematical underpinnings.
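Fiddling with hyperparameters doesn't even need a PPL to get started; a minimal NumPy sketch (my own toy, not from either tutorial) that draws GP prior samples under two lengthscales:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of points."""
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)

for ls in (0.5, 3.0):
    # Small jitter on the diagonal keeps the Cholesky/sampling numerically stable
    K = rbf_kernel(x, x, lengthscale=ls) + 1e-8 * np.eye(len(x))
    samples = rng.multivariate_normal(np.zeros(len(x)), K, size=3)
    # Shorter lengthscales give wigglier draws; longer ones give smoother draws
    print(f"lengthscale={ls}: drew {samples.shape[0]} prior functions")
```

Plotting `samples.T` against `x` for each lengthscale makes the hyperparameter's effect obvious at a glance.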

Cordial congratulations to Michel Talagrand on winning this year's Abel Prize, well deserved! His work on bounding #StochasticProcesses is also of great value for #cosmology! Imagine if we lacked bounds on #GaussianProcesses!

https://youtu.be/wDIqCN7E7VA?si=Jb68QcR8fyqx54Zb

#AbelPrize #AbelPrize2024 #MichelTalagrand

Michel Talagrand's reaction to winning the 2024 Abel Prize


Via Alexander Terenin: stochastic gradient descent can be used as an efficient approximate sampling algorithm for Gaussian processes. Looks super cool: https://arxiv.org/abs/2306.11589

#GaussianProcesses #Bayesian

Sampling from Gaussian Process Posteriors using Stochastic Gradient Descent

Gaussian processes are a powerful framework for quantifying uncertainty and for sequential decision-making but are limited by the requirement of solving linear systems. In general, this has a cubic cost in dataset size and is sensitive to conditioning. We explore stochastic gradient algorithms as a computationally efficient method of approximately solving these linear systems: we develop low-variance optimization objectives for sampling from the posterior and extend these to inducing points. Counterintuitively, stochastic gradient descent often produces accurate predictions, even in cases where it does not converge quickly to the optimum. We explain this through a spectral characterization of the implicit bias from non-convergence. We show that stochastic gradient descent produces predictive distributions close to the true posterior both in regions with sufficient data coverage, and in regions sufficiently far away from the data. Experimentally, stochastic gradient descent achieves state-of-the-art performance on sufficiently large-scale or ill-conditioned regression tasks. Its uncertainty estimates match the performance of significantly more expensive baselines on a large-scale Bayesian optimization task.

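The core trick, as I read the abstract: the cubic-cost linear solve in GP regression can be replaced by iterative gradient steps on a quadratic objective. A toy NumPy sketch of that idea (plain deterministic gradient descent, not the paper's low-variance stochastic objectives):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data and an RBF kernel matrix
x = np.linspace(0, 5, 50)
y = np.sin(x) + 0.5 * rng.standard_normal(len(x))
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
noise_var = 0.25
A = K + noise_var * np.eye(len(x))  # system matrix of GP regression

# The posterior mean needs v = A^{-1} y. Instead of a direct O(n^3) solve,
# minimize the quadratic f(v) = 0.5 v^T A v - v^T y by gradient descent.
v = np.zeros(len(x))
step = 1.0 / np.linalg.norm(A, 2)  # conservative step size, 1/lambda_max
for _ in range(5000):
    v -= step * (A @ v - y)  # gradient of the quadratic is A v - y

v_exact = np.linalg.solve(A, y)
print("max |v_gd - v_exact| =", np.abs(v - v_exact).max())
```

On this small, well-conditioned toy the iterates converge to the exact solve; the paper's interesting claims are about what happens at scale and under bad conditioning, which this sketch doesn't capture.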

Funded PhD positions (4 years, £20,410 tax-free stipend) available with me at Imperial College London

#PathIntegrals, #DFT, #CompChem, #MachineLearning, #GaussianProcesses, #JuliaLang, #MonteCarlo, excitement, adventure and really wild things!

Next interview round closes Friday 14th July 2023.

More details and exemplar projects in this Google doc: https://docs.google.com/document/d/1wiG-T8uqgq_-h-Btu1tecmdrZbK-zS006eB6BKGqFgI/edit?usp=sharing


2023 Frost group PhD adverts Overview / TL;DR Introduction Research Group Motivation / Philosophy Eligibility Funding Notes How to apply Project 1: Simulating charge transfer and recombination at an upconverting organic semiconductor interface Project 2: Modelling mixed electron ion conduction ...


@tylerjburch Yes, I hear you 😓

You've likely already fixed your installation and I'm not sure whether you're using #jupyter, but I found this guide really helpful:

https://pkseeg.com/post/jupyter-venv/

Now, I always warn people never to mess with the base installation of #python on a machine, and to use (virtual) environments instead.

Good luck with the #GaussianProcesses, I'm going through the #pymc tutorials for it right now 😎

How to use Jupyter Lab and Python Virtual Environments (Ubuntu/Debian) | pkseeg

How to use Jupyter Lab and Python Virtual Environments simultaneously in an Ubuntu/Debian system for better python package management, better data science project management, and general project reproducibility.

I'm eyeballs deep into understanding #GaussianProcesses (GPs). There are great resources out there, but I can thoroughly recommend this introductory paper on #Distill by Görtler et al. The interactive plots are a great aid to intuition: https://doi.org/10.23915/distill.00017
A Visual Exploration of Gaussian Processes

How to turn a collection of small building blocks into a versatile tool for solving regression problems.

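The building blocks the article assembles are the standard GP regression equations; a minimal NumPy sketch of the posterior mean and variance with an RBF kernel (my own toy, not code from the Distill article):

```python
import numpy as np

def gp_posterior(x_train, y_train, x_test, lengthscale=1.0, noise=0.1):
    """GP regression posterior mean and pointwise variance, RBF kernel."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

    K = k(x_train, x_train) + noise**2 * np.eye(len(x_train))
    K_s = k(x_train, x_test)   # train-test cross-covariance
    K_ss = k(x_test, x_test)   # test-test covariance

    mean = K_s.T @ np.linalg.solve(K, y_train)
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.diag(cov)

x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(x_train)
x_test = np.linspace(0, 3, 7)
mean, var = gp_posterior(x_train, y_train, x_test)
# Variance collapses near training points and grows between them
```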
Let's say I have samples of a bounded (but potentially noisy) function at some fixed interval of points. How would I go about determining the likelihood that my observed data was sampled from an *increasing* function with heteroscedastic noise? #machinelearning #statistics #GaussianProcesses

I have, once again, made the strange choice to write a blog post. This one is about Gaussian processes and, in particular, about what the Markov property looks like when you don't have a linear notion of time to help you define a past and a present.

Like all my GP posts, this one is wildly technical but aims to be somewhat useful. The information here is hard to find unless you want to read a 400-page book translated from Russian.
https://dansblog.netlify.app/posts/2023-01-21-markov/markov.html

#GaussianProcesses #machinelearning

Un garçon pas comme les autres (Bayes) - Markovian Gaussian processes: A lot of theory and some practical stuff

Well this is gonna be technical. And yes, I’m going to define it three ways. Because that’s how comedy works.
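One concrete way to see the Markov property in the ordinary linear-time setting (a toy check of mine, not from the post, which is about the harder non-linear-time case): for the Ornstein–Uhlenbeck / Matérn-1/2 kernel the precision matrix is tridiagonal, i.e. each point is conditionally independent of everything but its neighbours, while the non-Markovian RBF kernel gives a dense precision.

```python
import numpy as np

x = np.linspace(0, 10, 8)

# Ornstein-Uhlenbeck / Matern-1/2 kernel: k(s, t) = exp(-|s - t|)
K_ou = np.exp(-np.abs(x[:, None] - x[None, :]))
P = np.linalg.inv(K_ou)

# Zero out the tridiagonal band; what remains should be numerically zero
off_band = P.copy()
for d in (-1, 0, 1):
    off_band -= np.diag(np.diag(P, d), d)
print("OU max off-tridiagonal precision entry:", np.abs(off_band).max())

# Contrast: the RBF kernel is not Markov, so its precision is dense
K_rbf = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
P_rbf = np.linalg.inv(K_rbf)
```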

Watch our own @sethaxen summarize our recent #NeurIPS2022 workshop paper on modeling European #paleoclimate using #GaussianProcesses!

https://youtu.be/ZFiJHmZbpZA

@ml4science @unituebingen @sommer @alvaro

Spatiotemporal Modeling of European Paleoclimate using Doubly Sparse Gaussian Processes - NeurIPS 2022


👋 This is my first time attending @NeuripsConf (virtually to reduce carbon emissions).

On Friday I'll join the workshop "Gaussian Processes, Spatiotemporal Modeling, and Decision-making Systems," where we have a paper, poster, and lightning talk on GPs for modeling #paleoclimate.

If you're attending and want to chat about #GaussianProcesses, probabilistic programming (#ProbProg), or @ArviZ, ping me!

#NeurIPS2022