Handbook of Markov chain Monte Carlo, second edition
The second edition of the Handbook of Markov chain Monte Carlo (MCMC) has been edited, with leading authors contributing chapters that are being released on arXiv. The chapter on running MCMC on modern hardware and software is particularly recommended, though it is a pity that probabilistic programming, sequential Monte Carlo, and divide-and-conquer algorithms are not covered. The book is a useful reference for MCMC researchers and developers.

https://statmodeling.stat.columbia.edu/2026/05/06/handbook-of-markov-chain-monte-carlo-second-edition/

#markovchainmontecarlo #mcmc #montecarlo #probabilisticprogramming #statisticalcomputing

Handbook of Markov chain Monte Carlo, second edition | Statistical Modeling, Causal Inference, and Social Science

I have spent some time cleaning up my home-grown #Bayesian inference library for public consumption. Enjoy:
https://codeberg.org/wasowski/probula

The story goes that I needed a pure Scala 3 replacement for #Figaro that I could use for teaching. The status is:
- Probula can handle regression models
- Importance sampling, with a monadic-style implementation
- Very basic descriptive stats built in
- CSV export for ArviZ, to perform posterior analysis
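A monadic-style importance sampler, as in the list above, can be surprisingly small. The sketch below is in Python, not Scala, and is not Probula's actual API; the names `Dist`, `flat_map`, `score`, and `posterior_mean` are all illustrative:

```python
import random

class Dist:
    """Weighted sampler: sample() returns a (value, importance weight) pair."""
    def __init__(self, sample):
        self.sample = sample

    def flat_map(self, f):
        # Monadic bind: draw from self, feed the value to f, multiply weights.
        def bound():
            x, w1 = self.sample()
            y, w2 = f(x).sample()
            return y, w1 * w2
        return Dist(bound)

def pure(x):
    return Dist(lambda: (x, 1.0))

def uniform(a, b):
    return Dist(lambda: (random.uniform(a, b), 1.0))

def score(w):
    # Conditioning: contributes likelihood weight w, returns a unit value.
    return Dist(lambda: (None, w))

def posterior_mean(dist, n=100_000):
    # Self-normalized importance sampling: weighted average of the draws.
    draws = [dist.sample() for _ in range(n)]
    total = sum(w for _, w in draws)
    return sum(x * w for x, w in draws) / total

# Uniform(0, 1) prior on theta, one observed success with likelihood theta:
# the posterior is Beta(2, 1), whose mean is 2/3.
random.seed(0)
model = uniform(0.0, 1.0).flat_map(
    lambda theta: score(theta).flat_map(lambda _: pure(theta)))
mean = posterior_mean(model)  # approximately 2/3
```

The key design choice, which Scala shares, is that `flat_map` threads the accumulated weight through the computation, so conditioning composes like any other sampling step.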

#probula #scala #oss #ProbabilisticProgramming #foss

Recently, we discussed @kach's paper on self-inferring probabilistic programs in our group's journal club. This inspired me to dive into the implementation of probabilistic programming languages and to try it out myself using #Gleam.

I've put together a blog article about that:
https://a5s.eu/blog/gleam-ppl/

And here is the code:
https://codeberg.org/andreas-k/tinypp

#ProbabilisticProgramming

Tiny Gleam PPL

📢 Episode 126 is Live!

🎧 Listen now 👉 https://learnbayesstats.com/episode/126-mmm-clv-bayesian-marketing-analytics-will-dean

🎙️ In this episode with Alex Andorra, Will Dean from PyMC Labs explains how Bayesian methods are reshaping marketing analytics, from MMM to CLV estimation and more.

#BayesianMarketing #MMM #CLV #MarketingAnalytics #MachineLearning #ProbabilisticProgramming #DataScience #PyMC #Marketing

Learning Bayesian Statistics – Laplace to be for new & veteran Bayesians alike!


Made an introductory 📕 (draft) about using Python for Bayesian inference, unifying narrative, math, and code. People seem to find it helpful. Check it out; feedback is encouraged.

https://persuasivepython.com

#DataScience #Python #bayes #Stats #probabilisticprogramming

Persuasive Python

I’m in #Vancouver for the next month. If someone wants to meet for chats on #probabilistic #MachineLearning, #ProbabilisticCircuits, or #ProbabilisticProgramming, feel free to DM me!
Automatic differentiation in Prolog. ~ Tom Schrijvers, Birthe van den Berg, Fabrizio Riguzzi. https://arxiv.org/abs/2305.07878 #Prolog #LogicProgramming #AutomaticDifferentiation #ProbabilisticProgramming
Automatic Differentiation in Prolog

Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function's (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.
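The paper derives its AD variants in Prolog from an executable specification; as a language-agnostic illustration of what forward-mode AD computes, here is a minimal dual-number sketch in Python (not the paper's code, and the names `Dual` and `derivative` are assumptions of this sketch):

```python
class Dual:
    """Forward-mode AD: carry a value and its derivative together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed the input with derivative 1.0 and read off f'(x).
    return f(Dual(x, 1.0)).der

# f(x) = x^2 + 3x  =>  f'(2) = 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Evaluating `f` on `Dual` inputs propagates derivatives alongside values by overloaded arithmetic, which is the same idea the paper expresses declaratively in Prolog.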

Last day of #bayescomp2023, I very much enjoyed yesterday’s panel on #ProbabilisticProgramming. Looking forward to today’s schedule.
This one became a nice example of implementing a custom CUDA kernel through various stages of optimization #CUDA #GPU #ProbabilisticProgramming https://indii.org/blog/sum-of-discrete/
Sums of Discrete Random Variables as Banded Matrix Products

A zero-stride catch and a custom CUDA kernel.

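The math underlying the post, namely that the pmf of a sum of independent discrete random variables is the convolution of their pmfs, which can equivalently be written as a banded matrix-vector product, can be sketched in plain Python. The blog's optimized CUDA kernel is not reproduced here, and `convolve_pmf`, `band_matrix`, and `matvec` are illustrative names:

```python
def convolve_pmf(p, q):
    """PMF of X + Y for independent X ~ p and Y ~ q on {0, 1, ...}."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj  # P(X + Y = i + j) accumulates p_i * q_j
    return r

def band_matrix(p, m):
    """Banded matrix B with B[k][j] = p[k - j], so that B @ q = p * q (conv)."""
    n = len(p) + m - 1
    return [[p[k - j] if 0 <= k - j < len(p) else 0.0 for j in range(m)]
            for k in range(n)]

def matvec(B, q):
    return [sum(b_kj * q_j for b_kj, q_j in zip(row, q)) for row in B]

# Sum of two fair coin flips (each 0 or 1): pmf is [0.25, 0.5, 0.25].
coin = [0.5, 0.5]
direct = convolve_pmf(coin, coin)
banded = matvec(band_matrix(coin, len(coin)), coin)
print(direct)  # [0.25, 0.5, 0.25]
```

The banded-matrix form is what makes the operation map nicely onto batched GPU matrix routines, which is the direction the post takes.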

I am looking for
a) examples of tools that let you build statistical models more complex than just variations of a single model class (like most stat packages: brms, lavaan, ...) but less complex than fully fledged probabilistic programming languages
b) Probabilistic programming languages that neatly support composing non-trivial submodels together

Does anyone have recs?
In both cases I am coming up almost empty-handed...

#stan #ppl #ProbabilisticProgramming #brms