I am taking a class on Probabilistic Graphical Models (PGMs) this semester and we have a final project which can be a breadth lit review on a topic or a research project.

Does anyone know of some cool work that combined PGMs or PGM methods (e.g., inference, parameter estimation, learning with partial observations) with #ComputationalNeuroscience or maybe #DecisionMaking models? Ideally with a focus on methods, algorithms, or simulations.

I'm looking for a starting point for digging through the literature to see if anything catches my attention.

Thanks in advance!

#ProbabilisticGraphicalModels #Neuroscience #ExactInference #VariationalInference #CausalInference #SamplingInference #MCMC #ParameterEstimation #StructureLearning #MarkovNetworks #BayesNetworks

Towards Improved Learning in Gaussian Processes: The Best of Two Worlds

Gaussian process training decomposes into inference of the (approximate) posterior and learning of the hyperparameters. For non-Gaussian (non-conjugate) likelihoods, two common choices for approximate inference are Expectation Propagation (EP) and Variational Inference (VI), which have complementary strengths and weaknesses. While VI's lower bound to the marginal likelihood is a suitable objective for inferring the approximate posterior, it does not automatically follow that it is also a good learning objective for hyperparameter optimization. We design a hybrid training procedure in which inference leverages conjugate-computation VI and learning uses an EP-like marginal likelihood approximation. We empirically demonstrate on binary classification that this provides a good learning objective and generalizes better.
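To make the "learning of the hyperparameters" half concrete, here is a minimal sketch of the simpler conjugate case (Gaussian likelihood), where the marginal likelihood is available in closed form and can be used directly as the learning objective. This is an illustration I wrote, not code from the paper: the paper's contribution is precisely about replacing this exact objective with VI- or EP-based approximations when the likelihood is non-Gaussian (e.g., binary classification). All function names and the tiny grid search are my own illustrative choices.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale, variance):
    # Squared-exponential kernel k(x, x') = s^2 * exp(-(x - x')^2 / (2 l^2))
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def log_marginal_likelihood(x, y, lengthscale, variance, noise):
    # Exact GP-regression evidence:
    #   log p(y | theta) = -1/2 y^T K_y^{-1} y - 1/2 log|K_y| - n/2 log(2 pi)
    # with K_y = K + noise * I. Computed via a Cholesky factor for stability.
    n = len(x)
    K = rbf_kernel(x, x, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))      # = -1/2 log|K_y|
            - 0.5 * n * np.log(2 * np.pi))

# Toy data: a smooth latent function plus observation noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 40)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(40)

# "Learning" step: choose the lengthscale that maximizes the marginal
# likelihood (a coarse grid search stands in for gradient-based optimization).
grid = [0.05, 0.5, 5.0]
best = max(grid, key=lambda l: log_marginal_likelihood(x, y, l, 1.0, 0.01))
print(best)
```

The point of the paper, in these terms, is that once the likelihood is non-Gaussian this objective no longer has a closed form: VI gives a lower bound on it (good for inferring the approximate posterior, not necessarily for picking hyperparameters), while an EP-style approximation of the same quantity can serve as the learning objective.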

arXiv.org