Now Chris Williams is presenting a principled probabilistic way to train #capsule networks (originally by @geoffreyhinton) as generative models.

A variational approach to training part-based generative models.

#ANC #Seminar #ML #AI #probabilistic #generative #modeling

@nolovedeeplearning Cool! I thought the research on capsule networks had died out since apparently they are expensive to train and don't provide enough benefits compared to CNNs.

@mathieualain not true: the biggest benefit is having a clear part-based decomposition.

The more structure, the fewer examples you need.

@nolovedeeplearning Interesting! I need to learn more about them then šŸ˜…
@mathieualain @nolovedeeplearning we need models that "compartmentalize" knowledge locally, while keeping it distributed for robustness reasons. It's a tough problem, and little is known about how and whether this happens in existing networks.

@emtiyaz @mathieualain

Totally agree!

Looking forward to hearing from you @emtiyaz on this recent paper, where you can have a #differentiable #layer that encodes #logical #constraints and provably #guarantees that the predictions of a #neural net satisfy the constraint!

By making everything modular and compartmentalized (plug&play) you can easily integrate #symbolic and #neural #reasoning (toy sketch after the link below).

https://openreview.net/forum?id=o-mxIWAY1T8

Semantic Probabilistic Layers for Neuro-Symbolic Learning

We design a predictive layer for structured-output prediction (SOP) that can be plugged into any neural network guaranteeing its predictions are consistent with a set of predefined symbolic...

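To make the guarantee concrete, here is a toy sketch of my own (hypothetical names and constraint; NOT the paper's SPL construction, which uses probabilistic circuits to avoid the exponential enumeration): brute-force the label configurations, keep only those satisfying the constraint, and renormalize over them, so the prediction is valid by construction.

```python
import itertools
import torch

def satisfies(y):
    # Example (made-up) constraint: y[0] -> y[1], i.e. if label 0 is on,
    # label 1 must be on too.
    return (not y[0]) or y[1]

def semantic_layer(logits):
    # Map per-label logits to a distribution over *valid* label vectors only.
    # Brute force for illustration; the paper does this tractably with circuits.
    n = logits.shape[-1]
    valid = [y for y in itertools.product([0, 1], repeat=n) if satisfies(y)]
    ys = torch.tensor(valid, dtype=torch.float)   # (num_valid, n)
    scores = ys @ logits                          # factorized log-scores
    probs = torch.softmax(scores, dim=0)          # renormalize over valid set
    return ys, probs

logits = torch.tensor([1.2, -0.3, 0.5])           # hypothetical net outputs
ys, probs = semantic_layer(logits)
print(ys[probs.argmax()])  # most probable configuration, guaranteed valid
```

Being a softmax over a fixed valid set, this toy layer is differentiable in the logits; the paper's contribution is achieving the same guarantee tractably and expressively with circuits.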

@nolovedeeplearning @mathieualain thanks for sharing. It's a beautifully written paper. Congratulations!

I wish I could understand sec 3 better, specifically how PCs enable tractable inference compared to e.g. decomposable PGMs (I used to work on those earlier). I hope someday I get time to do it. Is there a tutorial somewhere for people like me who know PGMs well?

@emtiyaz @mathieualain

tl;dr: classical #PGMs can be compiled into #circuits (#PCs) and then #overparameterized to increase #expressiveness while retaining #tractability.
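To make the tractability point concrete, a minimal sketch (my own toy example with made-up parameters, not from any paper): a decomposable PC over two binary variables, where marginalizing a variable just sets its leaf to 1, so any marginal costs a single bottom-up pass, linear in circuit size.

```python
# Toy probabilistic circuit: a sum node mixing two product nodes,
# each factorizing over the disjoint scopes {X1} and {X2} (decomposability).
weights = [0.6, 0.4]   # sum-node mixture weights (sum to 1)
p_x1 = [0.9, 0.2]      # Bernoulli leaf params: P(X1=1 | component k)
p_x2 = [0.8, 0.3]      # Bernoulli leaf params: P(X2=1 | component k)

def leaf(theta, value):
    # Bernoulli leaf; value=None marginalizes the variable out,
    # in which case the leaf evaluates to 1.
    if value is None:
        return 1.0
    return theta if value == 1 else 1.0 - theta

def circuit(x1, x2):
    # One bottom-up pass: sum_k w_k * leaf_k(X1) * leaf_k(X2).
    return sum(w * leaf(t1, x1) * leaf(t2, x2)
               for w, t1, t2 in zip(weights, p_x1, p_x2))

print(circuit(1, 1))     # joint query P(X1=1, X2=1)
print(circuit(1, None))  # marginal P(X1=1): same single pass, no blow-up
```

Overparameterizing then means adding many more sum/product nodes with the same structural properties (decomposability, smoothness), which increases expressiveness while keeping these queries linear-time.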

A gentle introduction for #probabilistic and #PGM folks is this paper https://web.cs.ucla.edu/~guyvdb/papers/ProbCirc20.pdf

A companion #video #tutorial is here https://www.youtube.com/watch?v=2RAG5-L9R70

I will do a new version of this tutorial with YooJung Choi and Robert Peharz at #NeurIPS2022 in a month!

Showing the latest advancements in #reliable #inference with #PCs!

@nolovedeeplearning @mathieualain thanks. Just read the intro and it sounds magical. I had always thought it should be possible to convert intractable PGMs into much larger tree-like graphs where linear-time inference is possible, and this sounds like exactly that. I want to learn more.

If you have time, please come to Japan to visit and teach us more. I think using PCs as posterior candidates is a straightforward idea I have been dreaming of for many years.

@emtiyaz @mathieualain thanks for the invitation, super appreciated!

Let's chat via Zoom one of these days? (or will you be at #NeurIPS2022 by any chance?)

@nolovedeeplearning @mathieualain not there, but yes, let's try Zoom sometime (I tend to forget though, so pardon me if I don't get back right away).