Now Chris Williams is presenting a principled probabilistic way to train #capsule networks (originally by @geoffreyhinton) as generative models.
A variational approach to training part-based generative models.
@mathieualain not true, the biggest benefit is having a clear part-based decomposition.
The more structure, the fewer examples you need
Totally agree!
Looking forward to hearing from you @emtiyaz on this recent paper, where you can have a #differentiable #layer that encodes #logical #constraints and provably #guarantees that the predictions of a #neural net satisfy the constraint!
By making everything modular and compartmentalized (plug&play), you can easily integrate #symbolic and #neural #reasoning
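(A hypothetical sketch of the general idea, not the paper's actual construction: one way to *guarantee* that a logical constraint such as A → B holds is to build it into the parameterization itself, so no logit values can ever violate it. The names `constrained_probs`, `z_a`, `z_b` are illustrative assumptions.)

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative sketch (not the paper's method): enforce the implication
# constraint A -> B by construction. Rather than predicting p(B) freely,
# reparameterize it into the interval [p(A), 1], so the constraint holds
# for *any* real-valued logits z_a, z_b -- a guarantee, not a penalty.
def constrained_probs(z_a, z_b):
    p_a = sigmoid(z_a)
    # p_b is squeezed into [p_a, 1]; whenever A is likely, B is at least as likely.
    p_b = p_a + (1.0 - p_a) * sigmoid(z_b)
    return p_a, p_b

pa, pb = constrained_probs(2.0, -3.0)
assert pb >= pa  # holds by construction, whatever the logits are
```

Because the mapping is differentiable, such a layer can be dropped into a standard network and trained end to end, which is what makes the plug&play integration of symbolic constraints possible.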
@nolovedeeplearning @mathieualain thanks for sharing. It's a beautifully written paper. Congratulations!
I wish I could understand sec 3 better. Specifically how PCs enable tractable inference, compared to e.g. decomposable PGMs (I used to work on that earlier). I hope someday I get time to do it. Is there a tutorial somewhere for people like me who know PGMs well?
@emtiyaz @nolovedeeplearning @mathieualain Antonio and others gave a great tutorial on PCs at various venues. There is a recording of one of them on YouTube. PCs are very accessible for people with a PGM background.
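(A minimal illustrative sketch of the tractability point discussed above, under my own assumptions: in a smooth, decomposable probabilistic circuit, any marginal is computed by one feed-forward pass, with marginalized leaves evaluating to 1, instead of an explicit sum over variable states. The toy circuit below is a mixture of two fully factorized components over two binary variables.)

```python
def leaf(p_true, value):
    # Bernoulli leaf unit. value=None marginalizes the variable out:
    # the leaf then evaluates to 1 (it sums its density over both states).
    if value is None:
        return 1.0
    return p_true if value == 1 else 1.0 - p_true

def pc(x1, x2):
    # Tiny smooth, decomposable PC: a sum node over two product nodes,
    # each product decomposing over disjoint variable scopes {X1} and {X2}.
    comp_a = leaf(0.9, x1) * leaf(0.2, x2)
    comp_b = leaf(0.1, x1) * leaf(0.7, x2)
    return 0.4 * comp_a + 0.6 * comp_b

# The joint is a valid distribution: probabilities sum to 1.
total = sum(pc(a, b) for a in (0, 1) for b in (0, 1))
assert abs(total - 1.0) < 1e-12

# Tractable marginal p(X1=1): a single bottom-up pass with X2 set to None,
# matching the explicit enumeration over X2's states.
assert abs(pc(1, None) - (pc(1, 0) + pc(1, 1))) < 1e-12
```

The same single-pass trick covers any marginal or conditional query, which is the structural property that sets PCs apart from general (even decomposable) PGMs, where such queries can require nontrivial message passing.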