David Pfau

@pfau
6 Followers
193 Following
51 Posts
So far I have not found the science, but the numbers keep on circling me. Views are very much my own.
Website: http://davidpfau.com
For those heading to NeurIPS - I'll be speaking at the ML and Physical Sciences workshop on Saturday, bright and early (8 AM!) about our work on deep learning for quantum chemistry and condensed matter physics. Hope to see you there! http://ml4physicalsciences.github.io/2022 #neurips2022
Machine Learning and the Physical Sciences, NeurIPS 2022

Website for the Machine Learning and the Physical Sciences (MLPS) workshop at the 36th Conference on Neural Information Processing Systems (NeurIPS)

Very disappointing to see the discourse sliding from "large language models don't work as well as the hype suggests" to "large language models are inherently dangerous". Let's not get so full of ourselves that we act like we're handling plutonium here.
@jasonpjason Probably a good place to rebuild your network in case Twitter really does go down, and then presumably some people will build a viable replacement using Mastodon's code as the starting point.
@alemi The Stochastic Variational Inference paper in JMLR by Matt Hoffman has a very nice overview of how to do natural gradient descent with exponential family distributions. Just a very elegant paper overall.
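The identity that makes this elegant: for an exponential family, the Fisher information in natural parameters is the Hessian of the log-partition function, so the natural gradient with respect to the natural parameter is just the plain gradient with respect to the mean parameter. A minimal sketch of that idea, using a Bernoulli with natural parameter eta = logit(p) as a toy example (this is an illustrative construction, not code from the paper):

```python
import numpy as np

def sigmoid(eta):
    return 1.0 / (1.0 + np.exp(-eta))

def natural_gradient_step(eta, dloss_dmu, lr=0.1):
    """One natural gradient step on a Bernoulli's natural parameter."""
    mu = sigmoid(eta)                      # mean parameter E[x]
    fisher = mu * (1.0 - mu)               # A''(eta): Fisher info for Bernoulli
    dloss_deta = dloss_dmu(mu) * fisher    # chain rule: dmu/deta = fisher
    # Natural gradient F^{-1} * dL/deta collapses to dL/dmu:
    return eta - lr * dloss_deta / fisher

# Toy loss: squared distance of the mean parameter from a target of 0.7.
dloss_dmu = lambda mu: 2.0 * (mu - 0.7)

eta = 0.0
for _ in range(200):
    eta = natural_gradient_step(eta, dloss_dmu)
# eta converges toward logit(0.7), i.e. sigmoid(eta) -> 0.7
```

The step size behaves sensibly regardless of how curved the sigmoid is at the current eta, which is the practical payoff of preconditioning by the Fisher matrix.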
This place reminds me quite a bit of '90s Slashdot, in that it's a mix of interesting stuff and people endlessly hating on the big corporate incumbent they are trying to displace.
...to be clear, I mean it's impressive how quickly Mastodon itself has grown. I'm not patting myself on the back here.
I've been on Mastodon less than a month, and just crossed 1000 followers. Pretty impressive! Now I just need to post about something other than Mastodon itself...
@dpkingma Lots of people seem to be heading there in the Twitter exodus instead of here.
Crap, am I gonna have to get a Discord account too?

I am dead, I can’t believe this AI intro text I just got. If you thumb through it, there are all the usual suspects like breadth-first search and probability.

Then you open the first chapter and it goes HARD on the current state of things.

Then it’s back to mathematical notation like nothing happened.