Seth Axen 🪓

@sethaxen
31 Followers
38 Following
23 Posts
probabilistic programmer (#probprog), #FOSS contributor and advocate, #JuliaLang developer, bringing #MachineLearning to the sciences at the ML ⇌ Science Colab @unituebingen. he/him
Website: https://sethaxen.com
GitHub: https://github.com/sethaxen
Twitter: https://twitter.com/sethaxen

Hello, I am a statistician at Chalmers University of Technology and University of Gothenburg!

I work on scientific #machinelearning problems taking a Bayesian #statistics and causal inference perspective and contribute to #JuliaLang.

#Introduction

@johnryan Yeah I do #probprog in #JuliaLang, and it's great that we can use arbitrary Julia code within our models. That works because most of the language is differentiable with #autodiff and Julia code composes freely, which isn't the case for most PPLs.

For #deeplearning research, Julia could come in handy for writing and transforming custom kernels without fussing with CUDA, as some posts in that thread note, but I have no experience with this.
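To illustrate the composability point above, here's a hedged sketch: a toy forward-mode dual number in plain Julia. Real #autodiff packages (e.g. ForwardDiff.jl) are far more complete; this only shows the mechanism by which generic Julia code, loops and all, differentiates without modification. All names here (`Dual`, `poly`, `derivative`) are illustrative, not from any library.

```julia
# Minimal forward-mode dual number: carries a value and a derivative.
# This is a toy; real AD packages handle many more operations and types.
struct Dual <: Number
    val::Float64
    der::Float64
end
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.convert(::Type{Dual}, x::Real) = Dual(x, 0.0)

# An arbitrary generic function with a loop; it never mentions Dual.
function poly(x)
    acc = one(x)
    for _ in 1:3
        acc = acc * x
    end
    return acc + x  # computes x^3 + x
end

# Seeding der = 1 propagates d/dx through every operation.
derivative(f, x) = f(Dual(x, 1.0)).der

derivative(poly, 2.0)  # d/dx (x^3 + x) at x = 2 is 3*4 + 1 = 13.0
```

Because `poly` is written generically, the same source code runs on `Float64` for values and on `Dual` for derivatives; that's the composability a PPL in Julia gets for free.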

@spinkney Nicely done! Happy Halloween! 🎃

@johnryan This blog post reflects one (important) perspective. The community had a large discussion about this earlier this year: https://discourse.julialang.org/t/state-of-machine-learning-in-julia/74385

My takeaway is "it depends." If you're doing classical ML, Julia is usually fine. For cutting-edge DL, it's generally not as mature as Python yet. For research/nonstandard ML, Julia shines.

#julialang #machinelearning

State of machine learning in Julia

I’ll offer a perspective from someone who (as a conscious choice) primarily uses Python over Julia. I work with, and maintain libraries for, all of PyTorch, JAX, and Julia. For context my answers will draw some parallels between: JAX, with Equinox for neural networks; Julia, with Flux for neural networks; as these actually feel remarkably similar. JAX and Julia are both based around jit-compilers; both ubiquitously perform program transforms via homoiconicity. Equinox and Flux both build mod...

Julia Programming Language

In ArviZ.jl we store inference results (especially #MCMC draws) as InferenceData. It's built on DimensionalData, so we have multidimensional real arrays with named dimensions. Each array element is a marginal of a random draw, which is a useful format for plotting, #statistics, and diagnostics, but sometimes it's useful to get back to a structure more like what a PPL might emit.

Surprisingly, we can get pretty close with just 8 lines of code:
https://github.com/arviz-devs/InferenceObjects.jl/issues/27
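The idea can be sketched in plain Julia (this is a hedged illustration, not the actual InferenceObjects.jl code from the issue above; `draw_namedtuple` and the `(chain, draw, ...)` dimension order are assumptions for the example):

```julia
# Given a NamedTuple of arrays whose first two dimensions are
# (chain, draw), extract one draw as a NamedTuple of values:
# scalars for 2-d arrays, arrays for higher-dimensional ones.
function draw_namedtuple(vars::NamedTuple, chain::Int, draw::Int)
    return map(vars) do arr
        # index (chain, draw), then take all remaining dimensions
        arr[chain, draw, ntuple(_ -> Colon(), ndims(arr) - 2)...]
    end
end

# 4 chains × 100 draws; μ is scalar per draw, τ is a length-3 vector
vars = (μ = randn(4, 100), τ = rand(4, 100, 3))
nt = draw_namedtuple(vars, 1, 10)
# nt.μ is a scalar, nt.τ is a 3-element vector
```

`map` over a `NamedTuple` preserves the keys, which is what makes the real version so compact.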

#probprog #foss #JuliaLang

Utility for unflattening Datasets · Issue #27 · arviz-devs/InferenceObjects.jl

The natural way to represent a draw from a posterior distribution is as a NamedTuple whose keys are parameter names and whose values are the values. The values can be scalars, arrays, or arbitrary ...

GitHub
@cameron_pfiffer Seriously, and it's about the only day where it's normal for random neighbors to visit each other. In my old neighborhood, band members all lived in the same house, and every Halloween hundreds of people would fill the streets to watch them perform a live show on their lawn. No other holiday comes close!
@PhilippHennig Ah, good to know for next year! We live at the bottom of the hill and have seen almost no decorations and no trick-or-treaters. We'll have to explore the top of the hill next time!
@cameron_pfiffer and easy to support him via https://opencollective.com/mastodon or Patreon
Mastodon - Open Collective

Mastodon is a free, open-source social network server based on ActivityPub

Just so y'all know, the lead Mastodon guy does this for a hilariously small amount of money. OSS is something else

https://mastodon.social/@Gargron/109260715240000670

📢PARENTS📢

Make SURE you check your children's candy carefully this Halloween🍬

I just found HETEROSKEDASTICITY in this Snickers bar😲😱