Karen Ullrich

477 Followers
68 Following
9 Posts
Research scientist at FAIR NY + collab w/ Vector Institute. ❤️ Machine Learning + Information Theory. Previously, PhD at UoAmsterdam, intern at DeepMind + MSRC.
Website: https://karenullrich.info/
Google Scholar: https://scholar.google.com/citations?user=TMIPmNAAAAAJ&hl=en&oi=ao
🤩 Tomorrow the #ICML2023 workshops begin. 🎙️ Join me at 9 AM, where I'll discuss absolutely all there is to know about #BitsBackCoding @ the Structured Probabilistic Inference & Generative Modeling workshop.
#AI #ML #DataCompression

RT @neural_compress
The 2nd iteration of the "Neural Compression: From Information Theory to Applications" workshop will take place @icmlconf in Hawaii this year!

Submissions due May 27th. For more details: https://neuralcompression.github.io/workshop23

@BerivanISIK @yiboyang @_dsevero @karen_ullrich @robamler @s_mandt


If you are interested in serving as a reviewer, please fill out the form: https://docs.google.com/forms/d/e/1FAIpQLSd3L9_o7vAZUSWjWMxi18jZHuIrBaafUBm6v1fTZQorK2o9Qw/viewform
The review period will be May 26th–June 9th, with emergency reviews happening the week after. 🏆 2 free ICML 2023 Workshop registrations will be given as "Best Reviewer Awards" 🏆


Our #ICLR2023 workshop on Physics4ML is open for submissions. Deadline: 3rd February.

Submit your work on physics-based ML, equivariance, etc here: https://openreview.net/group?id=ICLR.cc/2023/Workshop/Physics4ML

More info:
https://physics4ml.github.io/

https://mobile.twitter.com/tk_rusch/status/1610305901558210563

#Physics4ML #AI4Science #GeometricDeepLearning


🚨 Internship opportunity🚨

If you are interested in information theory, generative modeling, and AI4Science, let's learn and explore together @ FAIR New York.

I have one internship spot for 2023. Apply by Jan 12 via the website, and mind the minimum requirements!

Link to the application form -> https://www.metacareers.com/jobs/901899764520819/
Additionally send me an email (karenu@) with subject: [Internship 2023] YOUR NAME
Talk about who you are and what we could work on together. Please no DMs.

Research Scientist Intern, AI Core Machine Learning (PhD)


📢 Neural Compression Enthusiasts @ #Neurips2022: Tue, Nov 29th, 3:30 PM, Room 282 inside the Convention Center.

Let’s meet, chat, and get inspired!
Hope to see ya there ❤️

https://www.eventbrite.com/e/neurips-2022-neural-compression-meetup-tickets-471377230987

This is no corporate event: no fancy food, no preparations, it's just us in a room. Bring your own coffee (BYOC). Maybe I can hook up the stereo with my playlist; that's as fancy as it gets. Also, no registration required.

@dpkingma @stephan_mandt @yiboyang

This is an informal meetup for anyone interested in generative models, probabilistic machine learning and compression.

📢 📢 New Feature in #NeuralCompression repo: Bits-Back compression for diffusion models!
Compress image data 🖼️ using diffusion models at an effective rate close to the (negative) ELBO.

See: https://github.com/facebookresearch/NeuralCompression/tree/main/projects/bits_back_diffusion

Some context ⏩ [1/3]
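As a rough illustration of the rate claim above (this is a toy sketch, not the repo's code, and the model numbers are made up), bits-back coding lets the sender first *decode* a latent z from auxiliary bits using q(z|x), recovering log2 q(z|x) bits, then encode x with p(x|z) and z with p(z). The expected net rate is the negative ELBO in bits:

```python
import math
import random

def gauss_logpdf(x, mu, sigma):
    """Log-density of N(mu, sigma^2) at x, in nats."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

# Hypothetical 1-D latent-variable model: prior p(z) = N(0, 1),
# likelihood p(x|z) = N(z, 0.5^2), approx. posterior q(z|x) = N(0.8x, 0.6^2).
x = 1.3
mu_q, sigma_q = 0.8 * x, 0.6

# Monte Carlo estimate of the expected net bits-back rate.
random.seed(0)
n = 100_000
total_nats = 0.0
for _ in range(n):
    z = random.gauss(mu_q, sigma_q)           # latent the decoder would produce
    net = (-gauss_logpdf(x, z, 0.5)           # bits spent coding x given z
           - gauss_logpdf(z, 0.0, 1.0)        # bits spent coding z under the prior
           + gauss_logpdf(z, mu_q, sigma_q))  # bits recovered via q(z|x)
    total_nats += net

rate_bits = total_nats / n / math.log(2)
print(f"expected net rate ≈ {rate_bits:.2f} bits (negative ELBO in bits)")
```

The same accounting carries over when the latent is a whole diffusion trajectory, which is what makes the rate approach the (negative) ELBO of the diffusion model.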

Explore the connection between diffusion models and optimal control 🔥
🎙️ Come to our oral at the #NeurIPS workshop on score-based methods and let’s discuss how one field can benefit from the other.
📖 http://bit.ly/3UAhena
Great work with Lorenz Richter and @karen_ullrich