Patrick Emami

machine learning postdoc @ NREL, Ph.D. @ UF. 
deep generative models | reinforcement learning | ML ∩ climate change 🔋🌳 🌎
website: https://pemami4911.github.io
github: https://github.com/pemami4911

Not necessarily! We use gradients to craft an efficient proposal distribution for sampling from a high-dimensional, discrete product of experts.

For example, this enables us to do things like maximize the sum of two binary MNIST digits just by flipping binary pixels:

3/
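A minimal sketch of the kind of gradient-informed bit-flip proposal described above, in the style of Gibbs-with-Gradients: a first-order Taylor estimate of the score change from flipping each bit shapes the proposal, and a Metropolis-Hastings correction keeps the chain exact. The quadratic "expert" here is a toy stand-in (an assumption), not the actual MNIST or protein models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a product-of-experts score: a single quadratic
# expert over 16 binary variables (hypothetical, for illustration).
W = rng.normal(size=(16, 16))
W = (W + W.T) / 2

def log_prob(x):
    # Combined log-score of the experts; here just one quadratic term.
    return x @ W @ x

def grad_log_prob(x):
    # Gradient of the score, treating binary x as if it were continuous.
    return 2 * W @ x

def flip_proposal(x):
    # First-order estimate of the score change from flipping each bit i:
    # delta_i ≈ grad_i * (1 - 2 x_i). Softmax over delta/2 gives the
    # Gibbs-with-Gradients-style proposal over which bit to flip.
    delta = grad_log_prob(x) * (1 - 2 * x)
    logits = delta / 2
    p = np.exp(logits - logits.max())
    return p / p.sum()

def gwg_step(x):
    """One gradient-informed bit-flip step with an MH correction."""
    p = flip_proposal(x)
    i = rng.choice(len(x), p=p)
    x_new = x.copy()
    x_new[i] = 1 - x_new[i]
    p_rev = flip_proposal(x_new)  # reverse-proposal prob for MH ratio
    log_accept = (log_prob(x_new) - log_prob(x)
                  + np.log(p_rev[i]) - np.log(p[i]))
    return x_new if np.log(rng.random()) < log_accept else x

# Run a short chain from a random binary start.
x = rng.integers(0, 2, size=16).astype(float)
for _ in range(200):
    x = gwg_step(x)
```

Because the proposal concentrates on bits whose flip is estimated to raise the score, the chain moves far faster than uniform random flipping, while the MH step corrects for the first-order approximation.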

Combining multiple models in sequence space is straightforward if we treat each as one expert in a product of experts, as in energy-based models.

"But isn't directed evolution just brute-force or random search over mutations?" 🤔

2/

New pre-print!

Plug and play generation works for images and text…what about proteins?

We engineer proteins by combining your favorite unsupervised and supervised protein sequence models (even protein language models!) in a fast *gradient-based* discrete MCMC sampler.

🧵

Will be at @workshopmlsb today presenting our poster “Plug and Play Directed Evolution of Proteins with Gradient-Based Discrete MCMC”

We mix and match pre-trained protein models to do search with fast discrete MCMC :)

https://www.mlsb.io/papers_2022/Plug_Play_Directed_Evolution_of_Proteins_with_Gradient_based_Discrete_MCMC.pdf