Language models have become privately controlled research infrastructure. This week, OpenAI deprecated the Codex model that ~100 papers have used—with 3 days’ notice. It has said that newer models will only be stable for 3 months. Goodbye reproducibility!

It'll be interesting to see how developers can use these models in production if things break every few months.
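One way researchers and developers try to cope is by pinning an exact model identifier and failing loudly when it disappears, rather than silently letting results drift. A minimal sketch of that idea (all names and the fallback policy here are hypothetical, not any particular API):

```python
# Illustrative sketch: guard a pinned model ID against deprecation so a
# pipeline fails loudly instead of silently switching models.

def resolve_model(pinned, available, fallback=None):
    """Return the pinned model if still served; otherwise fall back or raise."""
    if pinned in available:
        return pinned
    if fallback is not None and fallback in available:
        # Record the substitution so results aren't mistaken for the original model's.
        print(f"warning: {pinned} deprecated; substituting {fallback}")
        return fallback
    raise RuntimeError(f"model {pinned} is no longer served; exact replication is impossible")

# Example: the Codex model is removed from the serving set.
served = {"gpt-3.5-turbo"}
print(resolve_model("code-davinci-002", served, fallback="gpt-3.5-turbo"))
```

Even with a fallback, the substituted model can behave very differently, which is exactly the reproducibility problem the post describes.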

By @sayashk and me on the AI Snake Oil book blog: https://aisnakeoil.substack.com/p/openais-policies-hinder-reproducible
