@carnage4life
I think this part of the article is where we're heading (based on personal experience working with smaller models and seeing their progress too):
> To hedge against that risk, Cursor is betting on its own models. A team of about 20 AI researchers is developing the Composer model family, built on open-source foundations like DeepSeek, Kimi, and Qwen, then fine-tuned with proprietary data and reinforcement learning. Version 2.0 marked the release of Cursor's first homegrown coding model. Composer 1.5 is fast, the second-most popular model on the platform, and significantly cheaper for Cursor to run than paying for Anthropic's large models.
I saturated my dev brain a while ago, so I think local/open-source models, plus a return to sounder engineering practices instead of everybody losing their mind burning tokens at random, is where we're headed.
#llm #llms