#bergetai just announced a service that is very interesting to me: inference using powerful models like Kimi K2.6.
They previously released Mistral 3.5 & Gemma 4 as their moderate and light models, which I assume was done to build know-how in running good inference endpoints.
Hope to see more of this! Local models and more providers mean less lock-in, turning inference into a commodity rather than putting all our eggs in the OpenAI/Anthropic basket.
