This really drives home something about LLM systems. They're very expensive to run, both to train and per query, and hard to make cheaper. I expect them to get more expensive to run. They're currently sold at a big loss to establish monopoly power, with the plan of raising prices dramatically later. That's the *stated* business plan.

If you're building your business to rely on LLMs, you need to factor in what you'll do when the providers pivot to making money, or pull back because they can't.
https://geeknews.chat/@theregister/112116266764229145

The Register (@[email protected])

Microsoft promises Copilot will be a 'moneymaker' in the long term

Exec tells investors to 'temper' expectations as mission to convince customers of price tag continues

Microsoft is asking investors to "temper" expectations for quick financial returns from Copilot amid efforts to convince customers that paying "substantial" sums each month is actually worth it.… #theregister #IT https://go.theregister.com/feed/www.theregister.com/2024/03/18/microsoft_copilot_moneymaker/


@cocoaphony They're not expensive "per query". If you run a cloud service doing millions of them for free, then maybe fix your business model.

This is easy to check: run an LLM locally on your GPU and compare the energy cost with, for example, playing a game on the same GPU.
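The comparison can be sketched with rough numbers. Everything below (GPU wattage, seconds of GPU time per query, electricity price) is an illustrative assumption, not a measurement; plug in your own figures.

```python
# Back-of-envelope sketch of the argument above: electricity cost of one
# local LLM query vs an hour of gaming on the same GPU.
# All constants are illustrative assumptions, not measurements.

GPU_POWER_W = 350       # assumed full-load draw of a consumer GPU, in watts
PRICE_PER_KWH = 0.15    # assumed electricity price, USD per kWh

def energy_cost_usd(seconds: float, power_w: float = GPU_POWER_W) -> float:
    """Electricity cost of running the GPU at full load for `seconds`."""
    kwh = power_w * seconds / 3600 / 1000  # watt-seconds -> kWh
    return kwh * PRICE_PER_KWH

query_cost = energy_cost_usd(5)       # assume ~5 s of GPU time per query
gaming_cost = energy_cost_usd(3600)   # one hour of gaming at full load

print(f"one LLM query:      ${query_cost:.5f}")
print(f"one hour of gaming: ${gaming_cost:.4f}")
```

Under these assumptions a single query costs a small fraction of a cent, while an hour of gaming on the same card costs a few cents; the point is only that local per-query energy cost is tiny, not that these exact figures hold for any particular model or GPU.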