This really drives home something about LLM systems. They’re very expensive, both to train and to run per query, and hard to make cheaper. I expect them to get more expensive to run. They’re currently sold at a big loss to establish monopoly power, with prices to be raised dramatically later. That’s the *stated* business plan.

If you’re building your business to rely on LLMs, you need to factor in what you’ll do when the providers pivot to making money, or pull back because they can’t.
https://geeknews.chat/@theregister/112116266764229145

The Register (@[email protected])

Microsoft promises Copilot will be a 'moneymaker' in the long term

Exec tells investors to 'temper' expectations as mission to convince customers of price tag continues

Microsoft is asking investors to "temper" expectations for quick financial returns from Copilot amid efforts to convince customers that paying "substantial" sums each month is actually worth it.…

#theregister #IT https://go.theregister.com/feed/www.theregister.com/2024/03/18/microsoft_copilot_moneymaker/


@cocoaphony @grumpygamer @theregister is this accurate about running them?

there are a ton of LLM models in the GPT-3 to 3.5 range that can be run locally (even on mobile devices), and a lot of companies are investing money into specialized chips to make running them more performant.

i honestly don’t understand the expensive subscriptions but i imagine it’s gotta be about the cost of training

@cocoaphony @grumpygamer @theregister hmmmmmmm seems like the cutting edge models are pretty expensive to run at scale, which makes sense. I agree it seems kind of wild to build a business running on top of licensable ai — it's already too expensive.
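For a sense of the economics the thread is debating, here is a back-of-envelope sketch. Every number in it is an illustrative assumption (GPU rental price, generation throughput, reply length, usage rate), not a vendor figure:

```python
# Back-of-envelope LLM serving cost. All constants below are
# assumptions for illustration, not published vendor numbers.

GPU_COST_PER_HOUR = 2.50   # assumed cloud rental for one high-end GPU, USD
TOKENS_PER_SECOND = 50     # assumed generation throughput for a large model
TOKENS_PER_REPLY = 500     # assumed average reply length in tokens

def cost_per_reply(gpu_cost_per_hour: float,
                   tokens_per_second: float,
                   tokens_per_reply: float) -> float:
    """Raw GPU-time cost in USD to generate one reply."""
    seconds_per_reply = tokens_per_reply / tokens_per_second
    return gpu_cost_per_hour / 3600 * seconds_per_reply

per_reply = cost_per_reply(GPU_COST_PER_HOUR, TOKENS_PER_SECOND, TOKENS_PER_REPLY)
# A heavy user: 100 queries a day for a 30-day month.
monthly = per_reply * 100 * 30
print(f"~${per_reply:.4f} per reply, ~${monthly:.2f}/month per heavy user")
```

With these assumed numbers, a heavy user burns roughly a $20/month subscription in raw GPU time alone, before training costs, idle capacity, or any margin, which is consistent with the thread's point that cutting-edge models are expensive to serve at scale.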