LLM access is relatively cheap now because the LLM vendors are selling it at a massive loss, subsidized by VC money, in order to get you addicted and to drive as much skilled human labor as possible out of the workforce permanently.

The goal is monopolization, and if they’re successful, you’ll see monopolistic pricing in the future.

@lapcatsoftware But will it work? There seem to be too many labs making models of similar quality, and often it's effortless to switch between them.
@williamoconnell @lapcatsoftware
so what?
they all make a loss, and eventually need to make money
@Doomed_Daniel Generally they aren't making a loss on inference currently; they're just spending so much more on training new models that they run a loss overall. But also, some of the competition comes from open source models, where the cost of inference is just the cost of compute, which goes down over time.