OpenAI kills its Sora video generator.

Which isn't that surprising TBH:
- it is massively expensive to run
- the results were not that usable
- it was a lawsuit generator

So, in order to keep the numbers looking better until their IPO at the end of this year, they are starting to cut shit. Just an indicator that things are not looking good cashflow-wise.

@tante also speaking of the hype hopefully dying out, there are enough free alternatives to get simple jobs done for casual users (e.g. summaries or translation). I stopped using #Mistral when I realized it was in fact running on US infrastructure, and I'm now using #xPrivo, which is 100% EU-based and open source (there's even a self-hosted option). Canceled my #Claude subscription a month ago.

@oatmeal @tante I guess people who like that kind of thing are using alternatives.

Well, the people at my church are: every month they present a five-minute AI-generated parody of some TV show featuring people from the church and that month's events. Judging from the video style, it looks like they are using Google's tool.

@qgustavor @oatmeal @tante
"free" alternatives

With Sora gone, there is little reason for Google to keep their own AI video generator money pit running. Those unrealistically low tiers will soon vanish.

The same applies to code generation. Even the $200/month plans are still only being sustained by continuing investor funding rounds.
https://youtu.be/oAbpVCn-Ox0
"AI Bubble: Nobody will pay for unsubsidised AI" | Ed Zitron (YouTube)

@bornach @oatmeal @tante To be fair, I guess that when the bubble bursts, cloud AI computing will vanish and training will be a lot rarer, but the existing free models will remain.

The thing is: there are some fields that AI managed to work its way into, and people seem to be fine with it. Example: coding, because before AI people were stealing each other's work from Stack Overflow uncredited anyway. Well, artists also did the same, from the No Game No Life drawings (which had a TERRIBLE backlash) to the tons and tons of unlicensed drawings around the world (like all the badly drawn Homer Simpson drawings in a third of the bars in my city).

@qgustavor indeed useful, when it works. My own experience, and what I've read, confirms this: models reproduce common practices, not the best or right ones for the job at hand. That's how you end up with a pile of boilerplate code, duplicated functions, and reimplemented libraries. The problem is that Stack Overflow and other sources of shared knowledge will disappear. Maybe it won't matter in the future, I don't know, but right now it's a clear case of sawing off the branch one is sitting on.

@oatmeal I guess Stack Overflow may disappear, though not due to AI but because of mismanagement (like them trying to make it look like Reddit). I don't think other sources of knowledge will disappear; rather, if AI training becomes harder and more expensive, using them for training will become less and less desirable.

But, sure: many coding problems are repeated (I remember searching Stack Overflow only to find one of my own past answers), and for those AI looks useful, which is a reason for people to keep using it after the AI bubble pops. Still, not all problems are common and repeated.

And you pointed out something really important: I lost count of how many times AI code (or coworker code written with AI) had issues like repeated code and bad practices. Example: wildcard CORS. SO COMMON!
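To make the wildcard CORS example concrete, here's a minimal sketch (the names `ALLOWED_ORIGINS` and `cors_header` are illustrative, not from any specific framework): AI-generated code often just sets `Access-Control-Allow-Origin: *`, which lets any website read the response, whereas the safer pattern is to echo back only origins you actually trust.

```python
# Illustrative sketch: wildcard CORS vs. an origin allowlist.
# The wildcard approach would be: {"Access-Control-Allow-Origin": "*"}
# which allows ANY site to read your API's responses cross-origin.

# Hypothetical allowlist of origins you actually trust.
ALLOWED_ORIGINS = {"https://app.example.com", "https://admin.example.com"}

def cors_header(request_origin: str) -> dict:
    """Return the CORS response header for a request's Origin header.

    Only origins on the allowlist get Access-Control-Allow-Origin;
    everything else gets no header, so browsers block the read.
    """
    if request_origin in ALLOWED_ORIGINS:
        return {"Access-Control-Allow-Origin": request_origin}
    return {}

print(cors_header("https://app.example.com"))
print(cors_header("https://evil.example"))
```

Echoing the validated origin (rather than `*`) also keeps credentialed requests working, since browsers reject `Access-Control-Allow-Origin: *` combined with credentials.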