I find myself bouncing between Twitter and Mastodon. Not sure one is better than the other.
I will say most of my conversations over here are far more engaging though. Maybe too many bots on Twitter?
@PhotonEmpress Oh don't worry, Paradox is doing tons of A/B testing before deploying at Anthrocon. YT engineers have contacted him about supplying LUTs and how they transform an HDR source.
Once again, "ctcwired" wherever you can find him.
@JackRacc I could be mistaken, but I think YouTube is H.265 10-bit HDR only for now.
But part 2 here is that for highly available webcasts we use FPGA-based encoders. So AV1 is realistically out.
But AV1 isn't part of MPEG so far as I know, so I expect it to die like VP7/8/9, VC1, et al. Same story, different song. They never learn.
@PhotonEmpress Don't be so sure about AV1 dying. It's making it into Nvidia Jetson modules as part of the encoding ASIC. It's part of every Ada Lovelace GPU.
It's up to someone to push YouTube to trial AV1 real-time encode direct to HLS chunks using an NVIDIA GPU at the source. I'm certain a bunch of Ada Lovelace Quadros could do the job. NVIDIA's ASIC portion of the GPU is tuned as well as FPGA solutions.
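(For the curious, a pipeline like that could in principle be sketched with ffmpeg's hardware AV1 encoder. This is a minimal sketch, not a production config: it assumes an Ada Lovelace GPU and an ffmpeg build that includes the `av1_nvenc` encoder, and the input URL is a placeholder.)

```shell
# Hypothetical sketch: live AV1 hardware encode on the NVENC ASIC,
# segmented directly into HLS chunks at the source.
# Assumes: Ada Lovelace GPU, ffmpeg built with av1_nvenc support.
# The RTMP input URL is a placeholder, not a real endpoint.
ffmpeg -i rtmp://source.example/live \
  -c:v av1_nvenc -preset p4 -b:v 8M \
  -c:a aac -b:a 128k \
  -f hls -hls_time 4 -hls_segment_type fmp4 \
  out.m3u8
```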
@JackRacc same story, different codec. Just like we still have VC1 and VP9 today I don't expect AV1 will die. But it won't be the standard either.
It's legit the same lesson people seem to need to re-learn over and over again.
@PhotonEmpress Netflix was a huge proponent of it, but now they're getting desperate for funding. Netflix may just go the way of HD DVD soon.
My take, though, is that AV1's decode/encode complexity is favorable vs VVC, so it will get into low-power ASIC chips sooner than VVC will. (Heck, Nvidia Jetson can go super low in terms of power budget.)