Interesting. 4K is mostly a lie.

https://www.youtube.com/watch?v=yN0H_WfWOp4

The Biggest Mistake in the History of Hollywood

@Gargron without looking, is this about compression? Because yeah. Lossy compression works on images (JPEG) as well as words (LLMs)

@codinghorror @Gargron

Well, yes and no. The full argument was that:

- most digital film cameras record at less than 4K (2.4K was cited as the median).
- productions that shoot on film get digitised before editing at that same ~2.4K, and that's the resolution at which effects are added, so it forms the de facto maximum resolution.
=> so what they sell as 4K is often only 2.4K, with stretched (upscaled) pixels.

Further: to stream those stretched extra pixels, they tend to over-compress the colour profiles.
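A rough back-of-the-envelope on the upscaling claim. The exact ~2.4K frame size varies by production; the figure below is a hypothetical 'scope-aspect master, not from the video:

```python
# Pixel-count comparison: a ~2.4K digital intermediate vs a UHD "4K" delivery.
master_w, master_h = 2400, 1012   # hypothetical ~2.4K scope master (illustrative)
uhd_w, uhd_h = 3840, 2160         # UHD ("4K") television frame

master_px = master_w * master_h
uhd_px = uhd_w * uhd_h
print(f"master: {master_px:,} px, UHD frame: {uhd_px:,} px")
print(f"upscale factor: {uhd_px / master_px:.1f}x")  # ~3.4x: most delivered pixels are interpolated
```

So under these assumptions roughly two out of three pixels on a "4K" stream never existed in the finished master.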

@iju @codinghorror @Gargron yeah, streaming (as opposed to 4k Blu-ray) has effectively made 4k TVs pointless. They mostly target 15mb/s and most of the detail is lost. Same goes for the Dolby Vision dynamic HDR profiles they use for streaming.

Stick a high bitrate Blu-ray of a decent film transfer on a well calibrated display and you’ll be blown away, though.

@dascandy @iju @codinghorror @Gargron megaBITs. it's traditional to measure bandwidth in bits per second rather than bytes
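The bits-vs-bytes distinction is about an 8x factor, which is why it matters. A quick sketch; the 15 figure comes from the post above, while the UHD Blu-ray bitrate is an illustrative assumption (disc video bitrates commonly run far higher than streaming targets):

```python
# Streaming bitrates are quoted in megaBITS per second, not megabytes.
stream_mbit = 15        # typical streaming target, per the post above
bluray_mbit = 100       # illustrative UHD Blu-ray video bitrate (assumption)

BITS_PER_BYTE = 8       # on common architectures, anyway

print(f"stream: {stream_mbit / BITS_PER_BYTE} MB/s")    # 1.875 MB/s
print(f"Blu-ray: {bluray_mbit / BITS_PER_BYTE} MB/s")   # 12.5 MB/s
print(f"ratio: {bluray_mbit / stream_mbit:.1f}x")       # 6.7x more data per second
```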
@WiteWulf @dascandy @iju @Gargron sometimes traditions are bullshit
@codinghorror @WiteWulf @dascandy @iju @Gargron And sometimes there are good reasons for them. Bit streams aren’t always split into bytes; and bandwidth is a measure of flow rate not volume. Also telecom’s history, naturally.
@holdenweb @codinghorror @dascandy @iju @Gargron and if you want to be really obtuse: how many bits are in your byte? ‘Cos not all bytes have 8 😉
@WiteWulf @codinghorror @dascandy @iju @Gargron Loved the DECSystem10 36-bit architecture, with a variable byte size.