Interesting. 4K is mostly a lie.

https://www.youtube.com/watch?v=yN0H_WfWOp4

The Biggest Mistake in the History of Hollywood

@Gargron without looking, is this about compression? Because yeah. JPEG lossy compression works on images as well as words (LLMs)

@codinghorror @Gargron

Well, yes and no. The full argument was that:

- most digital film cameras record at less than 4K (2.4K was cited as a median);
- productions that shoot on film get digitised before editing at roughly that same ~2.4K, and that's the resolution at which effects are added, which becomes the de facto maximum resolution.
=> so what they sell as 4K is often only 2.4K, with stretched pixels.

Further: to stream those stretched extra pixels, they tend to over-compress the colour profiles.
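The pixel arithmetic behind that claim can be sketched quickly. This assumes an illustrative 16:9 frame at the thread's ~2.4K median (real digital-intermediate resolutions vary by production):

```python
# Rough pixel-count comparison: a ~2.4K digital intermediate vs. the
# 4K UHD container it gets upscaled into. The 2.4K figure follows the
# thread; exact DI dimensions vary, so these are illustrative.

di_w, di_h = 2400, 1350      # ~2.4K master (16:9, assumed for illustration)
uhd_w, uhd_h = 3840, 2160    # 4K UHD container

di_pixels = di_w * di_h
uhd_pixels = uhd_w * uhd_h

# Upscaling invents the missing pixels by interpolation ("stretching"):
scale = uhd_pixels / di_pixels
print(f"DI pixels:  {di_pixels:,}")
print(f"UHD pixels: {uhd_pixels:,}")
print(f"Each source pixel covers ~{scale:.2f} output pixels after upscaling")
```

So more than half the pixels in the 4K container are interpolated, not captured.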

@iju @codinghorror @Gargron yeah, streaming (as opposed to 4k Blu-ray) has effectively made 4k TVs pointless. They mostly target 15mb/s and most of the detail is lost. Same goes for the Dolby Vision dynamic HDR profiles they use for streaming.

Stick a high bitrate Blu-ray of a decent film transfer on a well calibrated display and you’ll be blown away, though.
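One way to see the gap: divide the bitrate across the pixels in each frame. The 15 Mb/s streaming figure is from the post above; the ~100 Mb/s ceiling is an approximate UHD Blu-ray video maximum, used here as an assumption:

```python
# Bits available per pixel per frame at 2160p, 24 fps.
# 15 Mb/s is the streaming target quoted in the thread; ~100 Mb/s is
# an approximate UHD Blu-ray video maximum (assumed for comparison).

def bits_per_pixel(bitrate_mbps, width=3840, height=2160, fps=24):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

for label, mbps in [("streaming", 15), ("UHD Blu-ray", 100)]:
    print(f"{label}: ~{bits_per_pixel(mbps):.3f} bits per pixel per frame")
```

At roughly 0.075 bits per pixel, the streaming encoder has to throw away most of the fine detail, which is why the disc looks so much better on a good display.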

@dascandy @iju @codinghorror @Gargron megaBITs. It's traditional to measure bandwidth in bits per second rather than bytes.
@WiteWulf @dascandy @iju @Gargron sometimes traditions are bullshit

@codinghorror @dascandy @iju @Gargron harsh 🤨😀

Coming from a telecoms background, I find it easier to compare speeds of different things if they standardise on bits/second. I really don't like it when apps default to bytes/second, more so when they don't respect lowercase b for bits and uppercase B for bytes, and you have to guess which value they're showing you.
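The conversion in question is a single factor of 8, which is exactly why an ambiguous "mb/s" is so annoying. A minimal sketch:

```python
# Mb/s (megabits per second) vs MB/s (megabytes per second).
# Lowercase b = bits, uppercase B = bytes; 1 byte = 8 bits.

def mbit_to_mbyte(mbit_per_s):
    return mbit_per_s / 8

# A "100 Mb/s" link tops out at 12.5 MB/s of actual file transfer:
print(f"{mbit_to_mbyte(100)} MB/s")
```

Misread the unit and your estimate is off by a factor of eight in either direction.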

@WiteWulf @dascandy @iju @Gargron I know, I know, insert-xkcd-cartoon-about-competing-standards-here