Interesting. 4K is mostly a lie.

https://www.youtube.com/watch?v=yN0H_WfWOp4

The Biggest Mistake in the History of Hollywood

@Gargron without looking, is this about compression? Because yeah. JPEG lossy compression works on images as well as words (LLMs)

@codinghorror @Gargron

Well, yes and no. The full argument was that:

- most digital film cameras record at less than 4k (2.4k was cited as a median).
- productions that shoot on film get digitised at roughly that same ~2.4k before cutting, and that's the resolution at which effects are added, so it becomes the de facto maximum resolution.
=> so what they sell as 4K is often only 2.4k, with stretched pixels.

Further: to stream those stretched extra pixels, they tend to over-compress the colour profiles.
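
A quick back-of-the-envelope sketch in Python of the stretched-pixels point (the 2.4k master size below is my assumption; real DI pipelines vary by camera and aspect ratio):

```python
# Rough sketch of how much "real" detail a ~2.4k master contributes to a 4K frame.
# The 2400x1350 master size is an assumption (a 16:9 frame at ~2.4k).
master_w, master_h = 2400, 1350   # assumed ~2.4k digital intermediate
uhd_w, uhd_h = 3840, 2160         # the "4K" frame actually delivered

scale = uhd_w / master_w
real_share = (master_w * master_h) / (uhd_w * uhd_h)

print(f"Each master pixel gets stretched ~{scale:.1f}x to fill the 4K frame")
print(f"Only ~{real_share:.0%} of the delivered pixels carry original detail")
```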

@iju @codinghorror @Gargron yeah, streaming (as opposed to 4k Blu-ray) has effectively made 4k TVs pointless. They mostly target 15mb/s and most of the detail is lost. Same goes for the Dolby Vision dynamic HDR profiles they use for streaming.

Stick a high bitrate Blu-ray of a decent film transfer on a well calibrated display and you’ll be blown away, though.
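
Rough numbers on how thin that 15 Mb/s budget is once it's spread over a UHD frame (the ~80 Mb/s Blu-ray figure is an assumed typical value, actual discs vary):

```python
# Rough bits-per-pixel budget for a UHD (3840x2160) stream at 24 fps.
# 15 Mb/s is the streaming target mentioned above; ~80 Mb/s is an assumed
# typical UHD Blu-ray video bitrate.
pixels_per_second = 3840 * 2160 * 24

for name, mbps in [("streaming @ 15 Mb/s", 15), ("UHD Blu-ray @ ~80 Mb/s", 80)]:
    bits_per_pixel = mbps * 1_000_000 / pixels_per_second
    print(f"{name}: ~{bits_per_pixel:.3f} bits per pixel before decoding")
```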

@dascandy @iju @codinghorror @Gargron megaBITs. it's traditional to measure bandwidth in bits per second rather than bytes
@WiteWulf @dascandy @iju @Gargron sometimes traditions are bullshit

@codinghorror @dascandy @iju @Gargron harsh 🤨😀

Coming from a telecoms background, I find it easier to compare speeds of different things if they standardise on bits/second. I really don’t like it when apps default to bytes/second, more so when they don’t respect lowercase b for bits and uppercase B for bytes, and you have to guess which value they’re showing you.
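
Just to illustrate how much the b/B distinction changes the number (a hypothetical one-liner):

```python
# Hypothetical helper showing why the lowercase-b/uppercase-B distinction matters.
def mbit_to_mbyte(mbit_per_s: float) -> float:
    """Megabits per second to megabytes per second (8 bits per byte)."""
    return mbit_per_s / 8

print(mbit_to_mbyte(15))  # a 15 Mb/s stream is only 1.875 MB/s
```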

@WiteWulf @dascandy @iju @Gargron I know, I know, insert-xkcd-cartoon-about-competing-standards-here
@WiteWulf @codinghorror @iju @Gargron I would've assumed that if you used uppercase M. Millibits and bytes seemed incorrect.
@codinghorror @WiteWulf @dascandy @iju @Gargron And sometimes there are good reasons for them. Bit streams aren’t always split into bytes, and bandwidth is a measure of flow rate, not volume. Plus there’s the telecoms history, naturally.
@holdenweb @codinghorror @dascandy @iju @Gargron and if you want to be really obtuse: how many bits are in your byte? ‘Cos not all bytes have 8 😉
@WiteWulf @codinghorror @dascandy @iju @Gargron Loved the DECSystem10 36-bit architecture, with a variable byte size.

@WiteWulf - the thing is that many people still watch plain old TV, and in Germany, that means 720p at most.
Which is scandalous, given that FullHD is now what, 20 years old?

I do notice a clear quality jump going to FullHD.

I also notice a huge quality jump going to 4K.

At 4K, I have enough; eyes can't resolve anything finer, even at monitor distance. And even if the footage is older, grainy, whatever: at least too-low resolution isn't the issue. 4K should just be standard. Period.

Then we can concentrate on other things like surround sound.
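
For what it's worth, a rough sanity check of the "eyes can't resolve finer" claim, assuming 20/20 acuity (~1 arcminute per pixel) and a 27-inch 16:9 monitor (both figures are my assumptions):

```python
import math

# Distance beyond which a single 4K pixel subtends less than 1 arcminute,
# i.e. sits at or below typical 20/20 visual acuity.
diagonal_m = 27 * 0.0254                        # 27" diagonal in metres
width_m = diagonal_m * 16 / math.hypot(16, 9)   # screen width
pixel_pitch_m = width_m / 3840                  # 4K horizontal pixel pitch
one_arcminute = math.radians(1 / 60)

distance_m = pixel_pitch_m / one_arcminute
print(f"Past ~{distance_m:.2f} m, a 4K pixel on a 27-inch screen drops below 1 arcminute")
```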

@axel_hartmann I’m surprised that broadcast TV is still only 720p in Germany. And yes, on larger screens, the jump from 720p to 1080p is very obvious.

The jump to 4k typically isn’t as noticeable to most viewers (at least at streaming bandwidths), but testing shows that viewers respond far more favourably to HDR. 1080p/HDR tests far better than 4k/SDR with many viewers.
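
A tiny illustration of one part of why HDR lands harder than extra pixels (assuming 8-bit SDR vs 10-bit HDR, e.g. HDR10; peak brightness and wider gamut matter at least as much):

```python
# More tonal levels per channel is one piece of the HDR advantage.
for name, bits in [("SDR, 8-bit", 8), ("HDR, 10-bit", 10)]:
    levels = 2 ** bits
    print(f"{name}: {levels} levels per channel, {levels ** 3:,} possible colours")
```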

@WiteWulf I’ve seen remuxes of 1080p and 4k videos, and I honestly cannot tell them apart sitting at an appropriate distance from my 75 inch TV.