Interesting. 4K is mostly a lie.

https://www.youtube.com/watch?v=yN0H_WfWOp4

The Biggest Mistake in the History of Hollywood

@Gargron without looking, is this about compression? Because yeah. JPEG lossy compression works on images as well as words (LLMs)
@Gargron also 8k is the big lie. 4k not so much. Very rapidly diminishing returns.

@codinghorror @Gargron

Well, yes and no. The full argument was that:

- most digital film cameras record at less than 4K (2.4K was cited as the median).
- productions that do use film get digitised before editing at that same ~2.4K, and that's the resolution at which effects are added, which becomes the de facto maximum resolution.
=> so what they sell as 4K is often only 2.4K, with stretched pixels.

Further: to stream those stretched extra pixels, they tend to over-compress colour profiles.

@iju @codinghorror @Gargron yeah, streaming (as opposed to 4k Blu-ray) has effectively made 4k TVs pointless. They mostly target 15mb/s and most of the detail is lost. Same goes for the Dolby Vision dynamic HDR profiles they use for streaming.

Stick a high bitrate Blu-ray of a decent film transfer on a well calibrated display and you’ll be blown away, though.

@dascandy @iju @codinghorror @Gargron megaBITS. It's traditional to measure bandwidth in bits per second rather than bytes
@WiteWulf @dascandy @iju @Gargron sometimes traditions are bullshit

@codinghorror @dascandy @iju @Gargron harsh 🤨😀

Coming from a telecoms background, I find it easier to compare speeds of different things if they standardise on bits/second. I really don’t like it when apps default to bytes/second, more so when they don’t respect lowercase b for bits and uppercase B for bytes, and you have to guess which value they’re showing you.
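The factor-of-eight ambiguity is easy to pin down in a couple of lines; a minimal Python sketch (the 8-bit byte is assumed here, which, as noted later in the thread, isn't historically universal):

```python
# Convert a link speed quoted in megabits per second (Mb/s)
# to megabytes per second (MB/s), assuming 8-bit bytes.
def mbit_to_mbyte(mbit_per_s: float) -> float:
    return mbit_per_s / 8

# A figure written as "15 mb/s" is ambiguous; read as megabits it is:
print(mbit_to_mbyte(15), "MB/s")  # 1.875 MB/s
# Misread as megabytes, the same number would imply 8x the bandwidth (120 Mb/s).
```

Sticking to bits/second, with a strict lowercase b for bits and uppercase B for bytes, avoids the guessing game entirely.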

@WiteWulf @dascandy @iju @Gargron I know, I know, insert-xkcd-cartoon-about-competing-standards-here
@WiteWulf @codinghorror @iju @Gargron I would've assumed bits if you'd used an uppercase M; "mb" reads as millibits, which seemed incorrect.
@codinghorror @WiteWulf @dascandy @iju @Gargron And sometimes there are good reasons for them. Bit streams aren’t always split into bytes; and bandwidth is a measure of flow rate not volume. Also telecom’s history, naturally.
@holdenweb @codinghorror @dascandy @iju @Gargron and if you want to be really obtuse: how many bits are in your byte? ‘Cos not all bytes have 8 😉
@WiteWulf @codinghorror @dascandy @iju @Gargron Loved the DECSystem10 36-bit architecture, with a variable byte size.

@WiteWulf - the thing is that many people still watch plain old TV, and in Germany, that means 720p at most.
Which is scandalous, given that FullHD is now what, 20 years old?

I do notice a good FullHD quality jump.

I also notice a huge quality jump going to 4K.

At 4K, I have enough. Eyes can't resolve anything higher, even at monitor distance. And even if the footage is older, grainy, whatever: at least too-low resolution isn't the problem. 4K should just be standard. Period.

Then we can concentrate on other things like surround sound.
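The claim that eyes can't resolve beyond 4K at monitor distance can be roughly sanity-checked against the common ~60 pixels-per-degree visual-acuity rule of thumb; a small Python sketch (the threshold, screen size, and viewing distance are illustrative assumptions, not figures from this thread):

```python
import math

def pixels_per_degree(h_resolution: int, screen_width_m: float, distance_m: float) -> float:
    """Horizontal pixels packed into one degree of visual angle."""
    pixels_per_m = h_resolution / screen_width_m
    # Screen width subtended by one degree of visual angle at this distance:
    metres_per_degree = 2 * distance_m * math.tan(math.radians(0.5))
    return pixels_per_m * metres_per_degree

# Hypothetical 27-inch 4K monitor (~0.6 m wide) viewed from 0.6 m:
print(round(pixels_per_degree(3840, 0.6, 0.6)))  # ~67 ppd
```

By this estimate, a 4K panel of that size sits right around the ~60 ppd acuity limit at arm's length, so extra pixels beyond 4K would mostly go unseen, which matches the poster's point.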

@axel_hartmann I’m surprised that broadcast TV is still only 720p in Germany. And yes, on larger screens, the jump from 720p to 1080p is very obvious.

The jump to 4k typically isn’t as noticeable to most viewers (at least at streaming bandwidths), but testing shows that viewers respond far more favourably to HDR. 1080p/HDR tests far better than 4k/SDR with many viewers.

@WiteWulf I’ve seen remuxes of 1080p videos and 4K and I honestly cannot tell them apart, sitting at an appropriate distance from my 75-inch TV
@Gargron the fact that a movie theater projected 35mm prints at, best case scenario, 2K and now we scoff at anything below 4K for home video never ceases to bewilder me.

@vpermar @Gargron Some films were made with 70 mm film, but not most due to cost. If you saw such a film in a theater that had (and used) a 70 mm projector, you'd have noticeably better image quality.

I think there is a different problem with digital films - the cost reduction resulted in movies that are too long with lots of scenes that just don't add very much, and there are no intermissions.

@bzdev
A massive 65 mm camera is heavy, difficult to handle, and expensive, and so is the 65 mm film stock. The biggest cost, however, is the 70 mm prints, with added audio tracks, that have to be shipped to theaters. And not many theaters have a 70 mm projector, and the required lenses are very expensive. The dynamic range of film is amazing, though, and it doesn't have the artefacts of an image sensor.
@vpermar @Gargron

@vpermar @Gargron 35mm film, depending on ISO and lighting, was supposed to be much more detailed than that.

I can attest that when I first saw digital projection (Planet of the Apes, some fancy cinema in New York, 2001), far from being the clear sharp perfect image it was promoted to be, I found it quite dull looking and low resolution compared to what I was used to.

@mossman @vpermar @Gargron

The video says 35mm is basically at about 4k, with a minority arguing it might go up to 6k.

@Gargron This guy's first mistake is expecting high quality from streaming

@Gargron

This entire video in one image.

BUY MOVIES.

@jsit @Gargron idk if it’s just me but my 1080p Blu-Ray looks nicer than the digital version I have of the same movies on YouTube/Apple.

Mainly dark scenes and areas of detailed motion. Like the water waves in Finding Dory. And like the night shots in the Superbad movie.

@kosama 100%. The color banding on dark regions over streaming is so distracting.
@jsit yup, it takes me out of the movie every time. lol
@jsit @Gargron 💯
Bitrate is key here 👍
@jsit @Gargron his second mistake is seeing color banding and thinking about resolution and not bit depth. Truly crazy to put this much effort into a video and not immediately hit on that.

@Gargron AFAIK, 4K is real and is on the original film, saved somewhere.

Using that film to remaster in 4K digital is so costly, compared to expected sales, that companies would rather just shove in AI and call it a day.

@Gargron 17:45 - 19:00 is pure misinformation. And some more towards the end...
Basically, disregard most of the information where they talk next to the whiteboard with the CIE 1931 diagram projected on it.
@petrikas @Gargron I mean I immediately have a skeptical eye when people fetishize film but I sort of expected someone making so long a video could’ve learned more in the process.
@Gargron Only because streamers crush the bit depth and use low bit rates. The resolution is there, it's these other issues that make it look crappy.

@Gargron I had got halfway through before he finally claimed something I hadn't heard before, which is that movies are still being edited in "2K" - which he says is only actually ~2000*700 - even today.

Seems hard to believe, although on the other hand I never bother to .. uhhh... "acquire" anything higher than 1080p (HEVC) since the 4K versions don't really seem to look much better but do take ten times as long to download / ten times the disk space. That would make sense if they really are just upscaled from the same source anyway.

That might also explain why I get more of a sense of detail and realism when selecting 4K on some YouTuber speaking to camera in their home studio than the movies with a million passes of filters and effects smearing everything into a muddy blur...

@Gargron I sort of figured that.

@Gargron

Interpolation and algorithms are what you call that…

@Gargron He is only talking about movies, but Netflix in particular has strict requirements for its production contractors on series: specific 4K cameras, plus CGI and editing in 4K. They want grounded reasons to upsell you to their 4K plan, so 4K series on Netflix and Prime are usually true 4K productions with a complete 4K DI pipeline.
Newer cinema movies from the past, say, six years also often have full 4K production pipelines (just don’t expect it from Disney).
@frumble
Even if the actual movie is all 4k, the stream has a limited bandwidth of ~ 25 MBit/s -> lossy compression.
@innerand Of course, but it’s H.265 HDR on streaming services vs. H.264 SDR on FHD Blu-ray, that makes comparisons crude (compression with HDR is also more efficient than SDR). 25 MBit/s 4K can look excellent with the right encoder, it’s just not the actual standard bitrate of those services.
80 MBit/s UHD-BDs do look better, of course.
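Those bitrates translate directly into file size, which also explains the earlier observation about 4K downloads taking roughly ten times the disk space; a quick sketch (the two-hour runtime is an assumed example, and audio/container overhead is ignored):

```python
def stream_size_gb(bitrate_mbit_s: float, runtime_s: float) -> float:
    """Total size in decimal gigabytes of a constant-bitrate stream (8-bit bytes)."""
    total_bits = bitrate_mbit_s * 1e6 * runtime_s
    return total_bits / 8 / 1e9

two_hours = 2 * 3600
print(stream_size_gb(25, two_hours))  # 22.5 GB at ~25 MBit/s (streaming 4K)
print(stream_size_gb(80, two_hours))  # 72.0 GB at ~80 MBit/s (UHD Blu-ray)
```

The encoder matters as much as the raw number, of course, but the gap in bits spent per frame is why the disc version holds up in dark scenes where the stream bands and smears.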
@Gargron I've generally not minded DVDs; especially for older media they look fine, because they were designed to be watched at that resolution. I think 4K looks really nice, but it ends up being annoying to work with, as you need a specialised player, so I mostly watch stuff on a 1080p display coming out of my PS3 - honestly still my gold-standard resolution.
There's me nerding out, oops. Hope someone finds this interesting

@Gargron @sundogplanets I've started watching his other stuff now, too. It's really interesting and insightful.

E.g., Elon Musk's satellites interfering with telescopes on Earth, which Prof. Sam Lawler's presentations and writings have been about.

https://youtu.be/WntZu73iwUA

Why the Biggest Radio Telescope Is Practically Useless

@Gargron let’s not get started on screen size and viewing distance 🤫
@Gargron 4K in Youtube, viewed on a 4K monitor 30 cm from your eyes, with a Premium subscription, is definitely visibly better than HD, though.

@tml @Gargron

The video addresses the fact that YouTubers use better cameras than the film industry.

(For various reasons, but mostly, I'd guess, because GoPros work OK for video essays and skating videos, but not for whatever is in theatres currently.)

https://mastodon.social/@iju/116111224937278334