Looking at the Twitch Enhanced Broadcasting rollout (you encode up to five streams on your local system & send them all to Twitch, which then forwards the requested single stream to viewers rather than transcoding at the server), I wonder if we're eventually going to get multi-resolution video encodes: the start of each block decodes to a low-res output, but extra data adds more spatial detail, so rather than multiple encodes there is one with a variable payload size.
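A toy sketch of that idea (purely illustrative, not any real codec's bitstream): each chunk carries a base layer that always decodes, plus optional enhancement layers a receiver can take or leave depending on its budget.

```python
# Toy sketch of a "progressive" chunk: a base layer plus optional
# enhancement layers. A receiver decodes as many layers as fit its
# byte budget -- more budget, more detail, but still one single encode.
from dataclasses import dataclass, field

@dataclass
class Chunk:
    base: bytes                                       # always enough for a low-res picture
    enhancements: list = field(default_factory=list)  # extra spatial detail, in order

def decode(chunk: Chunk, budget: int) -> bytes:
    """Take the base layer, then append enhancement layers until the
    budget is exhausted. Layers are only useful in order, so stop at
    the first one that doesn't fit."""
    out = bytearray(chunk.base)
    for layer in chunk.enhancements:
        if len(out) + len(layer) > budget:
            break
        out.extend(layer)
    return bytes(out)

chunk = Chunk(base=b"LOWRES", enhancements=[b"+HD", b"+4K"])
print(decode(chunk, budget=6))    # low-bandwidth client gets the base only
print(decode(chunk, budget=100))  # high-bandwidth client gets everything
```

The point is that the low-quality output is a strict prefix of the high-quality one, so nobody has to re-encode anything to serve different clients.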
@shivoa there is some experimental stuff, usually called progressive video IIRC?

But I think it's usually better to negotiate the bitrate, because digital data transmission is mostly all-or-nothing.

FPV drones use PAL/NTSC for this reason: even if there are signal integrity issues, you still get some picture.
@ignaloidas Ye, probably true.
I think this may be niche, but as services get less willing to pay for transcoding, we may see more and more consumer upstreams burdened by having to stream all bitrate options at once. There are efficiencies to find in doing that without redundancy, and without needing heavy server-side transcoding of the high-quality stream.
@shivoa Yeah, but I'm fairly certain that on the compute side, simultaneous encoding already has redundancies that can be exploited to decrease load. What you'd gain most from progressive video is bandwidth for the sender, but I don't think that's that big of a problem.
@ignaloidas I'm considering people on home internet connections with very asymmetric allowances. Many people on e.g. VDSL have very fast downloads but poor upload potential, so uploading five different stream qualities is likely to limit their top streaming quality (vs. just uploading a single highest-quality stream).
@ignaloidas @shivoa I don't think it's necessarily all-or-nothing. Sure, on the Web we don't have loss-tolerant protocols with embedded error correction (but isn't that related to the use of loss as a congestion indicator in TCP?), but when the medium is fundamentally very noisy there are various tradeoffs to try before the ultimate fallback of blasting as much of the uncompressed data as fits.
@amonakov @shivoa right, but even the more advanced protocols with the ability to specify tolerable loss, like Media over QUIC, essentially fall back to a bandwidth negotiation. You could serve progressive video over it, of course, but it wouldn't be that much better than what you get with bandwidth negotiation.

@ignaloidas @amonakov @shivoa
ok but even with bandwidth negotiation, wouldn't it be easier for the server to discard some chunks of the stream and send only the low-quality part to low-bw clients, and all of the chunks to high-bw clients?

Then the server doesn't have to transcode, and the total bandwidth from the source to the ingest server is max(qualities) instead of sum(qualities)
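Putting some toy numbers on that max-vs-sum difference (the bitrates below are made-up illustrative renditions, and this ignores the overhead layered coding adds to the top stream):

```python
# Toy comparison: uploading every rendition separately (simulcast, as in
# Twitch Enhanced Broadcasting) vs. one layered stream whose top quality
# contains all the lower ones as prefixes.
renditions_kbps = {"360p": 1_000, "720p": 3_000, "1080p": 6_000}

simulcast_upload = sum(renditions_kbps.values())  # sender uploads each quality
layered_upload = max(renditions_kbps.values())    # one stream covers all of them

print(f"simulcast: {simulcast_upload} kbps")  # 10000 kbps
print(f"layered:   {layered_upload} kbps")    # 6000 kbps
```

The server then just drops enhancement chunks per client instead of transcoding, and the sender's upload is bounded by the best quality rather than the total.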