Netflix can now serve 100Gbit/s of video (so something like 12,500 individual 4K streams) with an appliance using 100 watts of power. That’s 8 milliwatts for each 4K stream.
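The per-stream figure above is just division. A quick sketch, using only the numbers from the post (100 Gbit/s, 100 W, and the implied ~8 Mbit/s per 4K stream):

```python
# Back-of-the-envelope figures from the post above (not measured here).
appliance_gbps = 100   # served video throughput, Gbit/s
appliance_watts = 100  # appliance power draw, W
stream_mbps = 8        # implied per-stream bitrate, Mbit/s

streams = appliance_gbps * 1000 // stream_mbps    # concurrent streams
mw_per_stream = appliance_watts / streams * 1000  # milliwatts per stream

print(streams, mw_per_stream)  # 12500 8.0
```

Note that 8 Mbit/s is on the low end for 4K; at a higher per-stream bitrate the stream count drops proportionally, but the watts-per-stream figure stays in the same ballpark.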

Remember that number the next time someone tells you that watching a Netflix show is as bad as driving an SUV or some shit.

https://people.freebsd.org/~gallatin/talks/OpenFest2023.pdf

@karppinen That's true, but the actual externalized cost of streaming is the network route between Netflix and you, not their servers.
@dalias @karppinen Just look at the router at home. It maybe consumes 11.5 W at base load, and when you download 2 GB over 1 hour it increases to 11.6 W. So if you don't turn it off, the 11.5 W is used anyway; the 2 GB caused maybe 0.1 W of extra draw, i.e. about 0.1 Wh of extra energy. I don't think it's much different for internet routers (except that they have more throughput). The big factor with Netflix is your screen: a projector causes more emissions than a TV > computer monitor > laptop > smartphone.
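The home-router arithmetic above works out to a tiny marginal energy cost per gigabyte. A sketch using the post's own numbers (the 0.1 W delta is the poster's estimate, not a measurement):

```python
# Figures from the post above (poster's estimates, not measurements).
base_w = 11.5    # router idle draw, W
loaded_w = 11.6  # draw while downloading, W
hours = 1.0      # duration of the download
gigabytes = 2.0  # data transferred in that hour

extra_wh = (loaded_w - base_w) * hours  # ~0.1 Wh of extra energy
wh_per_gb = extra_wh / gigabytes        # ~0.05 Wh per GB at this hop

print(round(extra_wh, 3), round(wh_per_gb, 3))  # 0.1 0.05
```

Even if every router on the path added a similar marginal cost, the total would stay small next to the fixed baseline draw of the equipment (and of your screen).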

@duco @dalias I run a small fiber network and can confirm that in principle there’s very little marginal energy cost to traffic. Our peaks are in the 50–80 gigabits per second range, and even though that's roughly 10x our traffic baseline, they don’t really show up on the router electricity usage graphs.

That said, wireless is different. Also, much of the capacity is there specifically for video, so it’s not right to look at the marginal cost alone.

@karppinen @duco Exactly. Streaming video is the whole reason all the capacity infrastructure is there.