Not sure Tim Berners-Lee’s vision was to have 148 requests transfer 5.3 MB of assets to deliver 15 KB of text

#pollution

…putting the ‘ass’ into assets

…thanks for liking/boosting the thing at the top. Muting for now 💚

https://mastodon.social/@urlyman/112767616997017525

@urlyman Approximately a 1:352 signal-to-crud ratio. Impressive.
@rvedotrc in this case, they’re made-up numbers to express an experience [https://mastodon.social/@urlyman/112965031272068593], but in the general vicinity of https://tonsky.me/blog/js-bloat/
JavaScript Bloat in 2024

What is the average size of JavaScript code downloaded per website? Fuck around and find out!

tonsky.me
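The kind of audit tonsky's post describes can be sketched offline. A minimal Python example that tallies the external requests a page would trigger; the sample HTML below is invented for illustration (real audits use browser devtools, which also capture sizes and dynamically-loaded assets):

```python
from html.parser import HTMLParser

# Count the external resources a page references -- each one is
# (at least) one extra HTTP request on top of the HTML itself.
class AssetCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.assets.append(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.assets.append(attrs["src"])

# Invented sample page, not a real site.
page = """
<html><head>
<link rel="stylesheet" href="/app.css">
<script src="/bundle.js"></script>
<script src="/analytics.js"></script>
</head><body><img src="/hero.jpg"><p>15 KB of text</p></body></html>
"""

counter = AssetCounter()
counter.feed(page)
print(len(counter.assets))  # 4 requests beyond the HTML itself
```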
@urlyman and exfiltrate gods-only-know what kinds of data to its 878 partners

@urlyman Indeed.

Software and comms tech has a tendency to become more bloated to use the available bandwidth. It reminds me of the way new roads attract more traffic rather than easing congestion. Why can't we value elegance more? Depressing.

@macronencer it’s what happens continually in an economic system that does not care about what it externalises.

But it will care. Eventually.
https://mastodon.social/@urlyman/112964851577705411

@macronencer @urlyman “Wait, it has even more horsepower? Can’t stable them ponies!”
I understood it better when I arrived in Bosnia-Herzegovina, and it cost me €60 to display the address of my Airbnb 😱 (at €7.50/MB)
@urlyman and involve 247 "partners" with "legitimate interest"
@urlyman
And probably could also not imagine that the Internet would be hijacked by morons like Musk, Zuckerberg and the "we are the good"-gang.

@urlyman I got my career start in the Web 1.0 server-side Java world, but had the opportunity to do some work with a couple of former Datapoint guys who invented ARCNET. We had some great conversations about how expectations about available resources, and approaches to optimization, have evolved over time. We all agreed that no web page should ever have more code embedded in it than the original UNIX source code base.

But here we are...

@urlyman your website is not truly modern and _responsive_ if it uses less than 50MB of JavaScript and assets 
@kaia @urlyman Meanwhile I guess postmodern ones can allow themselves to be purposefully efficient
@kaia @urlyman if your website doesn't get my gaming PC's fans spinning, are you even really playing
@urlyman aaaaaaaaaaarrrrrgggghhhhhh(screaming into the void and bigtech's ear)
@john let’s start a choir of dissonance

@urlyman
*148 requests carried over UDP 🤦

To me this is one of the most bizarre parts!

@m0xee can you expand? I’m not well-versed on networking protocols (I’m more of a front-end bloke)
@urlyman
HTTP/3 is UDP-based, a sharp departure from TCP-based HTTP/1.1.
Not without a few advantages, mainly parallelism: it's a further development of HTTP/2, which was still TCP-based but multiplexed.
Yet dealing with datagrams at the application level to work with streams of data seems a little controversial to me. Last but not least: it mostly makes sense when, in addition to 15 kB of main text content, you have to carry a large number of scripts, stylesheets, auxiliary data fragments, etc.
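The "datagrams at the application level" point can be seen in a tiny loopback sketch (not QUIC itself): UDP preserves message boundaries, so any stream semantics have to be rebuilt on top, which is what QUIC does underneath HTTP/3.

```python
import socket

# Two UDP sends stay two distinct reads -- the kernel gives you
# datagrams, not a byte stream, so ordering/reassembly is the
# application's problem (QUIC's job under HTTP/3).
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"frame-1", addr)
sender.sendto(b"frame-2", addr)

first, _ = receiver.recvfrom(2048)   # one read per datagram
second, _ = receiver.recvfrom(2048)
print(first, second)

sender.close()
receiver.close()
```

On loopback this is reliable; over a real network, loss and reordering are exactly what QUIC has to handle itself.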
@m0xee @urlyman does HTTP/3 have back off logic built in? A big part of TCP is making sure that normal clients don’t cause a storm when packets are being dropped
@unsaturated
TBH, I'm not knowledgeable enough myself about how QUIC, which HTTP/3 is based on, handles that. To me the idea of introducing congestion control at the application level (and the implied added complexity of client implementation) seems bad enough, but you might be right, there might be even more to it than that.
@urlyman
@unsaturated
Maybe network engineers who had to deal with it in practice can shed some light. My "solution" is disabling it in clients where it is optional (it is in Firefox) and building software such as curl without support for it 😅
@urlyman
@m0xee @urlyman it looks like QUIC and HTTP/3 have congestion-control RFCs. I totally agree with you wrt doing this at the app layer.
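For what it's worth, the basic shape of that congestion control is the familiar AIMD loop (additive increase on acks, multiplicative decrease on loss), which RFC 9002 carries over into QUIC. A toy sketch, with illustrative numbers that don't come from any real implementation:

```python
# Toy AIMD congestion window: grow linearly while acks arrive,
# halve on loss. Real QUIC/TCP stacks add slow start, recovery
# periods, pacing, etc. -- this is only the core feedback loop.
def aimd(events, cwnd=10, increase=1, decrease_factor=0.5, floor=1):
    """events: iterable of 'ack' or 'loss'; returns cwnd after each event."""
    history = []
    for ev in events:
        if ev == "ack":
            cwnd += increase                       # probe for bandwidth
        elif ev == "loss":
            cwnd = max(floor, cwnd * decrease_factor)  # back off hard
        history.append(cwnd)
    return history

print(aimd(["ack", "ack", "loss", "ack"]))  # [11, 12, 6.0, 7.0]
```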

@urlyman I feel this constantly. News sites where all I want is the text are covered in auto playing videos, irrelevant photos, structural elements that limit viewability, and of course ads.

I just want the text.

@urlyman there's me, earlier in the year, writing the page for my weather station so it shows the current status graphically with real-time live updates & got it all to be about 64k in total.

At some point I'll try to cut that by a few more k, as there's some white space that's not needed.

The development page: https://weather.area51.dev/dash/home

@urlyman When #webperf is not in consideration for sites, search engines should rank them even worse.
@midzer @urlyman which is what https://clew.se/ does, as I've learned today. 🙂
Only a vanishingly small number of sites are indexed, but it's a great discovery machine.
Clew

Clew is a web search engine trying to be different from the rest. We are the magic ball of string leading through the internet's labyrinth.

@mforester Thanks for sharing. Interesting project!

@urlyman A program I wrote for debugging web servers lists headers including ones from HTTP redirects. Here's a partial list for https: // www . wsj . com:
---------
Status: 403 Forbidden
Content-Type: text/html; charset=UTF-8
Content-Length: 557155
--------
... and the content length just includes the HTML, not anything loaded in a separate transaction.

Over 1/2 MB for an error response? Really?
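An offline sketch of what such a header-listing tool does with the raw response. The bytes below are made up to mirror the WSJ example above, not a real capture:

```python
# Parse the status line and headers out of a raw HTTP/1.1 response.
# Sample response invented to match the numbers quoted in the thread.
raw = (
    "HTTP/1.1 403 Forbidden\r\n"
    "Content-Type: text/html; charset=UTF-8\r\n"
    "Content-Length: 557155\r\n"
    "\r\n"
    "<html>...</html>"
)

def parse_response(response_text):
    head, _, body = response_text.partition("\r\n\r\n")
    status_line, *header_lines = head.split("\r\n")
    headers = dict(line.split(": ", 1) for line in header_lines)
    return status_line, headers, body

status, headers, _ = parse_response(raw)
print(status)
print(int(headers["Content-Length"]) / 1024, "KiB for an error page")
```

557,155 bytes is about 544 KiB, and as noted above that's just the HTML, before any follow-on requests.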

@urlyman "If present trends continue, there is the real chance that articles warning about page bloat could exceed 5 megabytes in size by 2020" https://idlewords.com/talks/website_obesity.htm
The Website Obesity Crisis

@urlyman I'm currently working on an internal application that uses graphql to transfer data to a react front-end. In general, our queries are >1.5x larger than the responses. We could optimize them, if we had time and they wouldn't change all the time. Maybe, one day...
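A hypothetical illustration of that query-vs-response imbalance; the GraphQL query and JSON response below are invented, not from the actual application:

```python
import json

# A verbose GraphQL query can outweigh the JSON it fetches,
# since field names are spelled out in both directions.
query = """
query DashboardUser($id: ID!) {
  user(id: $id) {
    id
    displayName
    team { id name }
  }
}
"""

response = json.dumps(
    {"data": {"user": {"id": "7", "displayName": "Ada",
                       "team": {"id": "3", "name": "Web"}}}}
)

ratio = len(query.encode()) / len(response.encode())
print(f"query/response size ratio: {ratio:.2f}")
```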
@urlyman I'm pretty sure it wasn't, and I think that the #Enshittification of the #Web with #Trackers, #Ads, #Autoplay bs and the like should've been outlawed ages ago...

@urlyman

...and disable basic browser navigation with half-assed javascript intercepting the user's clicks, then blame the end user for being backwards when the end user complains about this.

@urlyman @jwildeboer while in the mean time consuming an inordinate amount of resources on your local computer…