Every time I mess around with sysadmin stuff, I'm flummoxed by dumb things like "what is the difference between /usr/bin and /usr/local/bin, what the heck is an LD_LIBRARY_PATH, should I use sudo for this build tool or not," etc. I guess this is how backend devs feel when they have to tweak a Webpack config.

Maybe this is why I'm a little skeptical of the whole "move everything to Rust/Zig/Go/etc" movement in the JS ecosystem. I like JavaScript, I understand JavaScript. If I have to debug some JS tool, I'm well-equipped. Whereas if I have to dip down into some weird error like "libfoo.so.42: cannot open shared object file" then I know I'm gonna get lost.

Plus I don't think we've come close to exhausting all the ways to optimize JS deps: https://marvinh.dev/blog/speeding-up-javascript-ecosystem/

Speeding up the JavaScript ecosystem - one library at a time

Most popular libraries can be sped up by avoiding unnecessary type conversions or by avoiding creating functions inside functions.
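As a sketch of the two patterns that summary mentions (the function names and numbers here are made up for illustration, not taken from the article): hoisting a comparator out of a hot function so a fresh closure isn't allocated on every call, and guarding a type conversion so it only runs when actually needed.

```javascript
// Pattern 1: avoid creating functions inside functions.
// Slower shape: a new comparator closure is allocated on every call.
function sortScoresInline(scores) {
  return scores.slice().sort((a, b) => a - b);
}

// Faster shape: the comparator is created once, at module scope, and reused.
const byValue = (a, b) => a - b;
function sortScoresHoisted(scores) {
  return scores.slice().sort(byValue);
}

// Pattern 2: avoid unnecessary type conversions.
// Coerce to a string only when the input isn't already one,
// instead of unconditionally converting on every call.
function toKey(value) {
  return typeof value === "string" ? value : String(value);
}
```

Both sort variants produce the same result; the hoisted one just gives the engine less garbage to allocate and collect in a hot path, which is the kind of micro-win the article chases across popular libraries.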

@nolan
> Plus I don't think we've come close to exhausting *all* the ways to optimize JS deps (emphasis mine)

While that is surely true, at some point you are working to make the horse beat the supercar, and at what cost? You could settle for _just_ a car and avoid the time spent beating a dead horse.

@stub Sure, I agree that optimal Rust is faster than optimal JS. And idiomatic Rust is faster than idiomatic JS. But the question is: how much faster, and at what cost? It seems to me that a lot of people are jumping from "idiomatic JS" to "optimal Rust" without really trying out "optimal JS" first.

In your analogy, I'd say to try putting some horseshoes on the horse before springing for a sportscar. 🙂