I've seen attempts to quantify the relative energy usage of programming languages, and the general consensus tends to be "Rust great, Java is pretty good, Ruby/Python are the worst polluters." I totally buy that, but I'm curious whether anybody has replicated the results with more modern versions, e.g. #ruby with #yjit ?
My thinking is that the compilation cost amortizes over the lifetime of a given piece of code. So for a service that will run millions of times, the relatively expensive compilation of #rust pays off, but for a one-off script there wouldn't be as much (environmental) benefit. But do we have benchmarks on #jit usage so we can estimate where the inflection point between the two might be?
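The amortization argument can be sketched as a back-of-envelope model: total energy for the compiled path is compile cost plus per-run cost times number of runs, versus per-run cost alone for the interpreted path. Everything here is a made-up placeholder (the function name and all the joule figures are hypothetical, not measured) — real numbers would have to come from benchmarks like the ones I'm asking about.

```ruby
# Hypothetical break-even model. All values are invented placeholders,
# not measurements.
def break_even_runs(compile_cost_j, compiled_run_j, interpreted_run_j)
  # Compiled total:    compile_cost_j + n * compiled_run_j
  # Interpreted total:                  n * interpreted_run_j
  # Break-even n is where the two totals are equal.
  (compile_cost_j / (interpreted_run_j - compiled_run_j)).ceil
end

# Pretend compilation costs 500 J, a compiled run costs 0.1 J,
# and an interpreted run costs 5 J:
puts break_even_runs(500.0, 0.1, 5.0)  # => 103
```

Under those (entirely fabricated) numbers, compilation only pays off after ~100 runs — which is exactly the kind of inflection point I'd love to see estimated from real JIT benchmarks.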
At large enough scales it's intuitive that it pays off to use something compiled. But most compute workloads are much more modest and certainly aren't "web scale".
I also have not seen development time included in most sources I've found, which again makes sense for software that runs a lot or for a long time. But if I'm doing a one-off job, using a language that prioritizes developer velocity (over performance) could significantly reduce the amount of time I need to have a machine powered on.