I know I live on a different planet to many, but the only Python packaging problem I actually have is when I need to use two different versions of the same package at the same time. Solve that and we'll talk. #python

@carlton the thing about multi-version support that I always wondered about is... If this feature existed, it would mean no global state, right? Since global state is often kept at module level, and there would be several (maybe many) copies of each module.

"Ooh I have to configure structlog 1.2, 1.3 and better not forget 1.4" ;)

@tintvrtkovic Yeah, I'm sure it's not an equilibrium position, but namespaces are a thing, and slow-dependency's internal use of util < new-hotness blocking use of util == new-hotness continues to be a pain.

@tintvrtkovic @carlton Global (in the #Python sense of "global", i.e. module-level) state is kept in the module object in memory, but once that module object has been loaded from a file, it exists independently of the file it came from. So if you did load multiple different versions of a package, they'd all correspond to different module objects in memory, each of which could have its own global state. Heck, even with just one version of a package, you could load some module from that package a hundred times and get a hundred different module objects in memory, again each with its own individual global state.

This is kind of what the importlib.reload() function does, although IIRC it also tries to mix the original and reloaded versions of the module in a way that is supposed to make sense if you don't care about the old module. To get a true fresh and independent copy of the module object you'd have to use other importlib functions.
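The independent-module-object point above can be sketched with importlib directly. A minimal demo (the module file and its contents here are invented for illustration):

```python
# Build fresh, independent copies of the same module file with importlib.
import importlib.util
import os
import tempfile

src = "counter = 0\n\ndef bump():\n    global counter\n    counter += 1\n"

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "mod.py")
    with open(path, "w") as f:
        f.write(src)

    def fresh_copy():
        # Build a brand-new module object from the same file each time,
        # without registering it in sys.modules.
        spec = importlib.util.spec_from_file_location("mod", path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module

    a = fresh_copy()
    b = fresh_copy()
    a.bump()
    a.bump()
    b.bump()

# Each copy carries its own module-level ("global") state.
print(a.counter, b.counter)  # 2 1
```

Loading the same file a hundred times this way gives a hundred module objects, each with its own `counter`.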

@tintvrtkovic @carlton Anyway, back to the point... I think the reason #Python 's packaging tools normally won't install multiple versions of the same package in the same environment is to avoid confusion more than anything else. Like, when you import a module from the package, which version does it come from? If part of your code needs one version and another part needs a different version, how do you identify in each part of the code which version of the dependency it should import from? And how do you express version constraints on a sub-package level? (like, if source_file_a.py in your package needs dependency-X>=1.2,<2, but source_file_b.py needs dependency-X>=1.6,!=2.0.1,<3) You can write custom code to solve these problems, but I suppose the developers of the packaging standards and tools looked at how it's played out in the past and decided that it's so rarely useful and so potentially confusing that it's not worth supporting in the default setup.
@tintvrtkovic @carlton ...that was much more than I meant to write when I started out 😛 but I dunno, hopefully you find something useful in all that
@diazona @tintvrtkovic Quietly muttering “DON’T GET NERD SNIPED” to myself. 😉

@carlton Some of my Ruby and Node friends feel Python's package management will never be adequate if this problem isn't addressed.

Assuming you don't mean multiple versions of the same package available everywhere but instead one per package, I think we'd basically need a separate site-packages directory location per package for this. 😬

@treyhunner Yeah, I have no grand solution waiting up my sleeve. Greater minds than mine can solve the how. 🥳

It’s frustrating when all dependencies (and the project) have to be on the same compatible ranges. A slow-updating package blocks everyone else from updating, even if its usage of the target package is entirely internal.

🤷

@carlton @treyhunner To your last point, this is why we are still stuck on Python 3.12 for more projects than I care to admit.

Since we can't stop authors from pinning to upper boundaries, I wish we had the mechanics to at least say, "IGNORE THIS" very loudly so we can move around it.

@webology @carlton @treyhunner

For this reason:

> When evaluating requires-python ranges for dependencies, uv only considers lower bounds and ignores upper bounds entirely.

https://docs.astral.sh/uv/concepts/resolution/#universal-resolution

@adamchainz @carlton @treyhunner Sorry, I was sharing two different thoughts which was confusing.

What I meant was that we couldn't override package upper boundaries, and I should have known that uv has literally thought of everything, including this annoying situation.

See https://docs.astral.sh/uv/reference/settings/#override-dependencies

h/t via https://mastodon.social/@konstin/115133779802412410

@webology @carlton @treyhunner In uv, there's `override-dependencies` to patch (transitive) bounds https://docs.astral.sh/uv/reference/settings/#override-dependencies
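For reference, a minimal sketch of what that setting looks like in a project's pyproject.toml — the package names and the pin here are invented for illustration:

```toml
[project]
name = "my-app"
version = "0.1.0"
# A transitive dependency pins `util<2`, but we know `util 2.x` works for us.
dependencies = ["slow-dependency"]

[tool.uv]
# Replace every declared requirement on `util`, wherever it appears in the
# dependency tree, with this one.
override-dependencies = ["util>=2"]
```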

@konstin @carlton @treyhunner Wow, that looks like exactly what I was saying was missing. Thank you for sharing it. I was fighting this just last week. Much appreciated.

@carlton @treyhunner FWIW, "one module instance per process" is more an import system restriction than strictly a packaging one. We'll potentially get to a point where different subinterpreters can load different versions of pure Python packages, but even if that happens, extension modules are still going to be constrained by the way platform dynamic library loading works.

In the meantime, `uv` version overrides at least offer an escape hatch for overly pessimistic upper limits.

@carlton @treyhunner (Strictly speaking, it's *already* possible to import multiple versions of a module into a process - the CPython test suite does it in order to test accelerated stdlib modules both with and without their accelerator code. It's just fraught with quirky edge cases, especially around exception handling when multiple exceptions with the same qualified name but different identities exist in the same interpreter. Hence the potential subinterpreter connection)
@carlton @treyhunner Huh, I'd never actually looked into how JS exception handling works. It seems that is typically "catch by value", where Python is largely "catch by identity", which makes a big difference in how likely identity divergence is to cause problems in practice. https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/instanceof#instanceof_and_multiple_realms does caution that the identity divergence problem exists even in JS, though. It's just in a way (realms) that is more analogous to subinterpreters than to multiple copies in one interpreter.
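The "catch by identity" divergence is easy to reproduce once two copies of a module exist — a sketch using importlib to load the same file twice (the file and class here are invented for the demo):

```python
# Two independent copies of one module define exception classes with the same
# qualified name but different identities; only the raising copy's class matches.
import importlib.util
import os
import tempfile

src = "class BoomError(Exception):\n    pass\n"

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "boom.py")
    with open(path, "w") as f:
        f.write(src)

    def load_copy():
        spec = importlib.util.spec_from_file_location("boom", path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module

    copy_a = load_copy()
    copy_b = load_copy()

# Same qualified name, different class objects:
print(copy_a.BoomError.__qualname__ == copy_b.BoomError.__qualname__)  # True
print(copy_a.BoomError is copy_b.BoomError)                            # False

try:
    raise copy_a.BoomError("raised from copy A")
except copy_b.BoomError:
    handler = "copy B's except clause"  # not reached: different identity
except copy_a.BoomError:
    handler = "copy A's except clause"

print(handler)  # copy A's except clause
```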
@treyhunner @carlton FYI I am NOT rewriting import again, so you're on your own. 😁
@treyhunner @carlton Or you push for a way to specify requirement overrides into the packaging ecosystem.

@brettcannon @treyhunner @carlton

@mitsuhiko has a blog post from earlier this year, "Multiversion Python Thoughts", with thoughts on that - and it would be feasible without changes to the language, albeit with a lot of hackish stuff.

Me? I just put the calls needing a different package version on the other end of a Celery-managed call.

(Actually - it could be done with subinterpreters now as well, in a way that would be a lot cleaner)

@carlton I’ll argue that it’s a feature rather than a problem. We don’t have the nested outdated dependency hell of Javascript-land because of the pressure. The only solution is to contribute a fix or remove outdated or unmaintained dependencies.