Hey smart people of the Internet. It seems like things have changed (for the worse) in newer #Python. I am trying to install #wxPython like I always have been doing, per https://wxpython.org/pages/downloads/

pip install -U -f https://extras.wxpython.org/wxPython4/extras/linux/gtk3/ubuntu-24.04 wxPython

And it is suddenly yelling at me to not install the exact way I have been installing this for years and years. What is the new cool hip way they want me to do this? Preferably uncomplicated. #ubuntu #gnu #linux #programming

@RomanOnARiver
What's the output from that pip command?
@dragon0 thanks for responding. The error is "externally-managed-environment" and it asks me to use apt, which wouldn't get me the latest version I want

@RomanOnARiver
Then it's like @diazona said, the "hip" way is to use a virtual environment.

You can use `python -m venv` to create one, but you might want to take the opportunity to learn one of the project-oriented tools like PDM or Poetry. Then it's just a simple `pdm add wxpython` or `poetry add wxpython`.

An advantage of these tools is they'll record the exact versions of all packages they install, so when you return to the project it'll still work exactly as you left it.

@dragon0 @diazona so if I'm installing some module in a virtual environment is it accessible from any python script on the system or do I need to run python from inside that virtual environment?

@RomanOnARiver @dragon0 You need to run the python from that virtual environment. (There are a few packages that are written to allow them to use things from other virtual environments, but that's quite rare.)

But note that if you have a Python script (a .py file) sitting somewhere on your system, that script doesn't inherently belong to any virtual environment. It's the Python *interpreter*, and installed packages, that belong to virtual environments. If you just have a script file, you can use an interpreter from any virtual environment to run it. It may do different things depending on which interpreter you use. (Like, usually, failing with an error if the script uses some package that isn't installed in that particular virtual environment.)
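Concretely, here's a quick way to see that at the shell (`platformdirs` is just an example of a third-party package that a fresh venv won't have):

```shell
# A standalone script that needs a third-party package
printf 'import platformdirs\n' > demo.py

# A freshly created venv contains no third-party packages...
python3 -m venv bare-venv

# ...so running the same script with that venv's interpreter fails:
bare-venv/bin/python demo.py || echo "ModuleNotFoundError, as expected"
```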

#Python

@diazona @dragon0 gotcha, so if my virtual is in /foo/bar/python as opposed to my global /bin/python or wherever they put it, I'm running /foo/bar/python and my script can be anywhere, if I'm understanding correctly.

@RomanOnARiver @dragon0 Yes, exactly. You would run `/foo/bar/python script.py`.

Note that if you're working on Linux or similar, it would be unusual to create a virtual environment somewhere you needed to use `sudo` for it - like /foo - unless you're specifically installing something meant to be used system-wide. (Not very common.) Virtual environments are lightweight, they're meant to be created and deleted over and over again as needed, so it's not like you have to put them somewhere "permanent".

#Python

@diazona @dragon0 oh yeah okay I gotcha that makes sense. If I do want it in a folder I'm assuming it stays there until I get rid of it? Does it survive distro upgrades? One issue I had with the old (wrong) way I had been using is every distro upgrade would wipe away all those external modules.

@RomanOnARiver @diazona

These tools generally put the venvs under ~, so dist-upgrades shouldn't touch them.

@dragon0 @diazona understood. Now not that I'm going to do this, but I am curious is it a bad idea to have say one virtual python for everything as opposed to little individual ones for each thing?

@RomanOnARiver @dragon0 It'd probably work for most things. You could definitely try, at least.

I think the most likely way for you to run into a problem there is if you want to install two different packages that require different versions of some dependency. Like, say, if you install one thing that requires appdirs at most version 3, and another that requires appdirs 4. (Totally made-up example, of course) They can't both be installed at the same time in the same virtual environment, so then you'd have to split those two things apart.

You'd probably also have a hard time figuring out if you can safely remove a package from your mega-venv. (Though there are ways around that, by recording what you installed and why - tools like pipenv, pdm, poetry, etc. can help with that)

#Python

@diazona @dragon0 I gotcha. And so another question is, if I'm using say IDLE to write and then test my code, that doesn't exist virtually in these environments right?
@RomanOnARiver @diazona IDLE ships with Python's standard library, so you should be able to run `/path/to/venv/bin/python -m idlelib` to launch IDLE inside the venv and be able to import the venv's packages
@dragon0 @diazona oh neat. You love to hear it.
@dragon0 @diazona this is kind of off the wall, unsure if it's related, I want to eventually get some of this stuff on like Flathub or the snap store, does this affect that in any way, complicate or simplify? Packaging is another one of those things I'm wrapping my head around - I've just been using like PyInstaller and putting a binary in a zip file, which is really not great, right?

@RomanOnARiver @diazona I haven't done anything like that, but I don't think it'll complicate it too much.

The two tools I know about are PyInstaller and BeeWare Briefcase. Both work by examining the imports from your main script, so afaik as long as you run them inside the venv they'll collect your dependencies correctly.

@RomanOnARiver @diazona I normally deploy server apps using Docker - the workflow there is to use the dependency list generated by PDM (or `pip freeze` inside the venv) to install the dependencies inside the container image along with the application code.

I think Flatpak/Snap/AppImage are kinda similar - you need some way to bundle your pip dependencies with your code. Per-project venvs help with listing all the project's dependencies that need to be bundled with the app.
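Concretely, the snapshot step might look like this (a sketch; `.venv` stands in for whatever the project's venv is actually called):

```shell
# Create/locate the project's venv and record its exact package set
python3 -m venv .venv
.venv/bin/python -m pip freeze > requirements.txt

# Inside the container/Flatpak/Snap build you'd then replay it
# (that step needs network access, so it's only shown as a comment):
# pip install -r requirements.txt
```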

@dragon0 @RomanOnARiver Yeah, for a lot of these packaging solutions (in my experience) it's important to make sure your app works with the normal Python packaging flow first, i.e. where you have a pyproject.toml file and you can run `python -m build` to generate an sdist and wheel. That's usually a good starting point for some other packaging system to be able to generate, say, a Docker image or an apt package or so on. Though I don't know about Flathub or snaps specifically.
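For reference, a minimal `pyproject.toml` along those lines might look like this (the project name and dependency list are placeholders, not anything from this thread):

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "myapp"        # placeholder
version = "0.1.0"
dependencies = [
    "wxPython",
    "platformdirs",
]
```

With that file in place, `python -m build` (from the `build` package on PyPI) produces the sdist and wheel in `dist/`.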

@RomanOnARiver @diazona That's entirely up to you. The main advantage of venvs is to prevent their packages from conflicting with each other or the OS-installed Python (which is often used by other OS packages).

You can have one venv for all your stuff, just be aware that upgrading packages in the venv will affect all the scripts that use that venv.

The advantage of per-project venvs is that your different projects can depend on different package versions and be updated independently.

@RomanOnARiver @dragon0 Yep it survives upgrades! The virtual environment, once created, is basically just a bunch of files, and like any other files, they won't be touched during a distro upgrade (unless you put the files somewhere they conflict with something on the system, but that's why you should put them under your home directory 😛 )

There is one catch though, which is that the virtual environment doesn't actually contain a whole independent copy of Python, at least not on Linux - it contains a symlink pointing back to the Python that created it. That can cause some trouble. For example, if you had Python 3.10 as the default on your system (like in Ubuntu 22.04) and you ran `python3 -m venv mypath`, then you later upgrade to Ubuntu 24.04 which doesn't have Python 3.10, the virtual env in `mypath` would stop working because it'd be looking for Python 3.10 that is not there. (But you can delete `mypath` and create a new venv.)
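You can see the symlink for yourself; `linkdemo-venv` here is just an example path:

```shell
python3 -m venv linkdemo-venv

# On Linux the venv holds a symlink to the interpreter that created it,
# not a copy of it - if that interpreter ever goes away, the venv breaks
ls -l linkdemo-venv/bin/python

# Recovery is just delete-and-recreate:
# rm -rf linkdemo-venv && python3 -m venv linkdemo-venv
```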

#Python

@RomanOnARiver
That's correct.

If you want to have "global" environments, you might want to use an env wrapper like virtualenvwrapper. Rather than per-project venvs, these tools make named venvs in a well-known directory. You can then set the `#!` line in your scripts to use a specific venv for that script (e.g. `#!/foo/bar/python` at the top of the script).
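A sketch of that pattern with plain venv (all paths here are made up; virtualenvwrapper just automates the "well-known directory" part):

```shell
# A named venv in a well-known spot under the home directory
python3 -m venv "$HOME/.venvs/tools"

# Pin a script to that venv via its shebang (must be an absolute path)
printf '#!%s/.venvs/tools/bin/python\nprint("hello from the tools venv")\n' \
    "$HOME" > hello.py
chmod +x hello.py
./hello.py   # prints: hello from the tools venv
```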
@diazona

@RomanOnARiver So here's the thing: don't assume that any time you're asked to do something differently it's a change for the worse. (see also https://xkcd.com/1172/) You didn't say what pip is actually saying when it yells at you, but if it's the thing that most often generates these kinds of complaints - the "externally managed environment" warning message - you were doing it wrong all along, and pip just never told you before.

The new way is to use a virtual environment for each situation where you need to install things. There are many ways to create and manage virtual environments - the simplest being `python -m venv <path>` - and once you have one, you can use pip to install your packages in that environment, e.g. with `<path>/bin/python -m pip install wxPython`. (Or `<path>/bin/pip install wxPython` probably works too, but there are some weird niche cases where that runs into trouble.)
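Spelled out as commands (the `wx-venv` path is arbitrary, and the actual install line is commented out because it needs network access and, for wxPython, some system libraries):

```shell
# 1. Create the environment
python3 -m venv wx-venv

# 2. Install into it, invoking the venv's own interpreter
# wx-venv/bin/python -m pip install wxPython

# 3. Anything run with that interpreter sees the venv's packages
wx-venv/bin/python -m pip --version
```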

#Python

@diazona thanks for responding, what is the location for the path people typically use?
@diazona also, to be fair, I'm just following the documentation from the package itself that said to do that. Plenty of python modules are still like hey just install this with pip.
@diazona for example another one I need to use is appdirs, documentation says just use pip https://pypi.org/project/appdirs/
@diazona or sorry, platformdirs is the up to date fork I guess. Same difference https://pypi.org/project/platformdirs/
@diazona and then as a follow-up, since I do a lot of cross-platform stuff, is there anything particularly weird or different when doing this from Windows? Last time I compiled something on Windows it was the same old story too.

@RomanOnARiver I've never done any of this on Windows, so I don't know how different it is, but hopefully someone else can chime in to tell you about that!

By the way, can I just say, I really appreciate that you're engaging in an actual discussion here. It's not common, but I've definitely seen people post requests for help like your initial post and react badly (with insults, even) when they get a response that isn't exactly what they wanted to hear.

@RomanOnARiver Yeah, unfortunately it is a thing that a lot of documentation is out of date with respect to modern packaging standards. Much has changed over the last 5-ish years.

Any time a package tells you to install it with pip, you should take that to mean "install with a pip-compatible tool into a virtual environment". The pip-compatible tool might be pip itself, or pipenv, or uv, or poetry, or so on. (Don't worry about what all these do, just be aware that there are different ones.) If there is such a thing as a package that actually needs to be installed globally, using your system's pip, that package should make that very obvious in its documentation - but I have never seen such a thing.

#Python

@RomanOnARiver Missed this one, sorry: honestly, the path can be anything you want, but I think the most common convention is to use a directory called `.venv` or `venv` inside the project directory for whatever project you're working on. If it's a project that has a directory. For a one-off script, people do all sorts of things, and again you can put it wherever you want, but one common choice is to use `<name-of-script>-venv` in the same directory where the script is.

There are some tools that manage virtual environments for you by storing them in either the same directory as your code or in a common directory somewhere in your user account's home. For example, pipenv, pip-run, pipx, and others. You might want to investigate some of these tools if you get tired of manually creating virtual environments for things.

#Python

@diazona the error did mention pipx

@RomanOnARiver yup pipx is one of the more useful ones in my opinion. It lets you install Python applications - runnable programs like twine or pylint or so on - by just typing `pipx install <program>`, in most cases, and it will do the work of figuring out where to create a virtual environment and what to install in it, installing the package you requested, and making the executable scripts from that package available in your PATH.

#Python

@diazona including with wx which has that weird wheel thing?

@RomanOnARiver I don't know about wx specifically - I haven't used it - but if it has a runnable program in it, yeah, you can install it with pipx. (And if it doesn't have a runnable program in it, you can still try, but pipx will give you an error saying there was no script to install.)

BTW "wheels" are the standard way Python packages are distributed these days. (They're really just ZIP files with some constraints on their name and content.) So almost every time you run `pip install <package>` or `pipx install <package>` or similar with any pip-compatible tool, what it's doing behind the scenes is downloading a wheel file and unpacking it inside the virtual environment directory.

#Python

@diazona wx is actually yelling at me when installing from pipx, it needs a system png, tiff, and libcurl. I'm sure these are just things it needs from apt, otherwise wouldn't it pull them in as its own dependencies?

@RomanOnARiver Yeah probably. Although when a Python package depends on non-Python packages, as wx does here, things can get a little complicated - sometimes you have to install some system package with apt before it will let you install the Python package with pip/pipx/whatever.

As a rule, anything you install with pip/pipx/whatever is allowed to depend on stuff you've installed with apt, but never the other way around. (Which should probably never come up, since when you install an apt package, apt will take care of installing its dependencies, but maybe just something to keep in mind....)

@diazona ah I see they have a whole fun list https://github.com/wxWidgets/Phoenix#prerequisites

Edit: which of course is out of date. I don't remember installing anything extra in the past but maybe these are things I just had installed through other stuff

@RomanOnARiver Yeah makes sense, it's easy enough to just randomly have dependencies installed for other reasons

@diazona @RomanOnARiver

One brief note: when David says the "new way" is to use Python virtual environments, that's extremely relative. It has been best practice (and the only way to save your sanity because of dependency hell in any nontrivial project) in the Python community for, I dunno, 20 years? But Python's been around for >30, so it is the "new" way. Long before the `virtualenv` tool I was accomplishing the same thing with symlink trees to isolate interpreters.

There are lots of opinions on where you should keep your venvs, i.e. what the path to them should be. I personally like putting them in the project directory, typically `<project>/.venv`, but others like to stash them away someplace, like in ~/.local/share. Project tools that handle virtualenv management (e.g. poetry, uv, etc) will generally give you a way to control where they create the venvs.

#DependencyHell #import #circular #conflict

@cazabon @RomanOnARiver Indeed, "new way" here is relative to what Rivermonster was doing before, namely not using virtual environments.