Hey smart people of the Internet. It seems like things have changed (for the worse) in newer #Python. I am trying to install #wxPython like I always have been doing, per https://wxpython.org/pages/downloads/

pip install -U -f https://extras.wxpython.org/wxPython4/extras/linux/gtk3/ubuntu-24.04 wxPython

And it is suddenly yelling at me to not install the exact way I have been installing this for years and years. What is the new cool hip way they want me to do this? Preferably uncomplicated. #ubuntu #gnu #linux #programming

@RomanOnARiver
What's the output from that pip command?
@dragon0 thanks for responding. The error is "externally-managed-environment"; it asks me to use apt, which wouldn't get me the latest version I want.

@RomanOnARiver
Then it's like @diazona said, the "hip" way is to use a virtual environment.

You can use `python -m venv` to create one, but you might want to take the opportunity to learn one of the project-oriented tools like PDM or Poetry. Then it's just a simple `pdm add wxpython` or `poetry add wxpython`.

An advantage of these tools is they'll record the exact versions of all packages they install, so when you return to the project it'll still work exactly as you left it.
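If you go the plain-`venv` route, here's a minimal sketch of the workflow (the `.venv` name and location are just a common convention, and the install line is the same one from the wxPython downloads page):

```shell
# Create a virtual environment named ".venv" in the current directory
# (any name/path works; ".venv" is just a common convention)
python3 -m venv .venv

# The venv has its own python and pip; you can call them directly
.venv/bin/pip --version

# Install wxPython into it with the same command as before (needs network,
# so shown here as a comment):
#   .venv/bin/pip install -U -f https://extras.wxpython.org/wxPython4/extras/linux/gtk3/ubuntu-24.04 wxPython

# Or "activate" it so plain `python` and `pip` point at the venv in this shell
source .venv/bin/activate
```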

@dragon0 @diazona so if I'm installing some module in a virtual environment is it accessible from any python script on the system or do I need to run python from inside that virtual environment?

@RomanOnARiver @dragon0 You need to run the python from that virtual environment. (There are a few packages that are written to allow them to use things from other virtual environments, but that's quite rare.)

But note that if you have a Python script (a .py file) sitting somewhere on your system, that script doesn't inherently belong to any virtual environment. It's the Python *interpreter*, and installed packages, that belong to virtual environments. If you just have a script file, you can use an interpreter from any virtual environment to run it. It may do different things depending on which interpreter you use. (Like, usually, failing with an error if the script uses some package that isn't installed in that particular virtual environment.)

#Python

@diazona @dragon0 gotcha, so if my virtual is in /foo/bar/python as opposed to my global /bin/python or wherever they put it, I'm running /foo/bar/python and my script can be anywhere, if I'm understanding correctly.

@RomanOnARiver @dragon0 Yes, exactly. You would run `/foo/bar/python script.py`.

Note that if you're working on Linux or similar, it would be unusual to create a virtual environment somewhere you needed to use `sudo` for it - like /foo - unless you're specifically installing something meant to be used system-wide. (Not very common.) Virtual environments are lightweight, they're meant to be created and deleted over and over again as needed, so it's not like you have to put them somewhere "permanent".

#Python
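To make that concrete, a little sketch (the `~/.venvs/demo` path and the script name are made up):

```shell
# Create a throwaway venv under your home directory (no sudo needed)
python3 -m venv "$HOME/.venvs/demo"

# The script can live anywhere; it's the *interpreter* that belongs to the venv
echo 'import sys; print(sys.prefix)' > /tmp/where_am_i.py

# Running it with the venv's python reports the venv as its prefix
"$HOME/.venvs/demo/bin/python" /tmp/where_am_i.py

# Venvs are disposable; deleting the folder is all the cleanup there is
rm -rf "$HOME/.venvs/demo"
```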

@diazona @dragon0 oh yeah okay I gotcha, that makes sense. If I do want it in a folder, I'm assuming it stays there until I get rid of it? Does it survive, say, distro upgrades? One issue I had with the old (wrong) way I had been using is that every distro upgrade would wipe away all those external modules.

@RomanOnARiver @diazona

These tools generally put the venvs under ~, so distro upgrades shouldn't touch them.

@dragon0 @diazona understood. Now not that I'm going to do this, but I am curious is it a bad idea to have say one virtual python for everything as opposed to little individual ones for each thing?

@RomanOnARiver @dragon0 It'd probably work for most things. You could definitely try, at least.

I think the most likely way for you to run into a problem there is if you want to install two different packages that require different versions of some dependency. Like, say, if you install one thing that requires appdirs at most version 3, and another that requires appdirs 4. (Totally made-up example, of course.) They can't both be installed at the same time in the same virtual environment, so then you'd have to split those two things apart.

You'd probably also have a hard time figuring out if you can safely remove a package from your mega-venv. (Though there are ways around that, by recording what you installed and why - tools like pipenv, pdm, poetry, etc. can help with that)

#Python
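For the record-what-you-installed part, the low-tech version (no PDM/Poetry, just pip; the paths here are placeholders) looks like:

```shell
# Capture the exact versions of everything installed in a venv
python3 -m venv /tmp/mega
/tmp/mega/bin/pip freeze > /tmp/requirements.txt

# Later (or on another machine) rebuild an identical environment from that list
python3 -m venv /tmp/rebuilt
/tmp/rebuilt/bin/pip install -r /tmp/requirements.txt
```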

@diazona @dragon0 I gotcha. And so another question is, if I'm using say IDLE to write and then test my code, that doesn't exist virtually in these environments right?

@RomanOnARiver @diazona IDLE is part of the standard library, so you should be able to run `/path/to/venv/bin/python -m idlelib` to launch IDLE inside the venv and be able to import the venv's packages

@dragon0 @diazona oh neat. You love to hear it.
@dragon0 @diazona this is kind of off the wall, unsure if it's related: I want to eventually get some of this stuff on like Flathub or the snap store, does this affect that in any way, complicate or simplify? Packaging is another one of those things I'm wrapping my head around. I've just been using like PyInstaller and putting a binary in a zip file, which is really not great, right?

@RomanOnARiver @diazona I haven't done anything like that, but I don't think it'll complicate it too much.

The two tools I know about are PyInstaller and BeeWare Briefcase. Both work by examining the imports from your main script, so afaik as long as you run them inside the venv they'll collect your dependencies correctly.

@RomanOnARiver @diazona I normally deploy server apps using Docker - the workflow there is to use the dependency list generated by PDM (or `pip freeze` inside the venv) to install the dependencies inside the container image along with the application code.

I think Flatpak/Snap/AppImage are kinda similar - you need some way to bundle your pip dependencies with your code. Per-project venvs help with listing all the project's dependencies that need to be bundled with the app.

@dragon0 @RomanOnARiver Yeah, for a lot of these packaging solutions (in my experience) it's important to make sure your app works with the normal Python packaging flow first, i.e. where you have a pyproject.toml file and you can run `python -m build` to generate an sdist and wheel. That's usually a good starting point for some other packaging system to be able to generate, say, a Docker image or an apt package or so on. Though I don't know about Flathub or snaps specifically.
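For reference, a minimal `pyproject.toml` along those lines might look like this (the project name, version, and dependency list are placeholders, and this assumes the setuptools backend):

```toml
[build-system]
requires = ["setuptools>=64"]
build-backend = "setuptools.build_meta"

[project]
name = "my-wx-app"          # placeholder name
version = "0.1.0"
dependencies = ["wxPython"]
```

With that in place, `python -m build` (from the `build` package) produces the sdist and wheel.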