Python Installation Checklist

Before packages can be installed, ensure that a Python installation containing the files needed for installing packages is in place by following the Installation Requirements. PyPA does, however, fully support using a Python script to run pip as a subprocess.
Use of a Python script to run pip in-process to install a package is not supported by the Python Packaging Authority (PyPA) for the following reason: pip is not thread-safe and is intended to be run as a single process. When run as a thread from within a Python script, pip may affect non-pip code with unexpected results.
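The supported pattern can be sketched as follows. The helper names `run_pip` and `install` are illustrative, not part of any library; the key point is invoking pip via `sys.executable -m pip`, which guarantees the subprocess uses the pip belonging to the current interpreter rather than whatever `pip` happens to be on `PATH`:

```python
import subprocess
import sys

def run_pip(args):
    """Run pip as a subprocess of the current interpreter."""
    # sys.executable -m pip uses this interpreter's own pip,
    # avoiding the unsupported practice of importing pip in-process.
    return subprocess.run(
        [sys.executable, "-m", "pip", *args],
        capture_output=True,
        text=True,
    )

def install(package):
    """Install a package; raises CalledProcessError on failure."""
    result = run_pip(["install", package])
    result.check_returncode()
    return result

# Example usage:
#   run_pip(["--version"])
#   install("requests")
```

The same shape works for conda by swapping the command list for something like `["conda", "install", "-y", package]`, assuming a conda executable is available on the system.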
This resource provides examples of how to create a Python script that runs pip (the recommended package manager) or conda as a subprocess in order to install Python packages. In some cases, you may need to automate the updating of multiple Python deployments with a specific package or set of packages.

Frontends that skip the fresh subprocess are relying on the backend not "messing things up". If we want to make it easier for frontends, then we need to place additional restrictions on backends, and we have to be careful to define what those are. Saying that "backends must not mess with global process state" is insufficient, because what constitutes "global process state" isn't well defined: in the past, pip has cared about stdio data that is not passed through Python's IO mechanisms (consider the output of a C compiler called without IO redirection), and about whether the process creates extra threads, to give two further examples.

One alternative approach, which would work perfectly well with the current PEP 517 design, would be to implement one persistent, dedicated subprocess for each isolated build environment, and have that subprocess communicate with the backend via hook calls and with the frontend via a dedicated IPC API. The persistent subprocess can do the work of preserving process state around API calls. That is basically what the current pep517 library does, except that it uses a subprocess per call rather than a persistent subprocess. To be honest, I think your requirement would be better handled by "someone" enhancing the pep517 library to work like this, rather than getting bogged down in debates over how to modify the standard. (I'm somewhat interested in doing something like this myself, but my time is sufficiently limited that waiting for me to deliver anything more than a prototype might be ill-advised.) Edit: But I should say that I'm a strong +1 on doing something to reduce the current process-creation cost of using PEP 517.
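The persistent-subprocess idea can be sketched as below. This is a toy illustration under invented assumptions: the JSON-lines protocol, the `ping` hook, and the `PersistentWorker` class are all hypothetical, not the pep517 library's API. The point is that one long-lived worker process handles many hook requests, so interpreter startup is paid once per build environment rather than once per call:

```python
import json
import subprocess
import sys

# Hypothetical worker: reads one JSON request per line on stdin,
# writes one JSON response per line on stdout. Because the process
# persists between calls, it can manage cwd/environ state itself.
WORKER_SOURCE = """
import json, sys
for line in sys.stdin:
    request = json.loads(line)
    if request["hook"] == "ping":
        result = "pong"
    else:
        result = None
    print(json.dumps({"result": result}), flush=True)
"""

class PersistentWorker:
    """One long-lived subprocess serving many hook calls over pipes."""

    def __init__(self):
        self.proc = subprocess.Popen(
            [sys.executable, "-c", WORKER_SOURCE],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            text=True,
        )

    def call_hook(self, hook, **kwargs):
        # Send a request line, block for the matching response line.
        self.proc.stdin.write(json.dumps({"hook": hook, **kwargs}) + "\n")
        self.proc.stdin.flush()
        return json.loads(self.proc.stdout.readline())["result"]

    def close(self):
        self.proc.stdin.close()
        self.proc.wait()
```

A real frontend would dispatch to actual PEP 517 hooks inside the worker instead of the `ping` placeholder, and would need error handling for a crashed worker; this sketch only shows the IPC shape.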
This is an overhead that must be paid on every tox invocation, and I think it would be beneficial to pay it just once rather than three times. From the point of view of the build backend, this seems not that expensive to handle: you can wrap every PEP 517 hook with os.getcwd/os.chdir and os.environ save-and-restore logic. That is a cheap one-time cost, and most backends don't mangle the global state most of the time anyway. I think at the very least we should allow build backends to opt out of the need for a fresh subprocess on every call. I did a proof of concept of an interface that does not require fresh subprocess calls, and, excluding some rare edge cases, it actually works fairly well. At the moment, the section quoted is non-normative, so it is perfectly OK (in principle) for frontends not to use a subprocess call. As far as guarantees go, I think it would suffice to enforce that the PEP 517 invocation part (the part that invokes the backend method in the subprocess) will not alter the global state.
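The wrapping idea above can be sketched as a decorator. `preserve_global_state` is a hypothetical helper, not an existing API; it snapshots the working directory and environment before a hook runs and restores them afterwards, covering the two kinds of global state named in the discussion (though, as noted above, not every kind a backend could touch):

```python
import functools
import os

def preserve_global_state(hook):
    """Snapshot and restore cwd and os.environ around a backend hook.

    Hypothetical sketch of the os.getcwd/os.chdir and os.environ
    wrapping described above; it does not cover other global state
    such as stdio redirection or extra threads.
    """
    @functools.wraps(hook)
    def wrapper(*args, **kwargs):
        saved_cwd = os.getcwd()
        saved_env = os.environ.copy()
        try:
            return hook(*args, **kwargs)
        finally:
            # Undo whatever the hook did to cwd and the environment.
            os.chdir(saved_cwd)
            os.environ.clear()
            os.environ.update(saved_env)
    return wrapper
```

A frontend or backend shim could apply this to each PEP 517 hook (`build_wheel`, `get_requires_for_build_wheel`, and so on) before invoking it in-process.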
This overhead (on my high-spec MacBook Pro) is around 50 ms, and things get much worse on a Windows machine, where starting subprocesses is even more expensive. On a normal run, for example, tox needs to make three such calls.
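As a rough way to see this overhead for yourself, the sketch below times fresh `python -c pass` subprocesses. The helper name is illustrative, and the absolute numbers will vary widely by machine and platform (the ~50 ms figure above is the author's measurement, not a guarantee):

```python
import subprocess
import sys
import time

def interpreter_startup_cost(runs=5):
    """Average wall-clock cost of spawning one fresh interpreter.

    This approximates the per-hook price of the fresh-subprocess
    guarantee: a full interpreter start and teardown doing no work.
    Real hook calls also pay import costs on top of this.
    """
    start = time.perf_counter()
    for _ in range(runs):
        subprocess.run([sys.executable, "-c", "pass"], check=True)
    return (time.perf_counter() - start) / runs

# Example usage:
#   print(f"{interpreter_startup_cost() * 1000:.1f} ms per subprocess")
```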
I feel this guarantee offers little benefit at a very costly price. The benefit only affects build-backend maintainers, who no longer have to care about the global state. The drawback, however, affects the entire Python user base, because all frontend-backend interactions now need to pay the full interpreter startup/teardown (plus imports) price every time. The guarantee in question continues: "A Python library will be provided which frontends can use to easily call hooks this way."
Taking advantage of the fact that the PEP is still provisional, I propose to remove this guarantee from it: "Frontends should call each hook in a fresh subprocess, so that backends are free to change process global state (such as environment variables or the working directory)."