I tried PDM earlier this year and there are a few things worth noting:
- PEP 582, which it is based on, is still in draft, and some tools (VS Code) won't fully support it until it's accepted.
- If you want to develop or test using different Python versions, you still need to use a virtual environment. PDM does handle this for you though.
IMO, the Python packaging ecosystem has been a dumpster fire for a long time, but the flurry of recent PEPs that provide a standard way of defining a package with pyproject.toml have made it so much better. Now it's just a matter of the tools catching up.
Dumpster fire compared to what? Nuget? NPM? It may be that some more recent languages managed to start a saner solution and keep it sane.
But really, it's a hard problem, between cross-platform support, backwards compatibility, security concerns, hosting, most authors being volunteers and so on.
And still, even with "just" pip or even conda I am enjoying the Python experience more than some other packaging solutions I've seen.
It's objectively a dumpster fire. I don't care about other languages also being a dumpster fire on the packaging front, because I don't use other languages :) It also doesn't help me to know that NPM or .NET have rubbish packaging systems, that's their problem and not mine. I'd primarily want python's packaging system to be good.
I mean, just look at this thread. Someone asks "So in light of this, what should I use for python packaging?", and they get two dozen different answers loaded with weirdness like pyenv, python-venv, virtualenvwrapper, etc... If I wasn't using python, I would've thought this is some cruel python in-joke that outsiders don't get.
Just looking at those names, I'm already confused as to what the hell each one does and why I'd need it.
But let's go back to pip and conda. Conda is unbearably slow. Pip is not entirely reliable in resolving versions properly. And conda and pip don't interact entirely cooperatively: if you use conda, you should avoid pip (or use it only minimally), because mixing them tends to result in a mess.
Yes, packaging is hard, but it feels like python has managed to solve (or not solve) it in a uniquely obtuse and bad way so far.
Hopefully the slew of new PEPs will finally bring some clarity to this mess.
When you say Python's way is uniquely obtuse, you are making a comparison to other packaging systems.
Labeling something a "dumpster fire" is a value judgement. It can't be "objective", unless there is a physical dumpster that is burning. It's not like the Python community hasn't solved this problem because it's lazy or incompetent. It's just a hard problem.
> Pip is not entirely reliable in resolving versions properly
It must be true because people say something like this any time a "python packaging sucks" thread comes along ... but I can't recall ever experiencing that in 10 years of Python-ing
The people having problems always seem to mention Conda though
I guess the parts that have problems are primarily where FFI and compilation or linking of non-Python code is happening?
Whoa there. Setting aside the stagnation of Perl due to the v6 debacle (a fate, by the way, that Python 3 came very close to succumbing to), CPAN is widely recognized as a very successful package system, and is frequently the envy of other languages. DBI, the Net:: space, and many others just work, and the package names follow common sense.
I started a new Python project last month. I tried both Poetry and PDM but decided not to use either of them. PDM is currently basically a one-man show, and Poetry's docs aren't great: the doc site looks pretty, but it only describes command-line usage and doesn't explain how to configure metadata. Most importantly, Poetry does not support the standard PEP 621 yet.
So I stick with this setup:
- Use pyenv to manage different Python versions and virtual environments.
- Use pip freeze > requirements.txt as a "lockfile".
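Roughly, that workflow looks like this (a sketch; the Python version and "myproject" name are placeholders, and the virtualenv piece assumes the pyenv-virtualenv plugin):

    $ pyenv install 3.10.2                  # install an interpreter
    $ pyenv virtualenv 3.10.2 myproject     # create a virtualenv tied to it (pyenv-virtualenv)
    $ pyenv local myproject                 # auto-activate it inside this directory
    $ pip freeze > requirements.txt         # snapshot exact versions as a "lockfile"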
It's pretty simple. Check in the lock file, and then run
$ poetry install
to replicate it.
> - Use pip freeze > requirements.txt as a "lockfile".
There are lots of reasons not to do this anymore, and dependency hell is real, and has been for 25 years with Red Hat RPMs, etc.
Even if you don't want to rely upon poetry for building in prod, poetry can still export a requirements.txt file for you, so you're not locked into using poetry, but you still get to specify the high level packages you want, and let it solve the dep graph for you.
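For example (a sketch; this assumes a Poetry version where the export command is available and a pyproject.toml already in place):

    $ poetry lock                                                    # resolve and pin the full dependency graph
    $ poetry export -f requirements.txt --output requirements.txt   # emit a plain pip-installable pin list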
That probably works for smaller projects without many dependencies, but it’s just going to install the sub-dependency versions that satisfy whatever comes last in the requirements file. The pip docs describe that situation here: https://pip.pypa.io/en/latest/topics/dependency-resolution/
The pip docs also suggest using pip-tools to create lock files. Pip-tools is only for creating lock files (it’s not trying to fix virtualenvs like poetry is), and it works great.
Yeah, our codebase has a requirements file that takes over an hour to install with the new dependency resolver (and over 10 minutes using the deprecated resolver). This is on a 6c/12t Ryzen with 32 GB of RAM and a gigabit connection.
This has been my experience, too, and with a moderately simple set of dependencies. I don't understand why this is considered acceptable enough to deprecate the old resolver.
it seems like you are posting on the wrong platform, github is that --> way :}
The pip team introduced strict performance checks to ensure the new resolver was not significantly slower than the old one. They solicited requirements files to weed out situations exactly like the one you describe; for example https://github.com/pypa/pip/issues/8664
I believe pipenv (not pyenv!) is also a viable option for correct versioning, though I'm not actually sure whether it or pip-tools is more actively developed these days. Last I used pipenv though (~2 years ago), it was a nicer virtualenv + pip-tools combination, but had worse version resolution / less useful verbose output for no apparent reason (since iirc it shares tons of code with pip-tools).
Putting that aside though: yes, 100% pip-tools or an equivalent (which pyenv is not). It's the only sane way to both freeze dependencies for reproducibility, and maintain upgradability. I've used pip-tools for years on many large, complex projects, and it has been a huge benefit every single time. And it routinely revealed significant library-incompatibility problems that teams had only luckily dodged due to not using feature X yet, because pip's resolver has been so braindead for forever.
Pipenv quickly becomes unusably slow. As in half hour to change one dependency. Poetry is strictly better, and I say that believing poetry is not a great solution either. There aren't many good, general recommendations to give people regarding Python package management, but "not pipenv" is one of the few I'm confident in.
How long ago did you try pipenv? I don't use it for large projects (just a few dozen dependencies); it is not fast, but it works. I expect it to be just a convenience wrapper around the corresponding functionality provided by pip/pip-tools (and there were substantial changes in pip's dependency resolution).
Sort of - you need to make a requirements.in file that specifies only your top level dependencies, then you run pip-compile on that file to generate the requirements.txt.
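Something like this (flask and requests here are just placeholder dependencies):

    $ printf 'flask\nrequests\n' > requirements.in   # top-level deps only
    $ pip-compile requirements.in                    # resolves the graph, writes pinned requirements.txt
    $ pip-sync requirements.txt                      # make the active venv match the pins exactly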
> - Use pip freeze > requirements.txt as a "lockfile".
This is not and has never ever been correct. It makes it infinitely harder to install an application vs the standard `pip install -e .` which works on every package manager and avoids PYTHONPATH issues, as well as being able to publish your application to PyPI for easy installation (as simple as pip install --user app or pipx install app).
I 100% agree. I've made over 100 projects in the past 8 years of being deep into Python, ranging from enterprise software to little hobby projects, and I've settled on exactly that setup. It's great when you have multiple projects on the same machine with different dependencies and versions, maybe even specially forked and altered versions of libraries. It's super versatile and easy to share, and I have no problems running my software at all. I've even written a script so that each time I commit to git it quickly generates a new requirements file, so it's always up to date. Thank you for sharing.
Genuine question: if I am starting a Python project NOW, which one do I use? I have been using pipenv for quite some time and it works great, but locking speed has been problematic, especially after your project grows large enough (minutes waiting for it to lock without any progress indication at all).
Should I just upgrade to Poetry or should I just dive headfirst into PDM? Keep myself at Pipenv? I'm at a loss.
Python's standard library is great and it's a nice language if you like the syntax, but aside from a few constants like Django, Flask, and Pandas, the ecosystem feels like it is slowly turning into a fragmented mess.
If you're building a package, pip install -e . is preferable to -r requirements.txt. Most projects don't need and shouldn't use requirements.txt. The only ones that do are where you're shipping a whole tested environment out, like a docker image for deployment. And in that case you need to be using something like pip-tools to keep that requirements file up to date.
Same, I use two small bash scripts in my projects bin/ directory, "venv-create" for the creation of the .venv/ and "venv-python" for running the effective version of Python from the .venv/ -- this sets environment variables such as PYTHONPATH and PYTHONDONTWRITEBYTECODE and provides a single approach for running project files and packages.
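The general shape is something like this (a rough sketch, not the exact scripts; details vary per project):

    #!/usr/bin/env bash
    # bin/venv-create (sketch)
    set -euo pipefail
    python3 -m venv .venv
    .venv/bin/pip install -r requirements.txt

    #!/usr/bin/env bash
    # bin/venv-python (sketch)
    set -euo pipefail
    export PYTHONPATH="$PWD"
    export PYTHONDONTWRITEBYTECODE=1
    exec .venv/bin/python "$@"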
I get versioned requirements files for the project base requirements, and also for each (version, implementation) of Python in case there are changes, and this has proven reliable for me.
It's all about finding the minimal-but-complete convenience/ergonomics over the, err, inconvenience of packaging. I also marvel that when I attempt to explain these things to experienced programmers, I only manage to convince them 50% of the time at most.
That's what I use as well. It works great, it's built in and it's easy to use and understand. Only issue is when you upgrade the version of Python you're running. In that case you might need to rebuild your virtualenv, but that's super easy.
I use the same solution to have multiple versions of Ansible installed.
If you need to run multiple versions of Python, then virtualenvs might not be enough, but that's honestly not a problem I have. New version of Python? Great, let me just rebuild my virtualenv and get back to work.
One of the most important rules I have regarding working in Python is: Never, never ever, install ANYTHING with the global pip. Everything goes in to virtualenvs.
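pip can even enforce that rule for you; one line in your shell profile:

    # make pip refuse to install anything unless a virtualenv is active
    export PIP_REQUIRE_VIRTUALENV=true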
... bless my `zsh` shell history for these incantations. I don't think I have any hope of remembering it -- probably because of all the old virtualenv incantations!
Kind of agree with pipenv though. It's painfully slow, but it abstracts away having to worry about various requirements files (eg: dev vs prod) and the .lock keeps things consistent.
PDM author here. If anyone wants to know the advantage of __pypackages__ over virtualenv-based tools (venv, virtualenv, poetry, pipenv), here it is:
The fact that virtualenvs come with a cloned (or symlinked) interpreter makes them fragile when users want to upgrade the host interpreter in place, unless you keep all the old installations on your system, which is what pyenv does. You can imagine how many interpreters, including virtualenv-embedded ones, end up on your machine.
You can regard __pypackages__ as a virtualenv WITHOUT the interpreter: it can easily work with any Python interpreter you choose, as long as it has the same major.minor version as the packages folder.
If you're building an application, use Poetry. If you're building a library, use Flit. Use PEP621 metadata in pyproject.toml regardless.
Poetry is much more focused on managing dependencies for applications than dealing with libraries that have to be used by other libraries or applications. See this deep discussion for some timely/relevant examples: https://iscinumpy.dev/post/bound-version-constraints/
> Poetry is much more focused on managing dependencies for applications than dealing with libraries
The linked article is quite long. Do you have a quick example? I've built lots of apps and libraries using Poetry and haven't run into any dependency issues, and you can specify versions constraints in the same way as Flit, so it's hard to see how they'd be different in that regard.
Edit: I think I see what the issue is--Poetry sets upper version bounds by default via its ^X.Y.Z and ~X.Y.Z syntax and advocates for always specifying upper bounds in their docs.
So, it's not that Poetry can't be used for a library, but you'll have to know that you can, and perhaps should, use different version syntax in some cases (i.e., the standard >=X.Y syntax).
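For example (requests and the version numbers here are just placeholders):

    $ poetry add requests              # records requests = "^2.26.0", i.e. an implicit <3.0.0 cap
    $ poetry add "requests>=2.26"      # records an open-ended lower bound, often friendlier for libraries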
I tried to use the project.dependencies and project.optional-dependencies fields in pyproject.toml defined by PEP 621. However, Poetry seems to ignore them and sticks with its own tool.poetry section.
Ah, you're right. I thought Poetry had adopted PEP 621 already, but it looks like there's still a debate about if/how they want to do that. A bit ironic considering the author of Poetry was listed as an author on PEP 621, and that was accepted just over a year ago.
I know you didn't mean it that way, but "since 2021"..? :D
I was working actively with Python (and loving it) between 2012-2015 (during which the 2 vs 3 transition/conflicts were raging) and the couple of times I've tried to get back into the groove since then (in side projects) I keep hitting walls and getting confused about how to even get the basic things running.
I've heard good things about Poetry but never tried it. PDM being closer to npm/yarn sounds like an attractive option (since those are very familiar), just to tinker around (not to build a complex application with specific requirements). Will give it a shot!
I've seen others make the same reference, but I've not understood it.
My package is a Python module and a set of command-line tools based on that module.
Is my package a library? Or an application? Because it seems to be "both".
It also includes several C extensions, using either the Python/C API manually or Cython. I would love to switch to something more modern than setup.py, but none of these new back-ends seem to support this. For example, I just checked - flit explicitly focuses on "pure Python packages with no build steps" "and leaves the hard things up to other tools".
Do you expect other people to add your project as a dependency of their projects, and call into your API? Then it's a library.
If the only expected user of your API is your own project, it's an application.
And yes, if you need to build native extensions, setuptools is still your only real option. It's not great, and it results in hundreds of packages that each find their own definition of insanity. But given just how crazy the C/C++ toolchain ecosystem is, especially when you have to deal with Windows, macOS, and the diversity of Linux, no one else has been willing to tackle that problem because setuptools already exists.
Following that definition, the project I mentioned is a library.
But quoting PEP 517 "Much experience suggests that large successful projects often originate as quick hacks".
I have another project which started as a command-line tool only, meant for the end-user to build, maintain, search, and update a data set. It was not designed to be called by another program, neither through import nor popen. Following the guideline, I should use poetry.
It then acquired a limited Python API so search functionality could be incorporated in a web app. One of my users paid for that feature. I don't know if others use it.
This turned my package into a library. Does that mean I should switch from poetry to flit?
There seem to be several people who have tackled that problem, including enscons, PyQt-builder, and mesonpep517, which all defer to an existing third-party build system. I don't use or have experience with any of the underlying build systems, and version numbers like 0.28.0 and 0.2 make me wary.
Are there others? Finding such modules on PyPI is not easy! A search for "517" is almost useless. Could there be a new trove category for such packages?
Sold everyone on using poetry a few months ago and now I'm red-faced, as we have a litany of problems and wasted time because of it. We are now sitting on some bleeding-edge branch because specific dependencies cannot work at all without some newfangled feature, and everyone wishes we were just using plain virtualenv, since we had far fewer problems with that.
You would also not have any real control over what gets installed with just venv. So everyone might think the grass was greener on the other side, but it might not actually be, if you value reproducible environments.
What you've never spent hours trying out n! orderings of requirements.txt until you get the results you want/can live with? You have not lived until you've done this. /s
Great. I have heard "anecdata evidence" that sometimes poetry fails to install a combination of packages or something along those lines, did you find any of those shenanigans in your own experience?
The issues happen because poetry is strict about version constraints. Pip used to be lax, so some packages could get away with poor or overly restrictive constraints.
Now pip is strict as well, and poetry is also getting a lot more common, so maintainers have had pressure to fix most of these issues.
I've run into this a couple of times. Typically it's when starting a new project and gradually adding dependencies. By default, when adding (poetry add <dep>), poetry adds the most recent version of the library, and in pyproject.toml that becomes the minimum possible version. If you then later try to add something that depends on that library, but isn't up-to-date on the most recent version, poetry will (correctly!) error out because it fails to find a compatible solution. For example, something like this:
poetry add ingredients # most recent version: 1.0
poetry add cake # requires ingredients, but <1.0
So here, poetry would add ingredients to your pyproject.toml with version = "^1.0.0" in the first step. In the second step, when you tried to add cake as a new dependency, it would say it can't find a version of cake compatible with your pyproject.toml's requirement for ingredients.
The solution is to change pyproject.toml to be more permissive about the version of ingredients, resulting in it downgrading to a compatible version. Or submit an issue and/or PR with the maintainers of cake to bump up the supported versions.
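Concretely, with the made-up packages from the example above, the fix looks something like:

    # loosen the recorded constraint on ingredients, then the new dependency can resolve
    $ poetry add "ingredients<1.0"
    $ poetry add cake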
Another way it happens: I was stuck on python 3.6 for a while and it meant that I couldn't just poetry add pandas because the latest version isn't compatible with 3.6 and it won't look for other versions.
No shenanigans for any library-ish code with few dependencies that targets relatively modern Python.
I've had one or two fundamental version conflicts with a 5+ year old application with 100+ dependencies and a decent amount of legacy stuff. They were a pain in the ass, and sdispater's stance on not allowing overrides is a pain in the ass. We ended up forking the upstream libraries to resolve the version conflict.
With all of that, poetry is amazing and a huge step forward. I'd advocate it wholeheartedly.
I've spent the last year or so comparing these packages. Initially, I liked poetry, but I began to run into issues related to client cert auth with private repositories. I and several other users who encountered these problems submitted pull requests to address them, but found that it was nearly impossible to get any of them reviewed, much less merged.
Out of frustration, I decided to try pdm instead, and although pdm doesn't handle pushing to pypi repositories, Twine works fine for it, and I have had a pain-free experience using pdm for dependency management.
I like the idea of poetry, but it's unusable if I can't get bugs fixed, and I gave up after nearly a year of trying.
> but I began to run into issues related to client cert auth with private repositories.
Yeah, I think the worst thing about private infrastructure in 2021 is getting those certs correctly installed and seen by the software on both sides.
I'm thinking that tailscale or something like it may be a better alternative in the future where authorization is at the link layer before the tcp connect can happen.
As others have said, this is related to how Poetry does dependency resolution. If you say your program uses Python "^3.8", you would expect any Python version in that range to work, but instead, ALL Python versions in that range must work.
If one of your dependencies won't work with 3.9, the dependency graph cannot resolve. The author of Poetry said that this is because Poetry is meant for libraries, where this makes more sense.
Alternatively, pyenv and pyenv-virtualenv for shell integration and seamless virtualenv activation.
To be fair, I'm not saying there's anything wrong with virtualenvwrapper, just that I've never used it and for my purposes the above solution works well.
This doesn’t solve dependency management?
All it does is it separates your env and you can install what you need there.
But installing with pip is still subject to version incompatibility etc.
I’d just use pip and requirements files if you can. It’s doubtful that your requirements are sufficiently complex as to require a more complex resolver, although that depends on your ML needs.
Having used PDM now for several projects, it's my preferred package manager over poetry and others. Its dependency resolver is both faster and more forgiving than poetry's. I also like the built-in task management system similar to npm's.
> PDM is meant to be a next generation Python package management tool. It was originally built for personal use. If you feel you are going well with Pipenv or Poetry and don't want to introduce another package manager, just stick to it. But if you are missing something that is not present in those tools, you can probably find some goodness in pdm.
Having used PDM a bit, its ambition in my opinion may not be to replace existing tools, but rather to experiment and implement the most recent PEPs related to packaging.
While you can argue about PEP 582[1] implementation (which is still in draft), PDM doesn't prevent anyone from using virtual environments, and even provides a plugin[2] to support that.
PDM also implements PEP 631[3], which most other package managers have been reluctant to support or slow to adopt.
Thanks for the kind words on PDM. When I created PDM I didn't want it to be similar to any other package manager, so I chose PEP 582, and I thought I could experiment with more new ideas on top of it.
But as PDM has matured and been acknowledged by the Python packaging people, I have also worked hard to make PDM fit more people's workflows. Fortunately, it has a strong plugin system: you can add virtualenv support (pdm-venv), a publish command (pdm-publish), and more. In the future, I would like to see it eventually push the iteration of PEP 582 forward and get it finalized.
Just made an account to say this. I am really impressed by your projects. I first found out about pdm after writing a small plugin for marko (which is amazing by the way) and checking out your github profile. I find what you write to be really well thought out and approachable.
The big distinguisher of PDM is that it supports PEP 582[0]. That means it works less like pip and more like npm in the JS world. To quote PEP 582:
> This PEP proposes to add to Python a mechanism to automatically recognize a __pypackages__ directory and prefer importing packages installed in this location over user or global site-packages. This will avoid the steps to create, activate or deactivate "virtual environments". Python will use the __pypackages__ from the base directory of the script when present.
Thus, the idea of PDM is that it will create a directory, called `__pypackages__` in the root of your project and in that folder it'll populate all the dependencies for that project. Then, when you run scripts in the root folder of your project, your Python install will see that there's a `__pypackages__` folder and use that folder to look up dependencies.
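Roughly, the resulting project looks like this (an illustrative sketch; the exact subfolders can vary by platform and PDM version):

    project/
        pyproject.toml
        pdm.lock
        main.py
        __pypackages__/
            3.10/
                lib/    # installed third-party packages
                bin/    # console scripts installed by those packages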
This style of "dependencies inside the project directory" is similar to how npm in the JavaScript ecosystem works, where it creates a `node_modules/` folder in the root of your project and fills it with the dependencies for your project. This is different from package managers such as Poetry (Python), go (Golang), and cargo (Rust), which instead keep a sort of hidden cache of dependencies at particular versions, tucked out of the way, where the package manager manages acquisition, storage, and version resolution (Poetry, Go, and Cargo all do this; pip does not).
That's a very fast and probably wrong rundown on what makes this package manager different from others.
I’ve long been of the opinion that pip and venv (and sometimes pyenv) is good enough. PEP 582 is a rare instance where a new packaging proposal makes sense right away when I read it and could beat pip and venv in simplicity.
It seems functionally similar to venv but has the benefit of standardizing the location of dependencies to __pypackages__/3.x/*. With venv the developer selects some arbitrarily named directory that is sometimes but not always .venv/*.
venv and PEP 582 seem to solve overlapping but slightly different problems. venv packages a python environment (interpreter and packages) without inherently tying it to a project (a project specific one might be embedded in a project directory, or a shared one might be used somewhere else, and you might even have multiple used for the same project, e.g., for different python versions or to validate against different versions of dependencies, etc.), while PEP 582 ties dependencies (possibly for multiple interpreter versions) to a project, but leaves supplying and identifying the right interpreter to use to some other system.
I like the "venv per script" approach. So it is like... you create a directory that contains main.py and foo.pdm or whatever its extension is, and then it grabs all the dependencies into that directory when you run a command and has a lock file and everything? I did not check out the website yet, that is why I am asking. If this is the case, then it is a win in my book.
Yes, PDM does some hacks to make it work with the familiar `python` executable. See the example at https://github.com/pdm-project/pdm/#quickstart. All you need is a few lines to set up your shell.
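After that, day-to-day use looks a lot like npm (flask here is just a placeholder dependency):

    $ pdm init                   # interactive pyproject.toml setup
    $ pdm add flask              # resolve, update pdm.lock, install into __pypackages__/
    $ pdm run python main.py     # run with __pypackages__ on the import path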
Basically PDM supports project-specific Python package installs. This is different from how Python has traditionally worked, where packages are installed globally for the user running it. Why is this important? Because with virtual environments it's easy to forget to activate one, run a pip install or upgrade, and clobber your computer's or server's Python environment. It also avoids the confusing issue where someone updates their PATH variable while in a venv, but then it's no longer there after exiting the virtual environment.
PDM also supports a centralized package cache like [pnpm], while still keeping each project's dependencies isolated.
> Also this will just pollute your source directories with generated directories and files that shouldn't be there.
I don't see why it is a problem, node.js also has node_modules in the project. And __pypackages__ is created with a .gitignore file so you won't commit it accidentally.
PDM may also be a good fit for Blender, because of the per-project approach. Blender doesn't come with a package manager and has a varied relation to the system installed Python interpreters depending on platform and install choices.
Scripts, Plugins etc for Blender are currently distributed in a very ad-hoc way, and it is hard to get adoption with plugins that require more elaborate dependencies, especially binary modules.
PDM’s support of PEP 582 seems promising. What would be even cooler is if the maintainers of pip agree with PEP 582 and incorporate it into pip itself.
Dephell is a converter for python packaging systems. It can turn poetry files into requirements.txt, or setuptools' setup.py into pipenv's Pipfile etc.
Python Packaging: There is More Than One Way to Do It
I switched from Poetry to PDM and it feels great.
If there's one reason to use PDM: the maintainer fixes bugs fairly quickly, unlike Poetry, where many bugs are left open and nobody is taking care of them.
I think competition in this space is good. Not all package managers will get and keep a big audience or following, but they act as independent proving grounds for new ideas and solutions.
Tell that to a sysadmin who's trying to guarantee their server uptime...
I use Python almost every day and I find it a clusterfuck any time I have to install something using a specific version or update some packages. If it wasn't for containerization I would have lost hope very early on in the Python packaging ecosystem.
For people looking for a good package manager for Python, conda is the best. (Mamba is even better and is a drop-in replacement for conda; you can even conda install mamba and use mamba from there.)
It doesn't replace your setup.py or pyproject.toml as a maintainer, though. For that I use poetry with pyproject.toml. As a maintainer, you need to release to PyPI first and then release to anaconda or conda-forge (I prefer the latter). Releasing on conda-forge seems more troublesome at first, but that process makes you trust conda-forge more than PyPI, which has virtually no barrier to entry.
As an end user, most packages you can pip install can also be conda/mamba installed. For those that don't exist on conda, you can use pip install inside a conda environment (which is like a virtualenv); dependencies already installed via conda won't be repeated.
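For example (package and version names here are illustrative):

    $ conda install -n base -c conda-forge mamba   # mamba as a faster drop-in for conda
    $ mamba create -n myenv python=3.10 numpy      # solve and create an env from conda-forge
    $ conda activate myenv
    $ pip install some-pypi-only-package           # hypothetical package not on any conda channel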
Conda is designed to be multi-platform and multi-language as well. Npm, Julia, ffmpeg, etc. can be installed via conda. It is designed for the scientific stack, where there are a lot of non-pure-Python dependencies. In this respect it is more like a general package manager than the distro-provided ones, and should be compared to Homebrew, MacPorts, nix, pkgsrc, etc.
There's always Jim Waldo's proposed rule for the C++ standards committee.
Every extension proposal should be required to be accompanied by a kidney. People would submit only serious proposals, and nobody would submit more than two.
— Jim Waldo
I've never seen PEP582 until now, but it looks concerningly short. It doesn't discuss tradeoffs, alternatives, and is nowhere near comprehensive enough in its examples. It just doesn't seem very thoroughly thought through... It has the feeling of "this is such a great and simple idea! What could possibly go wrong??"
Surprised there is so much love for Poetry. Having only used it sparingly (and having had some dependency issues with it at the time), I do not see the benefits over conda (the miniforge variant, to easily avoid Anaconda) or mamba if you care about speed.
Conda is missing a significant number of python packages and many they do have are perpetually out of date - coincidentally that includes poetry and pdm.
Maybe dependency hell is a wicked problem and cannot really be solved? What is the gold standard to aspire to? npm is mentioned in the post, but it's not clear if that is just about lowering the barrier to entry for new users.
What are the fundamental limitations to python/pip supporting the installation model of npm, where multiple versions of a dependency can happily co-exist?
So this is sort of like a virtualenv, except you don't need to activate anything, it just looks within the main project folder for a __pypackages__ folder, and uses that to look for packages
Pipenv is only focused on solving one problem: deploying complete packages you wrote to a server you have complete control over. It doesn't really care about the problem of developing applications and libraries for distribution and installation by third parties, on platforms running a different Python version/OS etc. than what it was developed on.
You kind of answered your own question. Virtualenv alone is too much work so you're using a tool like pipenv to streamline it. This is a similar idea to have less ritual and a faster developer workflow.
You can use any tool you like, but it triggers me every time someone says something misinformed about `conda`. So if you have anything negative to say about it, please read the official Managing Environments page of `conda` before doing so [1]
* Please don't confuse `conda` with Anaconda, Miniconda & Conda-Forge.
* You can use `conda` to manage your environment and it can download and install ALL your packages from `pypi.org` and not Anaconda.
I'm sorry, but I've had so much trouble setting up IDEs and had 3 major installation/update issues with PDM just from the start - it is unusable for production. Poetry has never failed for me even once, yes, it was slower back in the day, yet it worked like a charm properly resolving dependencies.
Why is "doesn't need to create a virtualenv" a good thing? Virtualenv is just PATH enhancement. I always thought of it to be simpler than the npm run-script magic.
Really wish the Python core developers would focus on this situation and consolidate on one official solution for installation, env and package management, and one-click distribution.