PDM: A Modern Python Package Manager (fming.dev)
192 points by sciurus on Dec 9, 2021 | 156 comments


I tried PDM earlier this year and there are a few things worth noting:

- PEP 582, which it is based on, is still in draft, and some tools (VS Code) won't fully support it until it's accepted.

- If you want to develop or test using different Python versions, you still need to use a virtual environment. PDM does handle this for you though.

IMO, the Python packaging ecosystem has been a dumpster fire for a long time, but the flurry of recent PEPs that provide a standard way of defining a package with pyproject.toml have made it so much better. Now it's just a matter of the tools catching up.


Dumpster fire compared to what? Nuget? NPM? It may be that some more recent languages managed to start with a saner solution and keep it sane.

But really, it's a hard problem, between cross-platform support, backwards compatibility, security concerns, hosting, most authors being volunteers and so on.

And still, even with "just" pip or even conda I am enjoying the Python experience more than some other packaging solutions I've seen.


It's objectively a dumpster fire. I don't care about other languages also being a dumpster fire on the packaging front, because I don't use other languages :) It also doesn't help me to know that NPM or .NET have rubbish packaging systems, that's their problem and not mine. I'd primarily want python's packaging system to be good.

I mean, just look at this thread. Someone asks "So in light of this, what should I use for python packaging?", and they get two dozen different answers loaded with weirdness like pyenv, python-venv, virtualenvwrapper, etc... If I wasn't using python, I would've thought this is some cruel python in-joke that outsiders don't get.

Just looking at those names, I'm already confused as to what the hell each one does and why I need them.

But let's go back to pip and conda. Conda is unbearably slow. Pip is not entirely reliable in resolving versions properly. Conda and pip also don't interact entirely cooperatively: if you use conda, you should not use pip (or only minimally), because mixing them results in a mess.

Yes, packaging is hard, but it feels like python has managed to solve (or not solve) it in a uniquely obtuse and bad way so far.

Hopefully the slew of new PEPs will finally bring some clarity to this mess.


When you say Python's way is uniquely obtuse, you are making a comparison to other packaging systems.

Labeling something a "dumpster fire" is a value judgement. It can't be "objective", unless there is a physical dumpster that is burning. It's not like the Python community hasn't solved this problem because it's lazy or incompetent. It's just a hard problem.


> Pip is not entirely reliable in resolving versions properly

It must be true because people say something like this any time a "python packaging sucks" thread comes along ... but I can't recall ever experiencing that in 10 years of Python-ing

The people having problems always seem to mention Conda though

I guess the parts that have problems are primarily where FFI and compilation or linking of non-Python code is happening?


Compared to nothing. Dumpster fire just means it sucks. To be fair, most of them do though.


If Python’s packaging is a dumpster fire, then what is Perl’s packaging?


Whoa there. Setting aside the stagnation of Perl due to the v6 debacle (a fate, by the way, that Python 3 came very close to sharing), CPAN is widely recognized as a very successful package system, and is frequently the envy of many other languages. DBI, the Net:: space, and many others just work, and the package names follow common sense.


I started a new Python project last month. I tried both Poetry and PDM but decided not to use either of them. PDM is currently basically a one-man show, and Poetry's docs aren't great: the doc page looks pretty, but it only describes command-line usage and doesn't explain how to configure metadata. Most importantly, Poetry does not support the PEP 621 standard yet.

So I stick with this setup:

- Use pyenv to manage different Python versions and virtual environments.

- Use the standard PEP621 specification as a high-level dependency description: https://www.python.org/dev/peps/pep-0621/#example

- Use pip freeze > requirements.txt as a "lockfile".
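
Roughly, that workflow looks like this (a sketch: "myproj", the versions, and the flit_core backend are all illustrative, and pyenv-virtualenv is a separate plugin):

    $ pyenv install 3.10.0
    $ pyenv virtualenv 3.10.0 myproj   # per-project virtualenv
    $ pyenv local myproj               # auto-activated in this directory
    $ cat pyproject.toml
    [build-system]
    requires = ["flit_core>=3.2"]
    build-backend = "flit_core.buildapi"
    [project]
    name = "myproj"
    version = "0.1.0"
    dependencies = ["requests>=2.26"]
    $ pip install .                    # installs the project and its dependencies
    $ pip freeze > requirements.txt    # pin the resolved environment as a "lockfile"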


    $ poetry init
    $ poetry add django
    $ poetry shell
It's pretty simple. Check in the lock file, and then run

    $ poetry install 
to replicate it.

> - Use pip freeze > requirements.txt as a "lockfile".

There's lots of reasons to not do this anymore, and Dependency Hell is real, and has been for 25 years with Red Hat RPMs, etc.

Even if you don't want to rely upon poetry for building in prod, poetry can still export a requirements.txt file for you, so you're not locked into using poetry, but you still get to specify the high level packages you want, and let it solve the dep graph for you.
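
E.g. (the export command as of Poetry 1.x):

    $ poetry export -f requirements.txt --output requirements.txt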


> There's lots of reasons to not do this anymore, and Dependency Hell is real

If you want to be relaxed about dependencies, you can use "pip-chill".

    pip install pip-chill
    pip-chill > requirements.txt
And, if you are even more relaxed,

    pip-chill --no-version > requirements.txt


That probably works for smaller projects without many dependencies, but it’s just going to install the sub-dependency versions that satisfy whatever comes last in the requirements file. The pip docs describe that situation here: https://pip.pypa.io/en/latest/topics/dependency-resolution/

The pip docs also suggest using pip-tools to create lock files. Pip-tools is only for creating lock files (it’s not trying to fix virtualenvs like poetry is), and it works great.


Yeah our codebase has a requirements file that takes over an hour to install with the new dependency resolver (and over 10 minutes using the deprecated resolver.) This is on a 6c/12t ryzen with 32g of ram and a gigabit connection.


This has been my experience, too, and with a moderately simple set of dependencies. I don't understand why this is considered acceptable enough to deprecate the old resolver.


it seems like you are posting on the wrong platform, github is that --> way :}

the pip team has introduced strict performance checks to ensure the new resolver was not significantly slower than the old. they solicited requirements files to weed out situations exactly like the one you describe. for example https://github.com/pypa/pip/issues/8664

if you want it improved, please report it.


I believe pipenv (not pyenv!) is also a viable option for correct versioning, though I'm not actually sure whether it or pip-tools is more actively developed these days. Last I used pipenv though (~2 years ago), it was a nicer virtualenv + pip-tools combination, but had worse version resolution / less useful verbose output for no apparent reason (since iirc it shares tons of code with pip-tools).

Putting that aside though: yes, 100% pip-tools or an equivalent (which pyenv is not). It's the only sane way to both freeze dependencies for reproducibility, and maintain upgradability. I've used pip-tools for years on many large, complex projects, and it has been a huge benefit every single time. And it routinely revealed significant library-incompatibility problems that teams had only luckily dodged due to not using feature X yet, because pip's resolver has been so braindead for forever.


Pipenv quickly becomes unusably slow. As in half hour to change one dependency. Poetry is strictly better, and I say that believing poetry is not a great solution either. There aren't many good, general recommendations to give people regarding Python package management, but "not pipenv" is one of the few I'm confident in.


How long ago did you try pipenv? I don't use it for large projects (just a few dozen dependencies); it is not fast, but it works. (I expect it to be just a convenience wrapper around corresponding functionality provided by pip/pip-tools, and there were substantial changes in pip's dependency resolution.)


Good to know, thanks :)


Interesting. So use `pip-compile` instead of `pip freeze > requirements.txt`?


Sort of - you need to make a requirements.in file that specifies only your top level dependencies, then you run pip-compile on that file to generate the requirements.txt.
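
A minimal sketch ("django" is just an example top-level dep):

    $ echo "django" > requirements.in
    $ pip-compile requirements.in   # resolves the full graph, pins it into requirements.txt
    $ pip-sync requirements.txt     # make the current environment match the lockfile exactly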


So another requirements.in file ignoring PEP621? I'm confused...

Maybe I will just try it first. pip-tools seems to acknowledge PEP 621:

https://pip-tools.readthedocs.io/en/latest/changelog/?highli...

https://github.com/jazzband/pip-tools/issues/1047


pip-tools predates PEP621 by ~5-8 years, depending on how you count. So yes, it "ignores" it.


> - Use pip freeze > requirements.txt as a "lockfile".

This is not and has never ever been correct. It makes it infinitely harder to install an application vs the standard `pip install -e .` which works on every package manager and avoids PYTHONPATH issues, as well as being able to publish your application to PyPI for easy installation (as simple as pip install --user app or pipx install app).


PEP621 literally includes how Poetry lists dependencies as a synonym for the PEP's "dependencies" section. Poetry does, in fact, adhere to PEP621.


I 100% agree. I’ve made over 100 projects in the past 8 years of being deep into Python, ranging from enterprise software to little hobby projects, and I’ve settled on exactly this setup. It’s great when you have multiple projects on the same machine with different dependencies and versions, maybe even specially forked and altered versions of libraries. It’s super versatile and easy to share, and I have no problems running my software at all. I’ve even written a script so that each time I commit to git it quickly generates a new requirements file, so it’s always up to date. Thank you for sharing.


Genuine question: If I am starting a Python project NOW, which one do I use? I have been using pipenv for quite some time and it works great, but locking speed has been problematic, especially after your project grows large enough (minutes waiting for it to lock without any progress indication at all).

Should I just upgrade to Poetry or should I just dive headfirst into PDM? Keep myself at Pipenv? I'm at a loss.

Thanks in advance!


I have never experienced any issues with a virtualenv and pip.

    $ python3 -m venv venv && source venv/bin/activate && pip install -r requirements.txt

The Python standard library is great and it's a nice language if you like the syntax, but aside from a few constants like Django, Flask, and Pandas, the ecosystem feels like it is slowly turning into a fragmented mess.


If you're building a package, pip install -e . is preferable to -r requirements.txt. Most projects don't need and shouldn't use requirements.txt. The only ones that do are where you're shipping a whole tested environment out, like a docker image for deployment. And in that case you need to be using something like pip-tools to keep that requirements file up to date.


Same, I use two small bash scripts in my project's bin/ directory: "venv-create" for the creation of the .venv/ and "venv-python" for running the effective version of Python from the .venv/ -- this sets environment variables such as PYTHONPATH and PYTHONDONTWRITEBYTECODE and provides a single approach for running project files and packages.

I get versioned requirements files for the project base requirements, and also for each (version, implementation) of Python in case there are changes, and this has proven to be reliable for me.

It's all about finding the minimal-but-complete convenience/ergonomic solution over the, err, inconvenience of packaging. I also marvel that when I attempt to explain these things to experienced programmers, I only manage to convince them 50% of the time at most.


That's what I use as well. It works great, it's built in and it's easy to use and understand. Only issue is when you upgrade the version of Python you're running. In that case you might need to rebuild your virtualenv, but that's super easy.

I use the same solution to have multiple versions of Ansible installed.

If you need to run multiple versions of Python, then virtualenvs might not be enough, but that's honestly not a problem I have. New version of Python? Great, let me just rebuild my virtualenv and get back to work.

One of the most important rules I have regarding working in Python is: Never, never ever, install ANYTHING with the global pip. Everything goes in to virtualenvs.


> python3 -m venv venv && source venv/bin/activate && pip install -r requirements.txt

... bless my `zsh` shell history for these incantations. I don't think I have any hope of remembering it -- probably because of all the old virtualenv incantations!

Kind of agree with pipenv though. It's painfully slow, but it abstracts away having to worry about various requirements files (eg: dev vs prod) and the .lock keeps things consistent.


I use this same setup but with a requirements folder. Inside I have a base.txt, dev.txt and tests.txt where dev.txt starts with '-r tests.txt', etc.

It's very handy for having different environments but doesn't lend itself to a lockfile.
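
E.g., a sketch (pytest and ipython stand in for whatever the real dev deps are):

    $ cat requirements/tests.txt
    -r base.txt
    pytest
    $ cat requirements/dev.txt
    -r tests.txt
    ipython
    $ pip install -r requirements/dev.txt   # pulls in dev + tests + base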


PDM author here. If anyone wants to know the advantage of __pypackages__ over virtualenv-based tools (venv, virtualenv, poetry, pipenv), here it is: the fact that virtualenvs come with a cloned (or symlinked) interpreter makes them fragile when users want to upgrade the host interpreter in place, unless you keep all the old installations on your system, which is what pyenv does. You can imagine how many interpreters, including virtualenv-embedded ones, are on your machine.

You can regard __pypackages__ as a virtualenv WITHOUT the interpreter; it can easily work with any Python interpreter you choose, as long as it has the same major.minor version as the packages folder.
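
Concretely, assuming a PDM from around this time (exact layout may vary; "requests" is just an example):

    $ pdm init            # creates pyproject.toml
    $ pdm add requests    # resolves, writes pdm.lock, installs into __pypackages__
    $ ls __pypackages__
    3.9
    $ ls __pypackages__/3.9
    bin  lib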


Use Poetry. It does everything you need and PDM is a bit too new still.


As a Python infra person, I would refine that to:

If you're building an application, use Poetry. If you're building a library, use Flit. Use PEP621 metadata in pyproject.toml regardless.

Poetry is much more focused on managing dependencies for applications than dealing with libraries that have to be used by other libraries or applications. See this deep discussion for some timely/relevant examples: https://iscinumpy.dev/post/bound-version-constraints/


> Poetry is much more focused on managing dependencies for applications than dealing with libraries

The linked article is quite long. Do you have a quick example? I've built lots of apps and libraries using Poetry and haven't run into any dependency issues, and you can specify versions constraints in the same way as Flit, so it's hard to see how they'd be different in that regard.

Edit: I think I see what the issue is--Poetry sets upper version bounds by default via its ^X.Y.Z and ~X.Y.Z syntax and advocates for always specifying upper bounds in their docs.

So, it's not that Poetry can't be used for a library, but you'll have to know that you can, and perhaps should, use different version syntax in some cases (i.e., the standard >=X.Y syntax).
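
In pyproject.toml terms, the difference is just the constraint syntax (sketch; somelib/otherlib are hypothetical):

    [tool.poetry.dependencies]
    somelib = "^1.2"     # Poetry's default style: means >=1.2,<2.0, i.e. capped
    otherlib = ">=1.2"   # no upper bound; friendlier when publishing a library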


Can we mix Poetry and PEP621 though?

I tried to use project.dependencies and project.optional-dependencies fields in pyproject.toml defined by PEP621. However Poetry seems to ignore it and it tries to stick with its own tools.poetry section.


Ah, you're right. I thought Poetry had adopted PEP 621 already, but it looks like there's still a debate about if/how they want to do that. A bit ironic considering the author of Poetry was listed as an author on PEP 621, and that was accepted just over a year ago.


It went final this March though, so Poetry might need more time to think about the transition plan.

Python packaging has been in chaos again since 2021 because of the rise of standardized PEPs.


I know you didn't mean it that way, but "since 2021"..? :D

I was working actively with Python (and loving it) between 2012-2015 (during which the 2 vs 3 transition/conflicts were raging) and the couple of times I've tried to get back into the groove since then (in side projects) I keep hitting walls and getting confused about how to even get the basic things running.

I've heard good things about Poetry but never tried it. PDM being closer to npm/yarn sounds like an attractive option (since those are very familiar), just to tinker around (not to build a complex application with specific requirements). Will give it a shot!


I've seen others make the same reference, but I've not understood it.

My package is a Python module and a set of command-line tools based on that module.

Is my package a library? Or an application? Because it seems to be "both".

It also includes several C extensions, using either the Python/C API manually or Cython. I would love to switch to something more modern than setup.py, but none of these new back-ends seem to support this. For example, I just checked - flit explicitly focuses on "pure Python packages with no build steps" "and leaves the hard things up to other tools".


Do you expect other people to add your project as a dependency of their projects, and call into your API? Then it's a library.

If the only expected user of your API is your own project, it's an application.

And yes, if you need to build native extensions, setuptools is still your only real option. It's not great, and results in hundreds of packages that all find their own definition of insanity. But given just how crazy the C/C++ toolchain ecosystem is, especially when you have to deal with Windows, macOS, and the diversity of Linux, no one else has been willing to tackle that problem because setuptools already exists.


Following that definition, the project I mentioned is a library.

But quoting PEP 517 "Much experience suggests that large successful projects often originate as quick hacks".

I have another project which started as a command-line tool only, meant for the end-user to build, maintain, search, and update a data set. It was not designed to be called by another program, neither through import nor popen. Following the guideline, I should use poetry.

It then acquired a limited Python API so search functionality could be incorporated in a web app. One of my users paid for that feature. I don't know if others use it.

This turned my package into a library. Does that mean I should switch from poetry to flit?

There seem to be several people who have tackled that problem, including enscons, PyQt-builder, and mesonpep517, which all defer to an existing third-party build system. I don't use or have experience with any of the underlying build systems, and version numbers like 0.28.0 and 0.2 make me wary.

Are there others? Finding such modules on PyPI is not easy! A search for "517" is almost useless. Could there be a new trove category for such packages?


Sold everyone on using poetry a few months ago and now I'm red-faced, as we have a litany of problems and wasted time due to it. We are now sitting on some bleeding-edge branch because specific dependencies cannot work at all without some newfangled feature, and everyone wishes we were just using plain virtualenv, as we had far fewer problems with that.


You would also not have any real control over what gets installed with just venv. So everyone might think the grass was greener on the other side, but it might not actually be, if you value reproducible environments.


What you've never spent hours trying out n! orderings of requirements.txt until you get the results you want/can live with? You have not lived until you've done this. /s


Dep Hell is real.


In other words, you work with people who have no clue what’s going on.


Thanks, that's good advice!


Great. I have heard "anecdata" that sometimes poetry fails to install a combination of packages or something along those lines. Did you find any of those shenanigans in your own experience?


(Not GP) Occasionally, but still use poetry.

The issues happen because poetry is strict about version constraints. Pip used to be lax, so some packages could get away with poor or overly restrictive constraints.

Now pip is strict as well, and poetry is also getting a lot more common, so maintainers have had pressure to fix most of these issues.


I've run into this a couple of times. Typically it's when starting a new project and gradually adding dependencies. By default, when adding (poetry add <dep>), poetry adds the most recent version of the library, and in pyproject.toml that becomes the minimum possible version. If you then later try to add something that depends on that library, but isn't up-to-date on the most recent version, poetry will (correctly!) error out because it fails to find a compatible solution. For example, something like this:

    poetry add ingredients  # most recent version: 1.0
    poetry add cake         # requires ingredients, but <1.0
So here, poetry would add ingredients to your pyproject.toml with version = "^1.0.0" in the first step. In the second step, when you tried to add cake as a new dependency, it would say it can't find a version of cake compatible with your pyproject.toml's requirement for ingredients.

The solution is to change pyproject.toml to be more permissive about the version of ingredients, resulting in it downgrading to a compatible version. Or submit an issue and/or PR with the maintainers of cake to bump up the supported versions.
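
Concretely, the manual fix looks something like this in pyproject.toml (hypothetical packages from the example above):

    [tool.poetry.dependencies]
    # before: ingredients = "^1.0.0"   -- allows only >=1.0,<2.0
    ingredients = ">=0.9,<2.0"         # now a <1.0 version can be chosen to satisfy cake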


In PDM, the add command supports an `--unconstrained` option to relax the version constraints to find a solution. You don't need to edit the file manually.
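
Continuing the hypothetical ingredients/cake example above:

    $ pdm add --unconstrained cake    # relaxes existing constraints while resolving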


Another way it happens: I was stuck on python 3.6 for a while and it meant that I couldn't just poetry add pandas because the latest version isn't compatible with 3.6 and it won't look for other versions.


No shenanigans for any library-ish code with few dependencies that targets relatively modern Python.

I've had one or two fundamental version conflicts with a 5+ year old application with 100+ dependencies and a decent amount of legacy stuff. They were a pain in the ass, and sdispater's stance on not allowing overrides is a pain in the ass. We ended up forking the upstream libraries to resolve the version conflict.

With all of that, poetry is amazing and a huge step forward. I'd advocate it wholeheartedly.


I've spent the last year or so comparing these packages. Initially, I liked poetry, but I began to run into issues related to client cert auth with private repositories. I and several other users who encountered these problems submitted pull requests to address them, but found that it was nearly impossible to get any of them reviewed, much less merged.

Out of frustration, I decided to try pdm instead, and although pdm doesn't handle pushing to pypi repositories, Twine works fine for it, and I have had a pain-free experience using pdm for dependency management.

I like the idea of poetry, but it's unusable if I can't get bugs fixed, and I gave up after nearly a year of trying.


> but I began to run into issues related to client cert auth with private repositories.

Yeah, I think the worst thing about private infrastructure in 2021 is getting those certs correctly installed and seen by the software on both sides.

I'm thinking that tailscale or something like it may be a better alternative in the future where authorization is at the link layer before the tcp connect can happen.


As others have said, this is related to how Poetry does dependency resolution. If you say your program uses Python "^3.8", you would expect any Python version in that range to work, but instead, ALL Python versions in that range must work.

If one of your dependencies won't work with 3.9, the dependency graph cannot resolve. The author of Poetry said that this is because Poetry is meant for libraries, where this makes more sense.
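
A sketch of the workaround, narrowing the range instead of using the default caret (somedep is hypothetical):

    [tool.poetry.dependencies]
    python = ">=3.8,<3.9"   # instead of "^3.8"; a dep that breaks on 3.9 can now resolve
    somedep = "*"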


Except Poetry's recommendations/decisions around version capping doesn't make sense for libraries. https://iscinumpy.dev/post/bound-version-constraints/


Is Poetry used a lot by Python devs? What package manager do most Python devs use?

I mostly code in JavaScript and (obviously) use NPM a lot, so it makes me wonder.


I've been using Poetry exclusively for the past few years. I had a couple small problems with it early on, but lately I haven't run into any issues.


I use it for all of my projects, yes. I haven't really had many problems, just one or two that were easily resolved.


Just go for virtualenv and pip. Virtualenvwrapper to have handy shell tools.


Alternatively, pyenv and pyenv-virtualenv for shell integration and seamless virtualenv activation.

To be fair, I'm not saying there's anything wrong with virtualenvwrapper, just that I've never used it and for my purposes the above solution works well.


This is our setup - fits perfectly, don't even miss `workon`


That's been my thing, but I do nothing of interest.

Maybe 3.11 can make python packaging less of a beautiful disaster.


This doesn’t solve dependency management? All it does is it separates your env and you can install what you need there. But installing with pip is still subject to version incompatibility etc.


Just go docker


Poetry is good. We've used it for 3 years in production without issues or large amounts of time spent resolving dependencies.

    $ poetry init # to init the pyproject.toml file.
    $ poetry add <packages> 
    $ poetry shell # to activate the virtualenv.
Commit the poetry.lock file after any change to your project dependencies, and other people can replicate the environment after a git pull by running

    $ poetry install


I tried them all a few months ago. Poetry was the best in my opinion.


Do you suffer from requiring different versions of the same library, or have some other kind of non-trivial dependencies?

This will mark me out as a Luddite but I am still quite happy with a 5 line setup.py and “pip install .”

https://news.ycombinator.com/item?id=29446715

  $ cat setup.py
  from setuptools import setup
  setup(
    install_requires=['arrow'],
    packages=['dogclock'],
    scripts=['scripts/dogclock'])


I just started a Python project, using PDM. So far, I like it a lot. If I hit a show-stopper, no big deal, it's pretty easy to switch.


Pipenv is garbage. Don’t use it.

I’d just use pip and requirements files if you can. It’s doubtful that your requirements are sufficiently complex as to require a more complex resolver, although that depends on your ML needs.


Having used PDM now for several projects, it's my preferred package manager over poetry and others. Its dependency resolver is both faster and more forgiving than poetry's. I also like the built-in task management system similar to npm's.


It's sad to see so many negative comments here.

Quoting the GitHub page[0]:

> PDM is meant to be a next generation Python package management tool. It was originally built for personal use. If you feel you are going well with Pipenv or Poetry and don't want to introduce another package manager, just stick to it. But if you are missing something that is not present in those tools, you can probably find some goodness in pdm.

Having used PDM a bit, its ambition in my opinion may not be to replace existing tools, but rather to experiment and implement the most recent PEPs related to packaging.

While you can argue about the PEP 582[1] implementation (which is still in draft), PDM doesn't prevent anyone from using virtual environments, and even provides a plugin[2] to support that. PDM also implements PEP 631[3], which most other package managers have been reluctant to support or slow to adopt.

[0]: https://github.com/pdm-project/pdm

[1]: https://www.python.org/dev/peps/pep-0582/

[2]: https://github.com/pdm-project/pdm-venv

[3]: https://www.python.org/dev/peps/pep-0631/


Thanks for the kind words on PDM. When I created PDM I didn't want it to be similar to any other package managers, so I chose PEP 582, and I thought I could experiment with new ideas on top of it.

But as PDM has matured and been acknowledged by the Python packaging people, I have also worked hard to make PDM fit more people's workflows. Fortunately, it has a strong plugin system: you can add virtualenv support (pdm-venv), a publish command (pdm-publish), and more. In the future, I would like to see it push the iteration of PEP 582 and get it finalized.


Just made an account to say this. I am really impressed by your projects. I first found out about pdm after writing a small plugin for marko (which is amazing by the way) and checking out your github profile. I find what you write to be really well thought out and approachable.


God those PEP's are exactly what I need


This is understandable. But if it is experimental, please label it as such. There's no mention of that on the landing page right now.

EDIT: Oh, I should say: If it's meant to take over the world, say so, as well!


So, what's the difference with poetry? It seems pretty similar to me. Do we really need another python package manager?


The big distinguisher of PDM is that it supports PEP 582[0]. That means it works less like pip and more like NPM in the JS world. To quote PEP 582:

> This PEP proposes to add to Python a mechanism to automatically recognize a __pypackages__ directory and prefer importing packages installed in this location over user or global site-packages. This will avoid the steps to create, activate or deactivate "virtual environments". Python will use the __pypackages__ from the base directory of the script when present.

Thus, the idea of PDM is that it will create a directory, called `__pypackages__` in the root of your project and in that folder it'll populate all the dependencies for that project. Then, when you run scripts in the root folder of your project, your Python install will see that there's a `__pypackages__` folder and use that folder to look up dependencies.
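
So a project directory ends up looking roughly like this (the Python version is illustrative):

    myproject/
        main.py
        pyproject.toml
        __pypackages__/
            3.9/
                lib/    # dependencies resolved from here instead of site-packages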

This style of "dependencies inside the project directory" is similar to how npm of the Javascript ecosystem works, where it creates a `node_modules/` folder in the root of your project and fills that folder with the dependencies for your project. This style of dependency management is different from other package managers such as Poetry (Python), Pip (Python), go (Golang), and cargo (Rust), all of which instead have a sort of "secret folder acting as cache of dependencies at particular versions", a folder that's usually pretty hidden out of the way, in which the package manager automatically manages the acquisition, storage, and versioning/version resolution (Poetry, Go, Cargo, all do this but Pip does not).

That's a very fast and probably wrong rundown on what makes this package manager different from others.

[0] - https://www.python.org/dev/peps/pep-0582/


I’ve long been of the opinion that pip and venv (and sometimes pyenv) is good enough. PEP 582 is a rare instance where a new packaging proposal makes sense right away when I read it and could beat pip and venv in simplicity.

It seems functionally similar to venv but has the benefit of standardizing the location of dependencies to __pypackages__/3.x/*. With venv the developer selects some arbitrarily named directory that is sometimes but not always .venv/*.


venv and PEP 582 seem to solve overlapping but slightly different problems. venv packages a Python environment (interpreter and packages) without inherently tying it to a project: a project-specific one might be embedded in a project directory, a shared one might be used somewhere else, and you might even have multiple for the same project, e.g., for different Python versions or to validate against different versions of dependencies. PEP 582, on the other hand, ties dependencies (possibly for multiple interpreter versions) to a project, but leaves supplying and identifying the right interpreter to some other system.


Wow! This is basically dramatically simplifying to a "venv per script" type approach -- some basic file reorganization and you're solid.

I have not come across PEP 582, thank you for linking.


I like the "venv per script" approach. So it is like... you create a directory that contains main.py and foo.pdm or whatever its extension is, and then it grabs all the dependencies into that directory when you run a command and has a lock file and everything? I did not check out the website yet, that is why I am asking. If this is the case, then it is a win in my book.


Yes, PDM does some hacks to make it work with the familiar `python` executable. See the example at https://github.com/pdm-project/pdm/#quickstart. All you need is a few lines to set up your shell.
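
At the time of writing, that setup is (for bash/zsh; check the PDM docs for other shells):

    $ eval "$(pdm --pep582)"    # adjusts PYTHONPATH so `python` sees __pypackages__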


Basically, PDM supports project-specific Python package installs. This is different from how Python has traditionally worked, where packages are installed globally for the user running it. Why is this important? Because with virtual environments it's easy to forget to activate one, run a pip install or upgrade, and clobber your computer or server's Python environment. It also avoids the confusing issue where someone updates their PATH variable while in a venv, but then it's no longer there after exiting the virtual environment.


That doesn't seem very smart to me. The advantage of having the packages in a centralized location is that multiple projects can alias them.

Also this will just pollute your source directories with generated directories and files that shouldn't be there.


PDM also supports a centralized package cache like pnpm's, while still keeping each project's dependencies isolated.

> Also this will just pollute your source directories with generated directories and files that shouldn't be there.

I don't see why it is a problem, node.js also has node_modules in the project. And __pypackages__ is created with a .gitignore file so you won't commit it accidentally.


Why is node doing bad things a justification for other package managers doing bad things?


I'm not following why storing project scoped things at the project level is a bad thing.


Composer (php) also uses the "project local" approach


Cool to see an implementation of __pypackages__ support.

I'll still use Poetry, but this could be paving the way for Poetry to work without virtualenvs as well one day.


I think I'm starting to like poetry too!


PDM may also be a good fit for Blender, because of the per-project approach. Blender doesn't come with a package manager and has a varied relation to the system installed Python interpreters depending on platform and install choices.

Scripts, Plugins etc for Blender are currently distributed in a very ad-hoc way, and it is hard to get adoption with plugins that require more elaborate dependencies, especially binary modules.


That's cool. A local folder named '__pypackages__', I can get that into my head. This is the one obvious way to do it.


But please, let this be PIP, not PDM


This sounds great. Why hasn’t it been done before? It seems pythonic.


PDM’s support of PEP 582 seems promising. What would be even cooler is if the maintainers of pip agree with PEP 582 and incorporate it into pip itself.


So many of these... I now need a tool to manage the managers.


You jest and yet...

https://github.com/dephell/dephell

Dephell is a converter for python packaging systems. It can turn poetry files into requirements.txt, or setuptools' setup.py into pipenv's Pipfile etc.

Python Packaging: There is More Than One Way to Do It


I switched from Poetry to PDM and it feels great.

If there's one reason to use PDM: the maintainer fixes bugs fairly quickly, unlike Poetry, where many bugs are left open with nobody taking care of them.


I can totally confirm that. Great support, even when the fault was mine. Well done @frostming!


Sure why not, python needs yet another package manager. It's not confusing enough already! /s


I think competition in this space is good. Not all package managers will get and keep a big audience or following, but they act as independent proving grounds for new ideas and solutions.


Tell that to a sysadmin who's trying to guarantee their server uptime...

I use python almost every day and I find it a cluster fuck any time I have to install something using a specific version or update some packages. If it wasn't for containerization I would've lost hope very early on in the Python packaging ecosystem.


For people looking for a good package manager for Python, conda is the best. (Mamba is even better and is a drop-in replacement for conda; you can even do conda install mamba and use mamba from there.)

It doesn't replace your setup.py or pyproject.toml as a maintainer, though. For that I use poetry with pyproject.toml. As a maintainer, you need to release to PyPI first and then release to anaconda or conda-forge (I prefer the latter). Releasing on conda-forge seems more troublesome at first, but that process makes you trust conda-forge more than PyPI, which has virtually no barrier to entry.

As an end user, most packages you can pip install can also be conda/mamba installed. For those that don't exist, you can use pip install inside a conda environment (which is like a virtualenv); dependencies already installed via conda won't be repeated.

Conda is designed to be multi-platform and multi-language as well: npm, Julia, ffmpeg, etc. can be installed via conda. It is designed for the scientific stack, where there are a lot of non-pure-Python dependencies. In this aspect, it is more like a system package manager than a distro-provided one, and should be compared to homebrew, MacPorts, nix, pkgsrc, etc.
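
For reference, the commands look like this (versions and packages illustrative):

    $ conda install -n base -c conda-forge mamba   # mamba as a drop-in for conda
    $ mamba create -n myenv python=3.10 numpy
    $ conda activate myenv
    $ pip install some-pypi-only-package           # pip fallback inside the env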


There should be a rule that before you create a new Python package manager, you have to kill off an existing one first.


There's always Jim Waldo's proposed rule for the C++ standards committee.

Every extension proposal should be required to be accompanied by a kidney. People would submit only serious proposals, and nobody would submit more than two. — Jim Waldo


It doesn't say it has to be your own kidney.


Thanks! I was really missing yet another package manager


I've never seen PEP582 until now, but it looks concerningly short. It doesn't discuss tradeoffs, alternatives, and is nowhere near comprehensive enough in its examples. It just doesn't seem very thoroughly thought through... It has the feeling of "this is such a great and simple idea! What could possibly go wrong??"


It's still in a draft phase


Oh, no. Please, not one more.


Surprised there is so much love for Poetry. Having only used it sparingly (and having had some dependency issues with it at the time), I don't see the benefits over conda (the miniforge variant, to easily avoid Anaconda) or mamba if you care about speed.


Conda is missing a significant number of python packages and many they do have are perpetually out of date - coincidentally that includes poetry and pdm.


Conda uses its own non-portable, old, broken repos


Maybe dependency hell is a wicked problem that cannot really be solved? What is the gold standard to aspire to? npm is mentioned in the post, but it's not clear if that is just about lowering the barrier to entry for new users.


In the Java world, maven works really well. It’s my personal gold standard, I haven’t found anything in another language which works better.


What are the fundamental limitations to python/pip supporting the installation model of npm, where multiple versions of a dependency can happily co-exist?


Python should update their logo to that snake eating itself


This is already the logo of PyPy. The JIT Python interpreter. https://www.pypy.org/


From Python 4 onwards, the Python language will be referred to as Ouroboros.


Is virtualenv not very useful? I use pipenv and find it extremely useful having my various projects separate.

Also when deploying to kubernetes or whatever it works the same way


So this is sort of like a virtualenv, except you don't need to activate anything, it just looks within the main project folder for a __pypackages__ folder, and uses that to look for packages


I use pipenv

Pipenv is only focused on solving one problem: deploying complete packages you wrote to a server you have complete control over. It doesn't really care about the problem of developing applications and libraries for distribution and installation by third parties, on platforms running different Python versions/OSes than what it was developed on.


I use only virtualenv and have never really had any issues with it for separating out my project environments.

Perhaps this is more of an issue with larger projects, but I also noticed that 'no need to use virtualenv' is listed as a positive.


You kind of answered your own question. Virtualenv alone is too much work so you're using a tool like pipenv to streamline it. This is a similar idea to have less ritual and a faster developer workflow.


It's a single command more, and then you don't need to prefix everything with pipenv run.


> I use pipenv

> When deploying to kubernetes it works the same way

Can you elaborate on this?


How does it compare to poetry?


You can use any tool you like, but it triggers me every time someone says something misinformed about `conda`. So if you have anything negative to say about it, please read the official Managing Environments page of `conda` before doing so [1].

* Please don't confuse `conda` with Anaconda, Miniconda & Conda-Forge.

* You can use `conda` to manage your environment and it can download and install ALL your packages from `pypi.org` and not Anaconda.

[1] https://docs.conda.io/projects/conda/en/latest/user-guide/ta...


I'm sorry, but I've had so much trouble setting up IDEs and had 3 major installation/update issues with PDM just from the start - it is unusable for production. Poetry has never failed for me even once, yes, it was slower back in the day, yet it worked like a charm properly resolving dependencies.


How is it different from easy_install, conda, poetry, pip, pipenv, ...?


Not another one... please


Why is "doesn't need to create a virtualenv" a good thing? Virtualenv is just PATH enhancement. I always thought of it to be simpler than the npm run-script magic.


Really wish the Python core developers would focus on this situation and amalgamate one official solution for installation, env and package management, and one-click distribution.


Another one?


Does it handle compiling C stuff easier?

If not, then I'd stick with minimal pip3 or fullblown anaconda3


Hardcoding the python version from the start, not for me ...


Okay now we have even one more https://xkcd.com/1987/


Looks a lot like poetry



We keep having to repost this: https://xkcd.com/1987/ Why don't we ever learn?


Please use descriptive words instead of 'modern'. Modern, to me, in this context, means untested and unreliable.


>Modern, to me, in this context, means untested and unreliable.

that's what PDM is


"The Python Package Manager That Is Newer Than All The Others"


"Have you tried TPPMTINTATO? It's great!"


I can't wait for the postmodern package managers...


You `pdm install requests` and it installs Django.


packages are really a social construct


Inspired by Michel Foucault's Pypi and Punish


Another package manager for Python is exactly what the world needed


over and over until one doesn't suck


PIP doesn't suck, repeat after me. Please. Just PIP, which improves, that's the way. Let all other managers DIE.



