
I was recently explaining this here — you still end up with a virtualenv so it's not a difference in capabilities but rather ease of use:

1. It transparently creates the virtualenv for you

2. The Pipfile format handles dependencies and version locking (including package hashes), as well as updates. Versions won't change without your knowledge, but upgrading everything to the latest versions is just a matter of running "pipenv update": the virtualenv is completely rebuilt (i.e. you'll never forget to add a dependency to a requirements file) and the lock file is updated, so the next time you push your code the same versions you tested are certain to be used. (A rough sketch of the Pipfile format is below.)

3. It'll automatically load the .env file for every command – i.e. your project can have "DJANGO_SETTINGS_MODULE=myproject.site_settings" in that file and you will never need to spend time talking about it in the future.

4. It separates regular and development dependencies, so you don't install as much on servers.

5. "pipenv check" will tell you whether any of the installed packages have known security vulnerabilities.

6. The Pipfile also includes the version of the Python interpreter, so e.g. your Python 2 project will seamlessly stay on 2.7 until you upgrade, even if your system default Python becomes 3.
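
For reference, a minimal Pipfile covering points 2, 4 and 6 looks roughly like this (the package names are just examples):

  [[source]]
  url = "https://pypi.org/simple"
  verify_ssl = true
  name = "pypi"

  [packages]
  # loose production pins; exact versions and hashes land in Pipfile.lock
  requests = ">=2.18"

  [dev-packages]
  # only installed with "pipenv install --dev"
  flake8 = "*"

  [requires]
  python_version = "2.7"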

None of this is something you couldn't do before, but it's just easier. Every time a Python point release happens you have to rebuild your virtualenvs, and now that takes 5 seconds and no thought.




To further elaborate on 2, it solves the problem of maintaining loose version ranges in your requirements.txt file while still keeping the versions pinned when you deploy. For example, putting `foo>=2` in your requirements.txt is dangerous without some way of pinning it to e.g. `foo==2.18.2` and running your tests against that before you deploy. But you obviously don't want to manually edit requirements.txt with minor version numbers every time you update. In the past I've maintained a separate file with loose versions and then updated packages with

  pip install -r requirements-to-freeze.txt --upgrade && pip freeze -l -r requirements-to-freeze.txt > requirements.txt
Pipenv makes this much nicer.
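
With pipenv the same loop is roughly the following (foo is just a placeholder package; check `pipenv --help` for the exact flags):

  pipenv install "foo>=2"   # writes the loose pin to the Pipfile, the exact resolved version to Pipfile.lock
  pipenv update             # re-resolves everything, rebuilds the virtualenv, rewrites Pipfile.lock
  pipenv install --deploy   # on the server: aborts if Pipfile.lock is out of date, otherwise installs what it pins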


Don't forget that usually you'll start to sort your requirements into dev requirements and production requirements, which makes these packaging scripts much more complicated.

https://github.com/jazzband/pip-tools would be what I used before pipenv came to be.
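
The basic pip-tools loop, for anyone who hasn't tried it (the file names are just the usual convention):

  pip-compile requirements.in   # resolves the loose pins in requirements.in into a fully pinned requirements.txt
  pip-sync requirements.txt     # installs exactly that set, removing anything else from the virtualenv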


Two features I miss from pip-tools:

1. `pip-sync`, an easy way to ensure my local environment actually matches my defined requirements. I guess the pipenv version of this would be `pipenv uninstall --all && pipenv install`, which isn't quite as elegant, but perhaps good enough.

2. The ability to create more than two requirement sets. For my projects it's often handy to have three sets of requirements:

• Normal production requirements that end users need to run the app

• CI requirements needed for testing, but not running the app in production (Selenium, Flake8, etc)

• Local debugging tools (ipython, ipdb)

I could include my local debugging tools in the `--dev` requirements, but then I'm unnecessarily making my CI builds slower by adding packages that should never actually be referenced in committed code. Alternatively, I could leave them out of my dependencies entirely, but then I have to remember to reinstall them every time I sync my local env to ensure it matches the Pipfile.lock. (One pip-tools layout for this is sketched below.)
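
With pip-tools this can be done with one .in file per set, something like the following (the file names and the -c constraint trick are just one way to lay it out):

  # requirements.in  ->  pip-compile requirements.in
  django

  # requirements-ci.in  ->  pip-compile requirements-ci.in
  -c requirements.txt
  selenium
  flake8

  # requirements-dev.in  ->  pip-compile requirements-dev.in
  -c requirements.txt
  ipython
  ipdb

CI installs requirements.txt plus requirements-ci.txt; locally you run `pip-sync requirements.txt requirements-ci.txt requirements-dev.txt` so the dev tools come back after every sync.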


  pip-compile --upgrade
  pip-compile --upgrade-package
are also necessary features to quickly track your dependencies (and transitive deps).

pipenv uses pip-tools, but they haven't exposed these features as far as I can tell.
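
For reference, the pip-tools usage is roughly (requests is just a stand-in package name):

  pip-compile --upgrade                    # re-resolves everything to the newest versions the specifiers allow
  pip-compile --upgrade-package requests   # bumps only requests, leaving everything else pinned where possible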



