Drone.io, pipenv, Docker, and the Quest for CI/CD

I wanted to use Drone.io to run tests for an app that uses pipenv for dependency management. Turns out it wasn’t as simple as I thought. The problem isn’t specific to Drone but is more of a challenge with Docker and pipenv. I couldn’t find a good solution when browsing help forums, so I’m posting the solution I eventually arrived at.

Goal: Fast Drone tests with pipenv

Problem: pipenv installs dependencies into a random(ish) directory, and the install is relatively slow

“The virtualenv’s name is the project’s root directory name, plus the hash of the full path to the project root.” (source)
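
To make the naming concrete, here’s a quick sketch (the hash suffix and home directory are illustrative, not real output):

# By default pipenv creates the venv under ~/.local/share/virtualenvs,
# named after the project directory plus a hash of its absolute path
cd /drone/src/github.com/my-org/my-repo
pipenv --venv    # prints something like /root/.local/share/virtualenvs/my-repo-x3vGbW2k

# With PIPENV_VENV_IN_PROJECT=1 the venv is created inside the project instead
PIPENV_VENV_IN_PROJECT=1 pipenv install
pipenv --venv    # prints /drone/src/github.com/my-org/my-repo/.venv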

Solution

In Docker:

  • Use the pipenv Docker image as the base so pipenv is correctly installed

  • In your image, set the working dir to where the src will eventually be cloned to

    • You need to set this as the working dir because of the way virtualenv names directories (see above). If you create the virtualenv elsewhere and then try to copy it into the src directory, it won’t work. More info here. You could simply reinstall every time, but that slows things down considerably.
  • Set PIPENV_VENV_IN_PROJECT=1 so the virtualenv is created in the project dir (source)

  • Run pipenv install

  • Move the .venv to a different folder so it’s not deleted when the git clone happens (the sketch after this list shows the whole dance)
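
Condensed, the whole dance looks like this (the paths match the example Dockerfile and .drone.yml further down; treat it as a sketch of the flow rather than a literal script):

# Build time, inside the Dockerfile: create the venv at the exact path Drone will
# later clone into, then park it somewhere the clone can't wipe it
cd /drone/src/github.com/my-org/my-repo
PIPENV_VENV_IN_PROJECT=1 pipenv install --dev --deploy   # creates ./.venv
mv .venv /app

# Test time, inside the Drone step: Drone clones the repo into that same path,
# then we move the venv back and run the tests
cd /drone/src/github.com/my-org/my-repo
mv /app/.venv .
pipenv run pytest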

In Drone:

  • Use our new image as base

  • The default starting directory is the root of my git repo

  • mv /app/.venv .  moves our pipenv dependencies into the cloned folder

  • pipenv install --dev --deploy installs any dependencies that weren’t already in our image

    • --deploy makes it so we get an error if the Pipfile.lock is out of date

    • --dev just makes us install dev dependencies along with the regular ones

  • pipenv run pytest works great now

  • To deploy to GAE:

    • Create a requirements.txt from the Pipfile (source); see the sketch after this list

    • Deploy to GAE with those requirements
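
For the GAE step, generating the requirements file is basically a one-liner. This is a sketch of what a script like tools/create_requirements.sh (referenced in the .drone.yml below) might do; pipenv lock -r was the flag at the time of writing (newer pipenv versions use pipenv requirements instead), and the app.yaml name is an assumption about your GAE setup:

# Export the locked, non-dev dependencies so GAE can install them
pipenv lock -r > requirements.txt
# Deploy with the standard gcloud tooling (assumes an app.yaml exists in the repo)
gcloud app deploy app.yaml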

Thanks:

https://pythonspeed.com/articles/pipenv-docker/  -  for some inspiration

Example Dockerfile.buildbox:

FROM kennethreitz/pipenv
WORKDIR /drone/src/github.com/my-org/my-repo
COPY Pipfile* ./
ENV PIPENV_VENV_IN_PROJECT=1
RUN pipenv install --dev --deploy
RUN mv .venv /app
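
To make the image available to Drone, build and push it under whatever name your .drone.yml references; the tag below just mirrors the example pipeline:

docker build -f Dockerfile.buildbox -t us.gcr.io/my-gcp-project/private-custom-build-env-image:somehash .
docker push us.gcr.io/my-gcp-project/private-custom-build-env-image:somehash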

Example .drone.yml:

pipeline:

  run_core_tests:
    # the &buildbox_image of the line below creates a variable that we can call elsewhere with *buildbox_image
    image: &buildbox_image us.gcr.io/my-gcp-project/private-custom-build-env-image:somehash
    pull: &pull_enabled false # This is important so we don't accidentally get stuck with an old build image during dev
    commands: # These should be cleaned up into a build step and a test step
      - mv /app/.venv .
      # Dev makes it so we install dev dependencies
      # Deploy makes it so we will throw an error if Pipfile.lock is out of date (aka out of sync with Pipfile)
      # https://pipenv.readthedocs.io/en/latest/cli/#cmdoption-pipenv-install-d
      # Most dependencies should already be installed from the docker image, however, if we added some recently this ensures they'll be installed
      - pipenv install --dev --deploy
      # Verbose makes output a bit more readable
      # Capture=no allows us to see print statements
      # pythonwarnings allows us to hide the python warnings
      - pipenv run pytest --verbose --capture=no --pythonwarnings=ignore
      # Now that the tests passed, lets make the requirements.txt in case we want to deploy to GAE
      # This needs to happen in the same block as pipenv install in case we have any new dependencies
      - ./tools/create_requirements.sh
    when:
      event: [push, tag]

PS: This blog post was originally published on Medium, back before all the paywalls. It was moved here April 2024.