Poetry in Production


I regularly use poetry to isolate development environments while I'm putting applications together. I've been happy with it, and I've developed a number of methods for using it in different environments.

For production, there are a number of different mechanisms used by people in the poetry community:

  • use poetry directly (poetry run application)
  • install directly in the root environment after exporting requirements (poetry export --without dev -o requirements.txt)
  • use an in-tree virtual environment
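
As a rough sketch, here is what each of those tends to look like on the target machine (the myapp name and paths are hypothetical):

    # 1. Run the application through poetry itself
    cd /srv/myapp && poetry run python -m myapp

    # 2. Export the locked requirements and install them into the system environment
    poetry export --without dev -o requirements.txt
    pip install -r requirements.txt

    # 3. Build an in-tree virtual environment (the approach described below) and run from it
    poetry config virtualenvs.in-project true
    /srv/myapp/.venv/bin/python -m myapp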

Decision process

I've tried all of them, and you can make any of them work. However, after some investigation, I've settled for now on the in-tree virtual environment for ease of use.

I'd recommend against installing directly in the root environment if you can avoid it, because of the conflicts that arise if you ever need more than one application's environment on a server. You might think this would never come up, since you should be isolating your servers anyway, and in a minimalist container environment that's likely true.

In our case, since we use slightly heavier-weight containers (Solaris Zones), I occasionally have other tools (like background processes that may be working on the same data) in the same zone. You can still run into dependency conflicts there, so the virtualenv isolation brings real benefits.

Once you've decided to run in a virtual environment, the question becomes where to put the virtual environment itself. For development, I prefer to leave it in the default (cache) directories because it's easier for me to remake those environments en masse when I upgrade the Python interpreter(s).
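
For example, recreating a project's cached environment after an interpreter upgrade is only a few commands (poetry env remove --all requires a reasonably recent poetry):

    poetry env list          # show the environments poetry has cached for this project
    poetry env remove --all  # remove them
    poetry install           # rebuild against the current interpreter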

For production environments, the rebuild problem isn't an issue and the execution environments are generally limited. At this point, it's really a matter of tidiness and standardization.

Implementation

For our Solaris zones, the virtual environments could go anywhere, but I prefer the ability to nuke and reconstitute them from source quickly, without having to hunt down the virtualenv directory.

In the docker environments that I use for some applications, the problem becomes a bit more acute. Since I want to build the install environment in a python-dev container and then deploy it with a runtime container, I need to copy everything over, and a standard location makes that easier.

As such, my installation process tends to be:

  1. Set up the in-tree virtual environment:

    pip install poetry
    poetry config virtualenvs.in-project true
    
  2. Create the execution environment and capture its location (which should be .venv in the project directory):

    poetry env use python3
    poetry env info --path

  3. Export the main-only requirements to a file for installation (skipping hashes in our environment because we have some home-built packages that we don't gather hashes on yet):

    poetry export --only main --without-hashes --output /tmp/requirements.txt
    
  4. Install the requirements into the virtual environment, using that environment's pip so they land in .venv rather than in the system site-packages:

    .venv/bin/pip install -r /tmp/requirements.txt
    
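Once that's done, a quick sanity check that everything actually landed in the in-tree environment:

    # Both of these should point at the project's .venv, not the system interpreter
    .venv/bin/python -c "import sys; print(sys.prefix)"
    .venv/bin/pip list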

I have used these steps successfully both in a Dockerfile (using a multi-stage build and copying the .venv over) and in Ansible for deployment to our Solaris zones.

Ansible

This isn't a complete Ansible playbook, but it should give you an idea of how to construct an effective one:

- name: Install poetry
  pip:
    name: poetry
    state: present

- name: Set up the in-tree virtual environment
  command: poetry config virtualenvs.in-project true
  args:
    chdir: '{{ program_base }}'

- name: Create the virtual environment if it doesn't exist yet
  command: poetry env use python3
  args:
    chdir: '{{ program_base }}'

- name: Capture the execution environment
  command: poetry env info --path
  register: poetry_env
  args:
    chdir: '{{ program_base }}'

- name: Export the main-only requirements to a file for installation
  command: poetry export --only main --without-hashes --output /tmp/requirements.txt
  args:
    chdir: '{{ program_base }}'

- name: Install the requirements in the virtual environment
  pip:
    requirements: /tmp/requirements.txt
    virtualenv: "{{ poetry_env.stdout }}"

In this case, program_base is the directory where the pyproject.toml file is located.
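
A hypothetical invocation, with the project directory passed in as an extra variable (the playbook name and path are placeholders):

    ansible-playbook deploy.yml -e program_base=/opt/myapp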

Docker version

In the Docker version, you'd use a multi-stage build to create the .venv and then copy it over to the runtime container. Here's a simplified example:

FROM python:3.9 as python-dev
# Install poetry
RUN pip install poetry

# Set up the in-tree virtual environment
WORKDIR /app
COPY pyproject.toml poetry.lock ./
RUN poetry config virtualenvs.in-project true

# Create the execution environment and capture its path (should be /app/.venv)
RUN poetry env use python3 && poetry env info --path > /tmp/poetry_env_path

# Export the main-only requirements to a file for installation
RUN poetry export --only main --without-hashes --output /tmp/requirements.txt

# Install the requirements into the in-tree virtual environment (using its own pip)
RUN /app/.venv/bin/pip install -r /tmp/requirements.txt

# Runtime container
FROM python:3.9

# Copy the virtual environment from the python-dev stage
COPY --from=python-dev /app/.venv /app/.venv

# Set the virtual environment as the default Python environment
ENV PATH="/app/.venv/bin:$PATH"

# Copy your application code to the container
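# (if a local .venv exists in the build context, exclude it via .dockerignore so it doesn't overwrite the one copied above)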
COPY . /app

# Set the working directory
WORKDIR /app

# Run your application
CMD ["/app/.venv/bin/python", "app.py"]

This is a simplified example, but it should give you a good start. If you are installing only modules (for example if your build steps result in wheels), you will need to make some modifications.
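
For instance, if your project itself is packaged as a wheel, a sketch of the extra build-stage steps (assuming poetry's standard build command and the same /app/.venv layout as above) might look like:

    # Build a wheel for the project itself (poetry puts it in ./dist by default)
    poetry build -f wheel

    # Install it into the in-tree environment alongside the exported requirements
    /app/.venv/bin/pip install dist/*.whl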