How to Add a Dockerfile to Your uv Project

A good Dockerfile is like a clean kitchen; you can step in, cook your app, and know exactly what’s going into the final product. For Python projects that use uv, adding a Dockerfile is straightforward, but a few best practices will make your image smaller, faster to build, and easier to maintain.

In this article, we’ll walk through building a Dockerfile that:

  • Installs uv right at the start to take advantage of Docker layer caching
  • Keeps the build layers lean for faster CI/CD pipelines
  • Runs your app in a secure, production-ready way
  • Integrates with Docker Compose for live code watching and rebuilding
  • Uses a multi-stage build to create a slim, production-ready image

Why start with uv?

When you add uv at the very beginning of your Dockerfile, you give Docker a stable, cacheable layer for your dependency installation process. Since dependencies change far less often than source code, this means faster rebuilds.
It also ensures your build environment is predictable—no mismatched Python versions or rogue dependencies creeping in.
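
The principle in miniature: pin the uv binary, copy only the lockfiles, and install before copying the rest of the source, so the dependency layers stay cached until the dependencies themselves change. (This is a stripped-down sketch; the full multi-stage version appears in the next section.)

FROM python:3.12-slim
COPY --from=ghcr.io/astral-sh/uv:0.8 /uv /uvx /bin/
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-install-project   # cached until the lockfile changes
COPY . .                                    # source edits don't invalidate the layer above
RUN uv sync --frozen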


Multi-Stage Dockerfile for a uv-Based Project

Multi-stage builds let you separate build dependencies from runtime dependencies, resulting in a smaller, more secure final image. The first stage installs dependencies and builds any required assets. The second stage copies only the necessary artifacts into a clean base image.

# ---------- Stage 1: Builder ----------
FROM python:3.12-slim AS builder

# Install uv.
COPY --from=ghcr.io/astral-sh/uv:0.8 /uv /uvx /bin/

WORKDIR /app

# Copy dependency file(s) first for caching
COPY pyproject.toml uv.lock ./

# Install dependencies in a separate, cacheable layer.
# --no-install-project skips installing the project itself, since the
# source isn't copied in yet (add --no-dev to exclude dev dependencies).
RUN uv sync --frozen --no-install-project

# Copy application source code
COPY . .

# Install the project itself now that the source is available
RUN uv sync --frozen

# ---------- Stage 2: Runtime ----------
FROM python:3.12-slim AS runtime

# Security: create a non-root user
RUN useradd --create-home appuser

WORKDIR /app

# Copy dependencies and source code from the builder, owned by the
# non-root user so the files aren't root-owned at runtime
COPY --from=builder --chown=appuser:appuser /app /app

USER appuser

# Default command (update for your project’s entry point)
CMD [".venv/bin/python", "main.py"]

Let's also add a .dockerignore file to keep the build context small. See Docker Build Context Is Bigger Than You Think for more details.

.git/
__pycache__/
.venv/
dist/
*.egg-info
.env
.vscode
.coverage
htmlcov/
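
To check that the .dockerignore file is actually trimming the context, BuildKit reports how much data it transfers to the daemon at the start of every build. A quick way to see just that line (the exact output format may vary by Docker version):

docker build --progress=plain . 2>&1 | grep "transferring context"

If the reported size is still large, something (a data directory, old build artifacts) is probably missing from .dockerignore.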

Benefits of This Approach

  • Smaller image size — the runtime stage doesn’t include build caches or unused packages.
  • Faster rebuilds — dependency caching still works thanks to uv in the first stage.
  • Cleaner production environment — no leftover tools that aren’t needed at runtime.

Using Docker Compose with watch for Live Development

When developing locally, you often want changes in your source code to be reflected inside the running container—without rebuilding the image every time.

Docker Compose’s develop.watch feature makes this easy. It allows you to:

  • Sync files from your local machine into the container in real time
  • Ignore certain paths like .venv/ to prevent unnecessary syncing
  • Trigger image rebuilds when critical files (like pyproject.toml) change

Here’s an example docker-compose.yaml for local dev:

services:
  app:
    build:
      context: .
      dockerfile: Dockerfile
    develop:
      # Create a `watch` configuration to update the app
      watch:
        # Sync the working directory with the `/app` directory in the container
        - action: sync
          path: .
          target: /app
          # Exclude the project virtual environment
          ignore:
            - .venv/

        # Rebuild the image on changes to the `pyproject.toml`
        - action: rebuild
          path: ./pyproject.toml

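One caveat worth knowing: the sync action only copies changed files into the container; it does not restart your process. If your app doesn't watch its own files and reload, the sync+restart action (available in recent Compose releases) restarts the service after syncing. A sketch of that variant:

        # Restart the service after syncing, for apps without auto-reload
        - action: sync+restart
          path: .
          target: /app
          ignore:
            - .venv/
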
Build the image

docker compose build

Running Your App in Docker

Development (live reload):

docker compose up --watch

Production (multi-stage build):

docker build -t my-uv-app .
docker run my-uv-app
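
If your app listens on a port (say 8000, an assumption about your entry point), publish it when you run the container:

docker run --rm -p 8000:8000 my-uv-app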

Final Thoughts

By combining:

  • uv at the start for caching and reproducibility
  • Docker Compose watch for fast development feedback
  • Multi-stage builds for smaller, secure production images

…you create a Docker workflow that works just as well for local iteration as it does in production deployment. This approach keeps DevOps pipelines lean and developers happy.