Vlad Filippov

full-stack software developer / open source hacker

Python dependencies with uv

Python and uv

Modern Python dependency management has a clear best-in-class answer for new projects: uv. It’s fast, it handles your entire toolchain, and it produces reproducible environments that behave identically across every developer’s machine, every CI run, and every deployment.

I’ve been looking for something to solve the Python dependency problem for many years, and I’m very happy to see uv succeed. In this post, I’ve documented the day-to-day workflow for a uv-based project, along with the common operations uv simplifies. If you’re maintaining an existing project built around requirements.txt, there’s a section at the end on interoperability.

A typical Python team used to need:

  • pyenv or asdf to manage Python versions
  • virtualenv or venv to create isolated environments
  • pip to install packages
  • pip-tools to compile and lock the full dependency graph
  • pip-audit for security scanning

uv replaces all of them with a single tool, written in Rust, that runs 10–100x faster than pip. It manages Python installations, creates virtual environments, resolves dependencies, and generates a lockfile — all in one consistent CLI. This eliminates the “works on my machine” class of problems and removes a significant chunk of the onboarding friction in Python projects.

Install it once per machine:

# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

# Or via pip if you prefer
pip install uv

New project

uv init my-project
cd my-project

This creates a minimal project structure:

my-project/
├── .python-version     # Pins the Python version for the project
├── pyproject.toml      # Project metadata and dependencies
├── README.md
└── src/
    └── hello.py

That’s your starting point. No virtualenv to create manually, no requirements file to bootstrap. The environment gets created automatically the first time you run anything.

pyproject.toml

pyproject.toml is the modern standard for Python project configuration (PEP 517/518). You don’t need to understand it deeply to use uv, but knowing the relevant parts helps. Here’s what a typical project looks like after adding a few dependencies:

[project]
name = "my-project"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
    "fastapi>=0.110.0",
    "httpx>=0.27.0",
    "pydantic>=2.6.0",
]

[dependency-groups]
dev = [
    "pytest>=8.1.0",
    "pytest-cov>=5.0.0",
    "ruff>=0.3.0",
    "pre-commit>=3.7.0",
]

The [project] block declares your direct runtime dependencies — what your application needs to run. The [dependency-groups] block is where environment-specific extras live. dev is the convention for development tools; you can add others like test or lint if your team prefers more granularity.

You generally don’t edit the version pins in pyproject.toml directly. uv manages them for you.

The Lockfile

When you add dependencies or run uv sync, uv generates a uv.lock file alongside pyproject.toml. This file pins every package in your entire dependency graph — not just your direct dependencies, but every transitive dependency too. It’s the equivalent of what pip-tools used to produce, but generated automatically.

Commit uv.lock to version control. This is what guarantees every developer on your team, every CI runner, and every deployment gets byte-for-byte identical environments. Never add it to .gitignore.


Adding and managing dependencies

# Add a runtime dependency
uv add fastapi

# Add a dev-only dependency
uv add --dev pytest ruff

# Remove a dependency
uv remove httpx

# Upgrade a specific package
uv lock --upgrade-package fastapi

# Upgrade everything
uv lock --upgrade

Every uv add or uv remove does three things atomically: updates pyproject.toml, re-resolves the full dependency graph, and rewrites uv.lock. You never manually edit version pins, and you never need to remember to run a separate compile step.

Syncing environments

When a developer pulls new changes that include uv.lock updates, they run:

uv sync

This installs or removes packages until the local environment exactly matches the lockfile. It’s idempotent — running it multiple times is safe and fast because uv skips packages that are already at the right version.

For CI and deployment, where you don’t want dev tools installed:

# Install only runtime dependencies, skip dev group
uv sync --no-dev

Python versions

One of the less obvious wins from uv is that it handles Python itself. Rather than relying on each developer having the right version installed via pyenv or system packages:

# Install a specific Python version
uv python install 3.12

# Pin the project to a version (writes to .python-version)
uv python pin 3.12

The .python-version file tells uv which Python to use for this project. Commit it. When a new developer clones the repo and runs uv sync, uv will automatically download and use the pinned Python version — no pyenv required, no “which Python are you using?” debugging.


Commands

uv run is how you execute anything in the project environment. It verifies the environment is in sync with the lockfile before running, so you’re always executing against the correct dependencies:

# Run your application
uv run python src/main.py

# Run tests
uv run pytest

# Run the linter
uv run ruff check .

# Start an interactive shell inside the environment
uv run python

If you prefer activating the environment directly (for example, in a long interactive session), the virtualenv lives at .venv:

source .venv/bin/activate  # macOS/Linux
.venv\Scripts\activate     # Windows

Workflow

For a team, the daily workflow looks like this:

# First time setting up the project
git clone https://github.com/your-org/my-project
cd my-project
uv sync                  # Creates .venv, installs everything from uv.lock

# Adding a new dependency
uv add some-package
git add pyproject.toml uv.lock
git commit -m "Add some-package"

# After pulling changes that include lockfile updates
git pull
uv sync                  # Brings your environment up to date

# Running the test suite
uv run pytest

The key discipline is simple: always commit both pyproject.toml and uv.lock together when dependencies change. Never commit one without the other.

Makefile

A Makefile is still a useful convention for documenting and standardizing how the team interacts with the project:

.PHONY: install test lint format check

install:
	uv sync

test:
	uv run pytest --cov=src --cov-report=term-missing

lint:
	uv run ruff check .

format:
	uv run ruff format .

check: lint test

New developers can run make install to get started and make check before opening a PR, without needing to know anything about uv’s internals.


CI/CD integration

uv is well-suited for CI because its global package cache makes repeated installs fast and its lockfile makes environments deterministic. Here’s a GitHub Actions workflow:

# .github/workflows/ci.yml
name: CI

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.14", "3.13"]

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5
        with:
          enable-cache: true

      - name: Set up Python ${{ matrix.python-version }}
        run: uv python install ${{ matrix.python-version }}

      - name: Install dependencies
        run: uv sync --frozen --no-dev

      - name: Run tests
        run: uv run pytest --cov=src --cov-report=xml

      - name: Upload coverage
        uses: codecov/codecov-action@v4

The astral-sh/setup-uv action handles installing uv and wiring up the GitHub Actions cache so downloaded packages are reused across runs. The enable-cache: true option is the key line — it can reduce a cold install from 60 seconds to under 5.

Note --frozen in the install step. This tells uv to use the lockfile exactly as committed and fail loudly if pyproject.toml and uv.lock are out of sync — which would indicate someone forgot to commit the lockfile after adding a dependency. Catching this in CI is much better than catching it in production.
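If you want to catch a stale lockfile before it even reaches CI, the same check can run locally as a pre-commit hook. Here’s a sketch — the repo and hook id come from Astral’s uv-pre-commit project, and the rev shown is a placeholder you should replace with a real upstream release:

```yaml
# .pre-commit-config.yaml (sketch — verify repo and rev against upstream)
repos:
  - repo: https://github.com/astral-sh/uv-pre-commit
    rev: 0.6.0  # placeholder; pin to an actual release tag
    hooks:
      - id: uv-lock  # fails the commit if uv.lock is out of sync with pyproject.toml
```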

Security and dependency auditing

# Check for known CVEs in your dependency tree
uv run pip-audit

# See which packages have newer versions available
uv tree --outdated

pip-audit queries the Python Packaging Advisory Database and flags any installed packages with known vulnerabilities. Adding it to your CI pipeline as a required step ensures nothing with a known CVE ships to production undetected.
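As a CI step, that’s one more entry in the GitHub Actions job above — a sketch, with the step name illustrative and pip-audit assumed to be in the dev dependency group:

```yaml
      - name: Audit dependencies for known CVEs
        run: uv run pip-audit
```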

For automated updates, Dependabot supports uv lockfiles natively:

# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "uv"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5

Dependabot will open pull requests updating individual packages in uv.lock.


Deployment

Simple deployment

git pull origin main
uv sync --frozen --no-dev
sudo systemctl restart myapp

Because uv sync is driven by the lockfile and idempotent, this is safe to run repeatedly. It only installs or removes what has actually changed. The --frozen flag ensures you never silently re-resolve in production: if the lockfile is somehow stale, the deployment fails fast rather than proceeding with an unverified environment.

Deployments with a rollback

For production systems where you need a clean rollback path, install into a staging environment first and only cut over if everything succeeds:

#!/bin/bash
set -e

APP_DIR="/var/www/myapp"

cd "$APP_DIR"
git pull origin main

# Sync into a temporary environment
UV_PROJECT_ENVIRONMENT=.venv-next uv sync --frozen --no-dev

# Validate the new environment before cutting over
UV_PROJECT_ENVIRONMENT=.venv-next uv run python -c "import myapp; print('imports OK')"

# Swap environments (clear any previous backup first; `mv` onto an
# existing directory would nest rather than replace it)
rm -rf .venv-prev
mv .venv .venv-prev
mv .venv-next .venv

# Restart
sudo systemctl restart myapp

echo "Deployment successful. Previous env preserved at .venv-prev"

requirements.txt

Some tools and older projects still expect a requirements.txt. You can export one from your uv lockfile at any time:

# Export runtime dependencies as a pinned requirements.txt
uv export --no-dev --output-file requirements.txt

# Export including dev dependencies
uv export --output-file requirements-dev.txt

The exported file is fully pinned with hashes, suitable for any standard pip workflow. The important thing is that requirements.txt becomes a build artifact in this workflow, not a source of truth. Keep pyproject.toml and uv.lock as the canonical source and regenerate the export as needed — ideally as a CI step so it’s always current.
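A lightweight guard for that CI step might verify the export really is pinned before anything consumes it. A sketch — the SAMPLE lines mimic the shape of `uv export` output, and the hash values are fake:

```python
# Sketch: check that an exported requirements.txt is fully pinned (==)
# and carries hashes, so pip installs from it are reproducible.
# SAMPLE mimics a couple of lines of `uv export` output (hashes are fake).
SAMPLE = """\
fastapi==0.110.0 \\
    --hash=sha256:aaaa
httpx==0.27.0 \\
    --hash=sha256:bbbb
"""

def is_fully_pinned(text: str) -> bool:
    """Every requirement line must pin an exact version, and hashes must appear."""
    # Skip comments, hash continuation lines (indented), and option lines.
    req_lines = [l for l in text.splitlines()
                 if l and not l.startswith(("#", " ", "-"))]
    return all("==" in l for l in req_lines) and "--hash=" in text

print(is_fully_pinned(SAMPLE))  # True for the sample above
```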

If you’re migrating an existing project from requirements.txt to uv:

uv init --no-readme
uv add -r requirements.txt
uv add --dev -r requirements-dev.txt

Summary

  1. uv replaces your whole toolchain — pip, virtualenv, pyenv, and pip-tools in one install
  2. Commit uv.lock — this is what makes environments identical across your team and CI
  3. Use dependency groups to separate runtime and dev dependencies in pyproject.toml
  4. Use uv run for everything — it keeps environments in sync automatically
  5. Use --frozen in CI and production — fail fast if the lockfile is stale, never silently re-resolve
  6. Automate security audits with pip-audit in CI and Dependabot for weekly updates
  7. Export requirements.txt only when needed for interop, as a build artifact

With this workflow every developer gets the same environment, CI catches what local development misses, and deployments don’t surprise you. uv makes that easier to achieve than anything else in the Python ecosystem right now.

Have questions about Python dependencies or deployment workflows? Connect with me on GitHub or LinkedIn.

© Vlad Filippov