uv Changed How I Python
Remember when installing Python packages was annoying? pip was slow, virtual environments were confusing, and don't even get me started on managing Python versions.
Then uv came along and fixed... all of it?
I've been using uv for a few months now and I genuinely don't want to go back. Here's why it's become my default tool for everything Python.
What is uv?
It's a Python package and project manager written in Rust by Astral, the same folks who made ruff. It's fast. Like, absurdly fast. We're talking 10-100x faster than pip.
But speed isn't even the best part. It's the workflow improvements.
The old workflow (pip + venv + pyenv)
Starting a new project used to involve a lot of ceremony:
# Make sure we're using the right Python version
pyenv install 3.12.0 # if not already installed
pyenv local 3.12.0
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate # or .venv\Scripts\activate on Windows
# Install dependencies
pip install -r requirements.txt
# ...wait 30 seconds while pip downloads and builds things...
# Remember to activate every time you open a new terminal
source .venv/bin/activate
And don't forget: if you want reproducible builds, you need pip-tools or Poetry for lockfiles. More tools, more cognitive overhead.
The new workflow (just uv)
uv init my-project
cd my-project
uv add starlette uvicorn pytest
uv run pytest
That's it. uv:
- Creates the project structure with pyproject.toml
- Creates a virtual environment automatically
- Installs dependencies
- Generates a lockfile (uv.lock) for reproducibility
- Runs commands in the venv without manual activation
The whole thing takes seconds, not minutes.
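For the curious, the pyproject.toml you end up with is refreshingly small. Something like this (version bounds illustrative):
[project]
name = "my-project"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "pytest>=8.3",
    "starlette>=0.38",
    "uvicorn>=0.30",
]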
You don't need to activate
This is my favorite part. Instead of remembering to source .venv/bin/activate, just prefix commands with uv run:
uv run python script.py
uv run pytest
uv run mypy .
uv finds the virtual environment and runs the command inside it. No activation, no 'oh wait I'm in the wrong environment' moments.
Python version management built in
uv can install and manage Python versions too:
# Install Python 3.12
uv python install 3.12
# Create a venv with a specific Python version
uv venv --python 3.12
# Or specify in pyproject.toml
# requires-python = '>=3.12'
No more pyenv, no more random Python installations cluttering your system. uv downloads exactly what you need and keeps it organized.
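Two related subcommands I reach for constantly:
# See which interpreters are installed and which are available
uv python list
# Pin the project to a version (writes a .python-version file)
uv python pin 3.12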
Installing global tools the right way
Here's another thing uv does brilliantly: managing global CLI tools.
You know how you pip install tools like black, ruff, or httpie globally? And then you run into version conflicts when different projects need different versions? Or you forget which tools are installed where?
uv has a better solution:
uv tool install ruff
uv tool install black
uv tool install httpie
Each tool gets its own isolated environment. No conflicts, no pollution of your global Python. You can even install specific versions:
uv tool install 'black==23.12.0'
Want to see what tools you have installed? Simple:
uv tool list
And upgrading all your tools is just:
uv tool upgrade --all
I used to use pipx for this, and it worked fine. But having one tool (uv) handle both project dependencies AND global CLI tools means one less thing to remember. Plus, uv's speed makes installing and upgrading tools nearly instant.
The best part? Executables from uv tool install land in a directory on your PATH (and if that directory isn't on your PATH yet, uv tool update-shell fixes it). Just install and run. No manual PATH editing, no virtualenv activation gymnastics.
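And if you ever wonder where those isolated environments actually live, or want to remove a tool cleanly:
uv tool dir              # print where tool environments are stored
uv tool uninstall httpie # remove a tool and its environment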
The lockfile is automatic
When you uv add a package, uv automatically updates both pyproject.toml AND the lockfile. You don't need to run a separate lock command.
uv add httpx
# Both pyproject.toml and uv.lock are updated
The lockfile captures exact versions of everything, including transitive dependencies. When a teammate runs uv sync, they get exactly the same versions you have.
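Onboarding a teammate becomes a two-command affair (repo URL hypothetical, obviously):
git clone https://example.com/acme/my-project.git && cd my-project
uv sync        # reads uv.lock, creates .venv, installs the exact pinned versions
uv run pytest  # works immediately, no activation step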
How fast is it really?
I ran some informal benchmarks on a project with ~50 dependencies:
pip install -r requirements.txt: 28 seconds
uv pip install -r requirements.txt: 1.2 seconds
That's not a typo. It's genuinely that much faster.
The difference comes from:
- Being written in Rust instead of Python
- Parallel downloads and installations
- Smart caching of packages
- A heavily optimized dependency resolver
Compatibility with pip
If you have an existing project using pip, uv works as a drop-in replacement:
uv pip install -r requirements.txt
uv pip freeze
uv pip list
You don't have to migrate everything at once. Use uv pip instead of pip and get the speed benefits immediately.
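A minimal drop-in session on an existing project might look like this:
uv venv                              # stands in for python -m venv .venv
uv pip install -r requirements.txt   # same flags you already know
uv pip check                         # verify installed packages are compatible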
What about Poetry?
I used Poetry before uv. It's a good tool! But uv is faster and simpler.
Poetry has more features (like built-in publishing), but I rarely need those. For most projects, uv does everything I need with less complexity.
If you're happy with Poetry, that's fine. But if you're starting fresh or tired of waiting for pip, give uv a try.
Real-world migration: A case study
Let me walk you through how I migrated an actual production Flask app from pip to uv. The project had been around for a couple years, had about 40 dependencies, and was using the classic pip + venv setup.
The migration took 15 minutes. Here's what I did:
First, I made sure my existing requirements.txt was up to date:
pip freeze > requirements.txt
Then I initialized uv in the project:
uv init --no-readme
This created a basic pyproject.toml. Then I imported all my existing dependencies:
uv pip install -r requirements.txt
Wait, that's not quite right. That installs the packages into the venv, but it records nothing in pyproject.toml. Here's the better approach I discovered: uv add can read a requirements file directly:
uv add -r requirements.txt
One command, and every entry from requirements.txt lands in pyproject.toml — transitive pins and all.
But the cleanest migration path? Create the pyproject.toml manually with your main dependencies (not the transitive ones), then let uv figure out the rest:
# In pyproject.toml, under [project]
dependencies = [
"flask>=3.0.0",
"sqlalchemy>=2.0.0",
"pytest>=7.0.0",
]
Then run:
uv sync
And boom. uv reads the dependencies, resolves them, creates a lockfile with all transitive dependencies, and installs everything. The whole thing took 3 seconds instead of the 30+ seconds pip would take.
I deleted the old .venv folder and requirements.txt, committed the new pyproject.toml and uv.lock, and that was it. The app ran exactly the same, but now everyone on the team benefits from faster installs and reproducible builds.
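Condensed, the whole migration fits on a few lines:
pip freeze > requirements.txt   # snapshot the current environment
uv init --no-readme             # generate pyproject.toml
# hand-edit pyproject.toml: list your direct dependencies under [project]
uv sync                         # resolve, lock, and install
git rm requirements.txt && git add pyproject.toml uv.lock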
Common patterns and workflows
Here are some patterns I use constantly with uv:
Starting a new script
When I want to write a quick Python script that needs a few packages:
mkdir my-script && cd my-script
uv init
uv add httpx rich # whatever packages I need
uv run python script.py
Total time: about 5 seconds. No faffing about with virtual environments.
Adding dev dependencies
uv supports dependency groups, which is perfect for separating dev tools from production dependencies:
uv add --dev pytest black mypy ruff
This adds them to a dev group under [dependency-groups] in pyproject.toml. When you deploy to production, you can skip installing dev dependencies:
uv sync --no-dev
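For reference, here's roughly what that section looks like (version bounds illustrative):
# In pyproject.toml
[dependency-groups]
dev = [
    "black>=24.0",
    "mypy>=1.11",
    "pytest>=8.3",
    "ruff>=0.6",
]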
Running one-off commands
Sometimes you just want to run a command with a specific package, without installing it in your project:
uvx ruff check .
uvx black --check .
The uvx command is like npx from the Node.js world. It installs the package in a temporary environment, runs it, and cleans up. Perfect for trying out tools or running scripts from packages you don't want to keep around.
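You can also pin a version for a one-off run, or use --from when the command name differs from the package name:
uvx ruff@0.6.9 check .                          # run a specific version, then discard it
uvx --from httpie http GET https://example.com  # httpie installs a command called http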
Working with Docker
uv is fantastic in Docker because of its speed. Here's a pattern I use:
FROM python:3.12-slim
WORKDIR /app
# Install uv
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
# Copy dependency files
COPY pyproject.toml uv.lock ./
# Install dependencies
RUN uv sync --frozen --no-dev
# Copy application
COPY . .
CMD ["uv", "run", "python", "app.py"]
The --frozen flag ensures uv doesn't update the lockfile, and --no-dev skips dev dependencies. This Docker build is significantly faster than the equivalent pip-based build.
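One optional refinement: uv can precompile bytecode at install time, which trades a slightly slower image build for faster container startup. Whether it pays off depends on your app:
# Add before the RUN uv sync step
ENV UV_COMPILE_BYTECODE=1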
Multiple Python versions for testing
If you maintain a library that needs to support multiple Python versions, uv makes testing easy:
# Install multiple Python versions
uv python install 3.9 3.10 3.11 3.12
# Test against each one
for version in 3.9 3.10 3.11 3.12; do
    echo "Testing on Python $version"
    uv venv --python $version .venv-$version
    VIRTUAL_ENV=.venv-$version uv pip install -e . pytest
    .venv-$version/bin/python -m pytest
done
This is way cleaner than juggling pyenv or conda environments.
Things I wish I knew earlier
The cache is smart
uv caches everything. If you install a package once, it's cached locally. The next time you need it in any project, it's essentially instant.
The cache lives in ~/.cache/uv (or equivalent on Windows/Mac). You can clear it with:
uv cache clean
But you probably never need to. uv manages it well.
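If you do want to inspect or trim it, there are subcommands for that too:
uv cache dir     # print the cache location
uv cache prune   # remove only unused cache entries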
You can pin Python versions
In your pyproject.toml, you can specify which Python versions your project supports:
[project]
requires-python = ">=3.11"
If someone tries to uv sync with Python 3.9, uv refuses up front and points at the requirement. This prevents the "works on my machine" problems that come from Python version mismatches.
Pre-releases and version constraints
Need to try a pre-release version of a package? uv makes it explicit:
uv add --prerelease=allow 'django>=5.0'
You can also use all the same version specifiers as pip:
uv add 'requests>=2.28,<3' # Min and max version
uv add 'flask~=3.0' # Compatible version
uv add 'httpx==0.25.0' # Exact version
Environment variables and .env files
uv run can load .env files for you with the --env-file flag:
# .env file
DATABASE_URL=postgresql://localhost/mydb
DEBUG=true
# Run with environment variables loaded
uv run --env-file .env python app.py
No need for python-dotenv or loading it in your code. And if typing the flag gets old, set UV_ENV_FILE=.env once and uv run picks it up automatically.
Gotchas and troubleshooting
Dependency conflicts
Occasionally you'll hit a dependency conflict where two packages need incompatible versions of the same dependency. uv fails fast with a clear message along these lines:
error: Resolution failed due to dependency conflict:
package-a requires requests>=2.28
package-b requires requests<2.25
The solution is usually to update packages or find alternatives. At least uv tells you immediately instead of failing mysteriously at runtime.
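One escape hatch worth knowing: if you're confident a conflict comes from overly strict (or stale) metadata, uv supports dependency overrides in pyproject.toml. A sketch:
# In pyproject.toml
[tool.uv]
override-dependencies = ["requests>=2.28"]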
Platform-specific dependencies
Some packages have platform-specific wheels. uv handles this well, but if you're building on Mac and deploying to Linux, make sure to test in a Linux environment (like Docker) before deploying.
Legacy packages
Very occasionally, you'll hit a package that doesn't have a wheel and needs to be built from source. uv handles this, but it's slower (though still faster than pip). If you hit this often, consider asking the package maintainers to publish wheels.
Comparing uv to other tools
Let me be clear about where uv fits:
vs pip: uv is better in every way. Faster, easier, more features. Just switch.
vs Poetry: Poetry has built-in publishing to PyPI and more opinionated project structure. If you're publishing packages, Poetry might still be worth it. For applications and scripts, uv is simpler and faster.
vs PDM: PDM is great and was ahead of its time with PEP 582 support. uv has caught up in features and is faster. Both are good choices.
vs Conda: Different use case. Conda shines when you need non-Python dependencies (like system libraries) or data science packages. For pure Python projects, uv is way lighter and faster.
vs pipenv: Just use uv. Pipenv had its moment, but it's been surpassed.
The Python tooling landscape used to be fragmented. uv feels like the convergence point we've been waiting for.
The future is fast
What excites me most isn't just that uv is fast now. It's that the Astral team has shown they can build tools that the Python community actually adopts.
They did it with ruff (which replaced about 5 different linting and formatting tools). Now they're doing it with uv (which replaces pip, venv, pyenv, pipx, and more).
The Python tooling ecosystem is finally getting the performance and developer experience it deserves. We're seeing Rust-powered tools bring the speed and reliability that Python tools struggled to achieve.
Should you switch?
If you're starting a new project: absolutely. There's no reason not to.
If you have existing projects: try it on a small one first. The migration is straightforward, and you'll immediately feel the difference.
If you're on a team: the speed benefits compound. Every teammate benefits from faster installs, and the lockfile ensures everyone has identical environments.
I don't usually evangelize tools, but uv has genuinely changed my daily Python workflow for the better. It removed friction I didn't even realize was slowing me down.
Getting started
Install uv:
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
# Or with Homebrew
brew install uv
# Or with pip (ironic, I know)
pip install uv
Try it on a new project:
uv init my-project
cd my-project
uv add requests
uv run python -c "import requests; print('It works!')"
That's it. You're using uv.
The catch?
Honestly? Nothing that's bitten me. Beyond the minor gotchas above, it's been drop-in compatible with everything I've tried. No weird edge cases, no broken installs.
The only "downside" is that it's relatively new (first released in early 2024), so the ecosystem is still catching up. But the tool itself is stable, well-maintained, and actively developed.
The team at Astral (who also make ruff) is doing incredible work. They seem to have a mission to make Python tooling not suck, and they're succeeding.
Python is my favorite language, but the tooling has always been its weakest point. With ruff and uv, that's finally changing. The future of Python development is fast, and I'm here for it.