Top 3 PEP 668 Solutions for 2025: Why I Made venv-stack
Struggling with PEP 668 errors? Discover the best solutions for 2025, including venv, pipx, and why I built venv-stack to solve complex dependency workflows.
Daniel Rojas
Python developer and creator of venv-stack, passionate about improving development workflows.
The Frustrating Error We've All Seen
If you've used a modern Linux distribution like Debian 12, Ubuntu 23.04+, or the latest Fedora in the past year, you've almost certainly run into this message: `error: externally-managed-environment`. This roadblock, a direct result of PEP 668, stops a seemingly innocent `pip install` command dead in its tracks. Your first reaction might be frustration, but this change is a crucial step forward for the stability and security of Python on system-managed OSes.
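The full message is longer; on Debian-family systems it looks roughly like this (abridged):

```text
$ pip install requests
error: externally-managed-environment

× This environment is externally managed
╰─> To install Python packages system-wide, try apt install
    python3-xyz, where xyz is the package you are trying to
    install.
...
```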
For years, developers (myself included) casually used `sudo pip install` to get tools and libraries onto our systems. This often led to a tangled mess where system utilities dependent on a specific version of a Python package would break after an upgrade. PEP 668 puts a stop to this. It protects the system's Python environment, forcing us to be more deliberate about how we manage our packages.
In this post, we'll explore the best ways to work within this new paradigm in 2025. We'll cover the standard best practices, a specialized tool for applications, and the emergency escape hatch. But more importantly, I'll explain why these solutions felt incomplete for my complex development workflow, which ultimately led me to create a new tool: venv-stack.
What is PEP 668 and Why Does it Matter?
At its core, PEP 668 is a formal specification that allows Python environments to declare themselves as "externally-managed." Think of your operating system's package manager (like `apt` on Debian/Ubuntu or `dnf` on Fedora) as the official manager of your system's Python. It knows which packages are needed for critical system tools to function correctly.
When you try to use `pip` to install a package directly into this system-level environment, you're acting as an unapproved, external manager. PEP 668 makes `pip` aware of this, and by default, it refuses to proceed. This prevents a scenario where you, for example, upgrade the `requests` library for a personal script, only to discover you've broken your system's network manager, which relied on the older version.
This isn't a punishment; it's a safeguard. It forces a clean separation between the Python environment your operating system needs and the environments your projects need. This separation is the foundation of modern, reproducible Python development.
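Mechanically, the opt-in is just a marker file: per PEP 668, the distribution places an `EXTERNALLY-MANAGED` file in the interpreter's standard-library directory, and `pip` checks for it before installing. You can see it yourself (the `python3.11` path below is an example; substitute the directory the first command prints):

```bash
# Find the stdlib directory of your system Python
python3 -c "import sysconfig; print(sysconfig.get_path('stdlib'))"

# If this file exists there, PEP 668 protections are active
cat /usr/lib/python3.11/EXTERNALLY-MANAGED
```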
Top 3 Solutions for Navigating PEP 668 in 2025
With `sudo pip install` off the table, how should we manage our packages? Let's look at the three primary approaches, from the most common to the most dangerous.
Solution 1: The Old Faithful - Standard `venv`
The most direct and officially recommended solution is to use virtual environments. Python's built-in `venv` module is your best friend here. It allows you to create isolated, self-contained Python environments for each of your projects.
The workflow is simple and effective:
- Create an environment: `python3 -m venv .venv`
- Activate it: `source .venv/bin/activate`
- Install dependencies: `pip install requests django pandas`
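Putting it all together, a typical session looks like this (the package names are just examples):

```bash
# Create an isolated environment in the project directory
python3 -m venv .venv

# Activate it for the current shell session
source .venv/bin/activate

# Install dependencies into the environment, not the system
pip install requests django pandas

# Record exact versions for reproducibility
pip freeze > requirements.txt

# Leave the environment when you're done
deactivate
```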
Pros:
- Built-in: No extra tools to install. It's part of the Python standard library.
- Total Isolation: Each project gets its own set of packages, preventing any cross-project conflicts.
- Reproducibility: You can generate a `requirements.txt` file (`pip freeze > requirements.txt`) that precisely documents the project's dependencies.
Cons:
- Activation Overhead: Constantly activating and deactivating environments can be tedious.
- Doesn't solve the "global tool" problem: What if you just want to install a command-line tool like `ruff` or `black` to use everywhere, without creating a project for it?
Solution 2: The Tool Specialist - `pipx`
This is where `pipx` shines. It's a tool designed specifically for installing and running Python applications that you want to use as command-line tools. Think of linters, formatters, code auditors, or even specialized utilities like `yt-dlp`.
When you run `pipx install ruff`, `pipx` does something brilliant: it creates a dedicated, isolated virtual environment just for `ruff`, installs it there, and then adds a symbolic link to the application into a directory on your system's `PATH` (usually `~/.local/bin`). You get to run `ruff` from anywhere, without it ever touching your system Python or your project environments.
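In practice, it looks like this (the tool names are examples; any pip-installable application works):

```bash
# Install a CLI tool into its own managed venv
pipx install ruff

# The command is now on your PATH, system Python untouched
ruff --version

# Or run a tool once without installing it permanently
pipx run yt-dlp --help

# See everything pipx is managing
pipx list
```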
Pros:
- Perfect for CLI Apps: The ideal solution for installing global Python tools.
- Automated Isolation: Manages all the virtual environments for you behind the scenes.
- Safe and Clean: Keeps your system and project environments pristine.
Cons:
- Not for project libraries: It's not meant for installing libraries like `pandas` or `django` that your code will `import`.
- Another tool: You first need to install `pipx` itself (usually via your OS package manager, like `sudo apt install pipx`, as shown below).
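The one-time setup on a Debian-family system is short; `pipx ensurepath` makes sure `~/.local/bin` is on your `PATH`:

```bash
# Install pipx from the system package manager
sudo apt install pipx

# Add ~/.local/bin to PATH if it isn't already; restart your shell after
pipx ensurepath
```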
Solution 3: The Last Resort - `--break-system-packages`
When you see the PEP 668 error, `pip` itself suggests this option. The `--break-system-packages` flag is an explicit override that tells `pip`, "I know this is a bad idea, but do it anyway."
You should avoid this flag in almost every situation. Using it is functionally equivalent to the old, risky behavior of `sudo pip install`. You are knowingly risking the stability of your operating system. So, when is it okay? The only truly defensible use case is within a fully disposable environment, like a Docker container you're building for a specific application. In that controlled context, you are the sole "manager," and breaking the container's environment doesn't have wider consequences.
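If you must reach for it, keep it inside something you can throw away. A minimal sketch, assuming a stock Debian 12 container (the package is just an example):

```bash
# Inside a disposable container, e.g. started with:
#   docker run --rm -it debian:12 bash
apt update && apt install -y python3-pip

# Explicitly override PEP 668: safe only because the whole
# container is ephemeral and nothing else depends on its Python
pip install --break-system-packages requests
```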
Pros:
- It works: It bypasses the protection and installs the package.
Cons:
- Extremely Dangerous: Can easily break system dependencies and render your OS unstable.
- Goes against best practices: It's a temporary fix that encourages bad habits and leads to non-reproducible environments.
PEP 668 Solutions: A Head-to-Head Comparison
| Feature | `venv` | `pipx` | `--break-system-packages` |
|---|---|---|---|
| Primary Use Case | Project-specific dependencies | Global command-line applications | Disposable environments (e.g., Docker) |
| Isolation Level | Excellent (per-project) | Excellent (per-application) | None (modifies system directly) |
| Safety | Very high | Very high | Very low |
| Ease of Use | Moderate (requires activation) | High (install and forget) | High (but dangerously simple) |
| Recommended for... | All Python application/library development | Installing tools like `ruff`, `black`, `ansible` | Almost never; only in controlled, ephemeral contexts |
The Gap in the Ecosystem: Why I Created venv-stack
For 95% of use cases, a combination of `venv` for projects and `pipx` for tools is a fantastic workflow. It's clean, safe, and robust. But I kept running into a specific, frustrating scenario that neither tool solved elegantly.

I work on multiple, related Django projects. There's a core application, and then several plugins that extend it. They all share a large, common base of dependencies: Django, Django REST Framework, Celery, and more. But each plugin has its own small set of unique dependencies.
The standard `venv` approach meant creating a completely separate environment for each plugin, re-installing that large common base every single time. This was slow and wasted dozens of gigabytes of disk space. I wanted a way to say: "Give me an environment that has everything from my core project, plus this one extra library." This desire for layered, inheritable environments is what led me to build venv-stack.
What is venv-stack?
`venv-stack` is a command-line tool that manages Python virtual environments as a stack. Instead of being completely isolated, environments can be layered on top of each other. It's like Docker image layers, but for Python venvs.
When you "push" a new environment onto the stack, it inherits all the packages from the environment below it. You can then install new packages into the top layer without affecting the base. When you "pop" the environment, you return to the one below, instantly.
How venv-stack Bridges the Gap
Let's revisit my Django problem. With `venv-stack`, the workflow becomes:
- Create a base environment: `venv-stack new base --python python3.11`
- Activate and install common packages: `venv-stack activate base`, then `pip install django djangorestframework ...`
- Work on a plugin: `venv-stack push plugin-A --on base`
- Install plugin-specific dependencies: `pip install some-plugin-dependency`. This is installed only in the `plugin-A` layer (full session sketched below).
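End to end, sticking to the commands described above, the session looks like this (`plugin-A` and `some-plugin-dependency` are placeholders, and the final `pop` invocation follows the push/pop model just described):

```bash
# One-time: build the shared base layer
venv-stack new base --python python3.11
venv-stack activate base
pip install django djangorestframework celery

# Layer a plugin environment on top of the base
venv-stack push plugin-A --on base
pip install some-plugin-dependency   # lands only in the plugin-A layer

# Return to the shared base when you're done with the plugin
venv-stack pop
```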
Now, my `plugin-A` environment has Django and all its friends, plus its own dependency, without duplicating the base packages. I can pop back to the `base` environment or push another environment for `plugin-B` on top of `base`. This approach saves a tremendous amount of disk space and time, especially when the base dependencies are large (think data science stacks with PyTorch or TensorFlow).
It combines the explicit, per-project nature of `venv` with a more efficient and flexible model for managing related projects, filling a niche that I believe many developers face in complex monorepos or multi-package projects.
Conclusion: Choosing the Right Tool for the Job
PEP 668 is a positive evolution for the Python ecosystem, pushing us toward more robust and reliable development practices. In 2025, your toolkit for handling it is stronger than ever.
- For your day-to-day project development, `venv` remains the undisputed, built-in standard.
- For installing and managing Python-based command-line tools, `pipx` is the best-in-class solution.
- The `--break-system-packages` flag should be treated like a biohazard sign: only to be touched with extreme caution in isolated, disposable environments.
And if you, like me, find yourself managing multiple projects with significant dependency overlap, I invite you to give `venv-stack` a try. It was born from a real-world need to make complex development workflows faster, leaner, and more enjoyable. You can find it on GitHub and PyPI today.