Modern Dependency Management and Packaging in Python
The Python ecosystem has evolved rapidly in recent years, with new standards and tools designed to make projects more reliable, reproducible, and easy to share. In this post, we’ll explore the modern approach to dependency management and packaging, so you can build robust, maintainable Python projects from the start.
Why Dependency Management Matters
Every Python project relies on external libraries. Without careful management, you risk “dependency hell”—conflicting versions, broken environments, and non-reproducible builds. Modern Python tooling solves these problems by isolating dependencies and making them explicit.
Python Packaging Standards
`pyproject.toml`: The Modern Standard
The `pyproject.toml` file is now the heart of Python packaging. Introduced in PEP 518, it standardizes how projects declare build systems, dependencies, and metadata. Key benefits include:
- Unified configuration: One file for build requirements, dependencies, and project metadata.
- Tool interoperability: Supported by modern tools like Poetry, Hatch, Flit, and uv.
- Future-proof: Enables advanced features such as dependency groups (PEP 735).
Technical Deep-Dive: `pyproject.toml` Structure
A comprehensive `pyproject.toml` file typically contains several sections:
```toml
# Build system configuration (PEP 518)
[build-system]
requires = ["setuptools>=42", "wheel"]
build-backend = "setuptools.build_meta"

# Project metadata (PEP 621)
[project]
name = "example-package"
version = "0.1.0"
description = "An example Python package"
readme = "README.md"
requires-python = ">=3.8"
license = {text = "MIT"}
authors = [
    {name = "Your Name", email = "your.email@example.com"},
]
classifiers = [
    "Programming Language :: Python :: 3",
    "License :: OSI Approved :: MIT License",
    "Operating System :: OS Independent",
]
dependencies = [
    "requests>=2.25.0",
    "pandas>=1.3.0",
]

# Optional dependencies (PEP 621)
[project.optional-dependencies]
dev = [
    "pytest>=6.0.0",
    "black>=21.5b2",
    "mypy>=0.812",
]
docs = [
    "sphinx>=4.0.0",
    "sphinx-rtd-theme>=0.5.2",
]

# Entry points (executables)
[project.scripts]
example-cli = "example_package.cli:main"

# Tool-specific configurations
[tool.black]
line-length = 88

[tool.mypy]
python_version = "3.8"
warn_return_any = true

[tool.pytest.ini_options]
testpaths = ["tests"]
```
Under the Hood: How Build Systems Use pyproject.toml
When you run a build command like `pip install .` or `python -m build`:
1. Discovery: the build system locates `pyproject.toml` in the project root.
2. Environment setup: a temporary, isolated build environment is created containing the packages listed in `requires`.
3. Backend loading: the specified `build-backend` module is loaded.
4. Metadata extraction: the backend reads project metadata from `pyproject.toml`.
5. Package building: the backend creates distributions (wheel and/or sdist).
This approach solves the “bootstrapping problem” of older packaging systems, where `setup.py` could not even run until the dependencies it was supposed to declare were already installed.
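The backend-loading step can be made concrete with a few lines of Python. This is a simplified sketch of what a PEP 517 frontend (pip, `python -m build`) does, not any tool's actual code; the demo uses a stdlib module as a stand-in backend:

```python
import importlib


def load_build_backend(pyproject: dict):
    """Resolve the [build-system].build-backend string to a Python object,
    roughly as a PEP 517 frontend would.

    Simplified sketch: PEP 517 allows "module.path" or "module.path:object"
    (dotted object paths are not handled here).
    """
    spec = pyproject["build-system"]["build-backend"]
    module_name, _, attr = spec.partition(":")
    backend = importlib.import_module(module_name)
    return getattr(backend, attr) if attr else backend


# Demo with a stdlib module standing in for a real backend like
# "setuptools.build_meta"
backend = load_build_backend({"build-system": {"build-backend": "json"}})
print(backend.__name__)
```

Once loaded, the frontend calls standardized hooks on this object, such as `build_wheel` and `build_sdist`.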
PEP 621 and PEP 735
- PEP 621: Standardizes how project metadata (name, version, authors, etc.) is declared in `pyproject.toml`.
- PEP 735: Introduces dependency groups, allowing you to categorize dependencies (e.g., `dev`, `docs`, `test`) for flexible installs.
PEP 621: Metadata Standardization in Depth
PEP 621 solved a critical fragmentation problem in the Python ecosystem. Before its adoption:
- Each build backend had its own configuration format
- Projects needed different metadata formats for different tools
- Migration between tools was difficult due to incompatible formats
The standard defines core metadata fields that all build backends must support:
Field | Type | Purpose | Example |
---|---|---|---|
`name` | String | Distribution name | `"my-package"` |
`version` | String | Package version | `"1.0.0"` |
`description` | String | Short description | `"A package for..."` |
`readme` | String/Table | Project README | `"README.md"` or `{file = "README.md", content-type = "text/markdown"}` |
`requires-python` | String | Python version spec | `">=3.8"` |
`license` | Table | License info | `{text = "MIT"}` or `{file = "LICENSE"}` |
`authors`/`maintainers` | Array | People info | `[{name = "Name", email = "email@example.com"}]` |
`dependencies` | Array | Required packages | `["requests>=2.25.0"]` |
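As a toy illustration of what build backends do with this table, here is a hypothetical checker for a parsed `[project]` table; the rules are heavily simplified from PEP 621 (real backends validate far more, including the `dynamic` field for every entry):

```python
def check_pep621(project: dict) -> list:
    """Return a list of problems with a parsed [project] table (toy sketch)."""
    problems = []
    # 'name' is the only field that must always be present statically
    if "name" not in project:
        problems.append("missing required field: name")
    # 'version' must be set, or declared as dynamically computed by the backend
    if "version" not in project and "version" not in project.get("dynamic", []):
        problems.append("version must be set statically or listed in 'dynamic'")
    return problems


print(check_pep621({"name": "my-package", "version": "1.0.0"}))  # no problems
print(check_pep621({"description": "oops"}))                     # two problems
```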
PEP 735: Dependency Groups Implementation
PEP 735 extends dependency management with formalized dependency groups, declared in a dedicated `[dependency-groups]` table. Here’s how it works technically:

```toml
# Traditional optional dependencies (PEP 621), exposed as package "extras"
[project.optional-dependencies]
dev = ["pytest", "black", "mypy"]
docs = ["sphinx", "sphinx-rtd-theme"]

# Dependency groups (PEP 735), kept out of the published package metadata
[dependency-groups]
dev = ["pytest", "black"]
docs = ["sphinx"]
# Groups can include other groups
test = ["pytest-cov", {include-group = "dev"}]
```

Key technical differences from extras:
- Not published: groups are development-workflow metadata, not something consumers can install via `package[extra]` from an index.
- Usable without a package: a project that is never built or published can still declare groups.
- Cross-references: groups can include other groups via `include-group` to build hierarchies.
- Environment markers: each entry is a standard PEP 508 requirement string, so it can be conditional on platform or Python version.
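The cross-reference mechanism is easy to sketch. Here is a toy flattener for PEP 735-style group data (the group contents are hypothetical, and a real tool would also de-duplicate and validate requirement strings):

```python
def expand_group(groups: dict, name: str, _seen=frozenset()):
    """Flatten a dependency group, following include-group references
    and rejecting cycles. Toy sketch of PEP 735 group expansion."""
    if name in _seen:
        raise ValueError(f"cycle involving group {name!r}")
    requirements = []
    for item in groups[name]:
        if isinstance(item, dict) and "include-group" in item:
            # Recurse into the referenced group
            requirements.extend(
                expand_group(groups, item["include-group"], _seen | {name})
            )
        else:
            requirements.append(item)
    return requirements


groups = {
    "test": ["pytest", "coverage"],
    "dev": [{"include-group": "test"}, "black"],
}
print(expand_group(groups, "dev"))  # pulls in 'test' first, then 'black'
```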
Modern Package Managers
Several tools build on the new standards, offering streamlined workflows:
- Poetry: All-in-one tool for dependency management, packaging, and publishing. Handles virtual environments automatically.
- Hatch: Focuses on project management, environment creation, and extensibility.
- Flit: Lightweight tool for simple projects; great for pure Python packages.
- uv: A fast, modern package manager that emphasizes performance and reliability.
Each tool has its strengths—experiment to find what fits your workflow best.
Technical Comparison of Modern Package Managers
Feature | pip + venv | Poetry | PDM | Hatch | uv |
---|---|---|---|---|---|
Install speed | Moderate | Moderate | Fast | Moderate | Very fast |
Dependency resolution | Basic | Advanced | Advanced | Advanced | Advanced |
Lock file | No (`pip-tools` needed) | Yes (`poetry.lock`) | Yes (`pdm.lock`) | No (delegates to other tools) | Yes (`uv.lock`) |
Virtual env creation | Manual | Automatic | Automatic | Automatic | Automatic (`uv venv`) |
PEP 621 support | Via `setuptools` | Yes | Yes | Yes | Yes |
Build backend | `setuptools` | Built-in | Pluggable | Built-in | Works with any backend |
How Dependency Resolution Works
Modern package managers use sophisticated dependency resolution algorithms:
1. Constraint satisfaction problem (CSP): dependencies are treated as variables with version constraints, and the resolver searches for a set of versions that satisfies all constraints simultaneously.
2. Resolution process:
```text
for each package in dependencies:
    add package to resolution graph
    for each dependency of package:
        recursively resolve dependency
        if conflict detected:
            backtrack and try different versions
```
3. Performance optimizations:
- Package metadata caching
- Pre-computed dependency graphs
- Parallel downloads and installations
- Binary wheel prioritization
For example, uv’s remarkable speed comes from its Rust implementation combined with techniques such as concurrent resolution, fetching only the metadata portion of wheels over HTTP, and aggressive caching.
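The backtracking loop sketched above can be turned into runnable Python. This toy resolver works on integer versions and explicit dependency tables, nothing like a production implementation, but it shows the core CSP idea: try the newest allowed version, merge its constraints, and backtrack on conflict.

```python
def resolve(requirements, available, assignment=None):
    """Toy backtracking dependency resolver.

    requirements: {pkg: set of acceptable versions}
    available:    {pkg: {version: {dep: set of acceptable versions}}}
    Returns {pkg: version} satisfying every constraint, or None.
    """
    assignment = assignment or {}
    pending = [p for p in requirements if p not in assignment]
    if not pending:
        return assignment  # every requirement has a version: done
    pkg = pending[0]
    for version in sorted(available.get(pkg, {}), reverse=True):  # prefer newest
        if version not in requirements[pkg]:
            continue
        # Merge this version's own dependencies into the constraint set
        merged, conflict = dict(requirements), False
        for dep, allowed in available[pkg][version].items():
            merged[dep] = merged.get(dep, allowed) & allowed
            if not merged[dep] or (dep in assignment and assignment[dep] not in merged[dep]):
                conflict = True  # empty intersection: this version can't work
                break
        if not conflict:
            solution = resolve(merged, available, {**assignment, pkg: version})
            if solution is not None:
                return solution
    return None  # no version of pkg works: caller backtracks


# Hypothetical index: 'a' version 2 only tolerates 'b' version 1
available = {
    "a": {2: {"b": {1}}, 1: {"b": {1, 2}}},
    "b": {2: {}, 1: {}},
}
print(resolve({"a": {1, 2}}, available))
```

Note how requesting `a` alone forces `b` down to version 1, because the newest `a` constrains it; real resolvers do the same dance over version ranges and markers.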
Building and Publishing a Package: Hands-on Guide
Let’s walk through a practical example of creating and publishing a simple Python package using uv, one of the fastest and most modern Python package managers. Follow along to create your own package:
Step 1: Setting Up Your Project
```bash
# Install uv if you haven't already
pip install uv

# Create project directory
mkdir my-awesome-package
cd my-awesome-package

# Create a virtual environment
uv venv

# Activate the environment
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Create basic project structure
mkdir my_awesome_package tests
touch my_awesome_package/__init__.py tests/__init__.py README.md
```
Step 2: Configure Your Project
Your project now has this structure:
```text
my-awesome-package/
├── my_awesome_package/
│   └── __init__.py
├── tests/
│   └── __init__.py
└── README.md
```
Let’s create a `pyproject.toml` file:

```bash
touch pyproject.toml
```

Now, edit the `pyproject.toml` file with the following content:
```toml
[build-system]
requires = ["setuptools>=42", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "my-awesome-package"
version = "0.1.0"
description = "A sample package to demonstrate modern Python packaging"
authors = [
    {name = "Your Name", email = "your.email@example.com"}
]
readme = "README.md"
requires-python = ">=3.8"
dependencies = [
    "requests>=2.28.1",
]

[project.optional-dependencies]
dev = [
    "pytest>=7.0.0",
    "black>=23.1.0",
]

[tool.setuptools]
packages = ["my_awesome_package"]
```
The above file uses the standard PEP 621 format which works with any modern build tool, not just a specific package manager.
Step 3: Add Your Code
Create a simple function in `my_awesome_package/__init__.py`:
```python
import requests

__version__ = '0.1.0'


def fetch_data(url):
    """Fetch data from a URL and return the response text."""
    response = requests.get(url)
    response.raise_for_status()  # Raise an exception for HTTP errors
    return response.text
```
Step 4: Install Dependencies and Test
```bash
# Install dependencies (while in the activated virtual environment)
uv pip install -e .    # Install the package in development mode
uv pip install pytest  # Install test dependencies

# Or install with dev dependencies
uv pip install -e ".[dev]"

# Run your code
python -c "import my_awesome_package; print(my_awesome_package.fetch_data('https://httpbin.org/json'))"

# Run tests (if you've written any)
pytest
```
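If you want to try testing `fetch_data` without touching the network, one option is to stub out the `requests` module entirely. This is a self-contained sketch (the fake response values are made up), not the only or canonical way to test HTTP code:

```python
import sys
import types
from unittest.mock import MagicMock

# Install a stand-in 'requests' module so this sketch runs with no network
# access and without the real library installed.
fake_requests = types.ModuleType("requests")
fake_response = MagicMock()
fake_response.text = '{"ok": true}'          # canned response body
fake_requests.get = MagicMock(return_value=fake_response)
sys.modules["requests"] = fake_requests


def fetch_data(url):
    """Same logic as my_awesome_package/__init__.py."""
    import requests
    response = requests.get(url)
    response.raise_for_status()
    return response.text


result = fetch_data("https://httpbin.org/json")
print(result)
```

In a real test suite you would reach for `unittest.mock.patch` or the `responses`/`respx` libraries instead of swapping `sys.modules` by hand.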
Step 5: Build and Publish
```bash
# Install build tools if you haven't already
uv pip install build twine

# Build the package
python -m build

# This creates both .whl and .tar.gz files in dist/
ls dist/

# Publish to PyPI (you'll need a PyPI account)
# For testing purposes, use Test PyPI first:
python -m twine upload --repository testpypi dist/*

# When ready for the real thing:
# python -m twine upload dist/*
```
Package Structure and Best Practices
Modern Python packages typically follow this structure:
```text
my_package/
├── pyproject.toml          # Project configuration
├── README.md               # Project documentation
├── LICENSE                 # License file
├── src/                    # Source directory (recommended)
│   └── my_package/         # Actual package
│       ├── __init__.py     # Makes it a package
│       ├── module1.py      # Core functionality
│       └── module2.py      # More functionality
├── tests/                  # Test directory
│   ├── __init__.py
│   └── test_module1.py
└── docs/                   # Documentation
```
Using a `src/` layout provides several technical advantages:
- Prevents accidental imports from the project root
- Ensures you’re testing the installed package, not the development version
- Forces proper namespacing and import paths
Build Process Under the Hood
When building a package, these steps occur:
1. Source distribution (sdist): when you run `python -m build --sdist`:
   - Project metadata is gathered from `pyproject.toml`
   - Source files are collected based on include patterns
   - A `.tar.gz` archive is created with all source files
   - A `PKG-INFO` file is generated with metadata
2. Wheel distribution (bdist_wheel): when you run `python -m build --wheel`:
   - The package is built in a temporary directory
   - Platform tags are determined (e.g., `py3-none-any`)
   - A `.whl` file (a renamed ZIP archive) is created with the built package
   - Wheels install without a build step; bytecode (`.pyc` files) is typically compiled at install time rather than shipped inside the wheel
3. Publishing process: when you run `python -m twine upload dist/*`:
   - Authentication with PyPI (or TestPyPI)
   - HTTPS POST requests with file uploads
   - Package validation on the server
   - Index updates on PyPI
   - CDN propagation
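You can verify the "renamed ZIP" claim yourself. This sketch assembles a minimal wheel-shaped archive in memory with nothing but the stdlib; the file contents are illustrative, and real wheels also carry a `RECORD` file with hashes:

```python
import io
import zipfile

# Assemble a minimal wheel-shaped archive in memory
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as whl:
    whl.writestr("my_awesome_package/__init__.py", "__version__ = '0.1.0'\n")
    whl.writestr(
        "my_awesome_package-0.1.0.dist-info/METADATA",
        "Metadata-Version: 2.1\nName: my-awesome-package\nVersion: 0.1.0\n",
    )
    whl.writestr(
        "my_awesome_package-0.1.0.dist-info/WHEEL",
        "Wheel-Version: 1.0\nRoot-Is-Purelib: true\nTag: py3-none-any\n",
    )

# Reading it back with plain zipfile shows the standardized layout:
# the package itself plus a .dist-info directory of metadata
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as whl:
    names = sorted(whl.namelist())
print(names)
```

Opening any real `.whl` from `dist/` with an archive tool shows the same structure, which is exactly why installers can unpack wheels without running any build code.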
The key difference between modern tools and traditional approaches is automation of these steps while maintaining compliance with Python packaging standards.
Best Practices
- Isolate dependencies using virtual environments (see my tutorial on setting up virtual environments for step-by-step instructions).
- Lock dependencies for reproducible installs (e.g., `poetry.lock`).
- Group optional/development dependencies for flexible environments.
- Automate quality checks (linting, formatting, tests) in your packaging workflow.
- Document your project and include type annotations for maintainability.
Complete Walkthrough: From Project to Published Package
Let’s create a small but complete package that demonstrates modern Python packaging in action. We’ll build a simple CLI tool called `githubstats` that fetches statistics for a GitHub repository.
1. Project Setup
```bash
# Create project directory
mkdir githubstats
cd githubstats

# Create a virtual environment with uv
uv venv

# Activate the environment
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Create basic directory structure
mkdir -p src/githubstats tests
touch src/githubstats/__init__.py tests/__init__.py README.md
```
2. Add Dependencies
```bash
# Create pyproject.toml file
touch pyproject.toml

# Install required packages
uv pip install requests click

# Install development dependencies
uv pip install pytest pytest-cov black isort mypy
```
3. Create the Package Structure
```bash
# Create the source files
touch src/githubstats/cli.py
touch src/githubstats/github.py
```
4. Update pyproject.toml for Package Configuration
Edit your `pyproject.toml` to include:
```toml
[build-system]
requires = ["setuptools>=42", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "githubstats"
version = "0.1.0"
description = "A CLI tool to fetch GitHub repository statistics"
authors = [
    {name = "Your Name", email = "your.email@example.com"}
]
readme = "README.md"
requires-python = ">=3.8"
license = {text = "MIT"}
dependencies = [
    "requests>=2.28.2",
    "click>=8.1.3",
]

[project.optional-dependencies]
dev = [
    "pytest>=7.3.1",
    "pytest-cov>=4.1.0",
    "black>=23.3.0",
    "isort>=5.12.0",
    "mypy>=1.3.0",
]

[project.scripts]
githubstats = "githubstats.cli:main"

[tool.setuptools]
package-dir = {"" = "src"}
packages = ["githubstats"]

[tool.black]
line-length = 88

[tool.isort]
profile = "black"

[tool.mypy]
python_version = "3.8"
warn_return_any = true
```
Note how we’re using the standard PEP 621 format that works with any modern build tool.
5. Implement the Code
Create the GitHub API client in `src/githubstats/github.py`:
```python
import requests
from typing import Any, Dict, Optional


class GitHubClient:
    """A simple GitHub API client."""

    def __init__(self, token: Optional[str] = None):
        """Initialize the GitHub client.

        Args:
            token: Optional GitHub API token for authenticated requests
        """
        self.base_url = "https://api.github.com"
        self.headers = {}
        if token:
            self.headers["Authorization"] = f"token {token}"

    def get_repo_stats(self, owner: str, repo: str) -> Dict[str, Any]:
        """Get basic statistics for a repository.

        Args:
            owner: Repository owner (user or organization)
            repo: Repository name

        Returns:
            Dictionary with repository statistics
        """
        url = f"{self.base_url}/repos/{owner}/{repo}"
        response = requests.get(url, headers=self.headers)
        response.raise_for_status()
        data = response.json()
        return {
            "name": data["full_name"],
            "stars": data["stargazers_count"],
            "forks": data["forks_count"],
            "open_issues": data["open_issues_count"],
            "watchers": data["subscribers_count"],
            "created_at": data["created_at"],
            "updated_at": data["updated_at"],
            "language": data["language"],
        }
```
Create the CLI interface in `src/githubstats/cli.py`:
```python
from typing import Optional

import click

from .github import GitHubClient


@click.command()
@click.argument("repo")
@click.option("--token", help="GitHub API token", envvar="GITHUB_TOKEN")
def main(repo: str, token: Optional[str] = None):
    """Fetch statistics for a GitHub repository.

    REPO should be in the format 'owner/repo', e.g., 'python/cpython'
    """
    try:
        owner, repo_name = repo.split("/")
    except ValueError:
        click.echo("Error: Repository should be in the format 'owner/repo'")
        return

    client = GitHubClient(token)
    try:
        stats = client.get_repo_stats(owner, repo_name)
        click.echo(f"\n{stats['name']} Statistics:")
        click.echo(f"⭐ Stars: {stats['stars']:,}")
        click.echo(f"🍴 Forks: {stats['forks']:,}")
        click.echo(f"⚠️ Open Issues: {stats['open_issues']:,}")
        click.echo(f"👀 Watchers: {stats['watchers']:,}")
        click.echo(f"🔤 Language: {stats['language']}")
        click.echo(f"📅 Created: {stats['created_at']}")
        click.echo(f"📝 Updated: {stats['updated_at']}\n")
    except Exception as e:
        click.echo(f"Error: Failed to fetch repository stats: {e}")


if __name__ == "__main__":
    main()
```
Update `src/githubstats/__init__.py`:

```python
"""A CLI tool to fetch GitHub repository statistics."""

__version__ = "0.1.0"
```
6. Test Your Package
Create a test file in `tests/test_github.py`:
```python
import pytest
from unittest.mock import patch, MagicMock

from githubstats.github import GitHubClient


@pytest.fixture
def mock_response():
    mock = MagicMock()
    mock.json.return_value = {
        "full_name": "test/repo",
        "stargazers_count": 100,
        "forks_count": 50,
        "open_issues_count": 10,
        "subscribers_count": 25,
        "created_at": "2022-01-01T00:00:00Z",
        "updated_at": "2022-02-01T00:00:00Z",
        "language": "Python",
    }
    return mock


def test_get_repo_stats(mock_response):
    with patch("requests.get", return_value=mock_response) as mock_get:
        client = GitHubClient()
        stats = client.get_repo_stats("test", "repo")

        mock_get.assert_called_once()
        assert stats["name"] == "test/repo"
        assert stats["stars"] == 100
        assert stats["forks"] == 50
```
Run your tests (with the virtual environment activated):

```bash
pytest -v
```
7. Try the CLI Locally
```bash
# Install your package in development mode
uv pip install -e .

# Run the CLI
githubstats python/cpython
```
8. Create a GitHub Actions Workflow
Create a file `.github/workflows/ci.yml`:
```yaml
name: Python CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10", "3.11"]
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install uv
        run: pip install uv
      - name: Install dependencies
        run: |
          uv venv
          source .venv/bin/activate
          uv pip install -e ".[dev]"
      - name: Lint with black and isort
        run: |
          source .venv/bin/activate
          black --check .
          isort --check .
      - name: Type check with mypy
        run: |
          source .venv/bin/activate
          mypy src
      - name: Test with pytest
        run: |
          source .venv/bin/activate
          pytest --cov=src

  publish:
    needs: test
    if: startsWith(github.ref, 'refs/tags/v')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Install build and twine
        run: pip install build twine
      - name: Build and publish
        run: |
          python -m build
          python -m twine upload dist/* --username __token__ --password ${{ secrets.PYPI_TOKEN }}
```
9. Build and Publish
```bash
# Build the package using the build module
python -m build

# This creates both wheel and sdist in the dist/ directory
ls dist/

# When ready to publish, use:
# python -m twine upload dist/*
```
This practical walkthrough shows you how to:
- Set up a complete Python package with modern tools
- Configure dependencies properly with a standard PEP 621 `pyproject.toml`, installed here with uv
- Implement a simple but useful CLI tool
- Create tests for your package
- Set up a CI/CD pipeline for automated testing and publishing
Try it yourself and adapt it for your own projects!
Conclusion
Modern Python packaging and dependency management are more powerful and accessible than ever. By adopting tools like Poetry, Hatch, or uv, and embracing standards like `pyproject.toml`, you’ll save time, avoid common pitfalls, and make your projects easy to share, maintain, and scale.
Whether you’re building a reusable library or a production application, mastering these fundamentals will level up your Python development workflow.
References
- PEP 518 – Specifying minimum build system requirements (`pyproject.toml`)
- PEP 621 – Storing project metadata in `pyproject.toml`
- PEP 735 – Dependency groups in `pyproject.toml`
- PEP 517 – A build-system independent format for source trees
- The Python Packaging User Guide
- Poetry documentation
- Hatch documentation
- Flit documentation
- uv documentation
- Python Packaging Authority (PyPA)
- My previous blog post on Python virtual environments
From requirements.txt to Modern Solutions
Many Python developers are familiar with the traditional `requirements.txt` approach:

```text
# requirements.txt
requests==2.28.1
pandas>=1.3.0,<2.0.0
matplotlib==3.5.3
```

While `requirements.txt` files are still valid and supported, modern approaches offer significant advantages:
Feature | requirements.txt | Modern Approaches (pyproject.toml) |
---|---|---|
Version pinning | Manual | Automated via lock files |
Dev vs. prod dependencies | Multiple files needed | Built-in dependency groups |
Reproducibility | Limited (no hashes) | Complete (hashes, platforms, Python versions) |
Environment management | Separate tools needed | Often integrated |
Metadata | Separate setup.py/cfg | Unified in one file |
When to Still Use requirements.txt
Despite the advantages of modern tools, there are still valid use cases for requirements.txt:
- Simple applications where packaging isn’t needed
- Deployment environments that specifically expect them
- CI/CD pipelines with established patterns
- Legacy projects where migration costs outweigh benefits
Most modern tools provide ways to work with `requirements.txt` files when needed:

```bash
# Generate a requirements.txt from a pyproject.toml with uv
uv pip compile pyproject.toml -o requirements.txt

# Install from requirements.txt with uv (much faster!)
uv pip install -r requirements.txt

# Or using pip-tools
pip-compile pyproject.toml -o requirements.txt
```
This gives you the best of both worlds: modern workflow for development but compatibility with environments that expect traditional files.
Hands-on: Migrating from requirements.txt to pyproject.toml with uv
Let’s walk through a real-world migration from `requirements.txt` to a modern setup with uv:
```bash
# Imagine we have this requirements.txt file
cat requirements.txt
# Output:
# flask==2.2.3
# requests==2.28.2
# python-dotenv==1.0.0
# pytest==7.3.1  # for testing

# 1. Create a pyproject.toml file
touch pyproject.toml

# 2. Add our dependencies to the pyproject.toml
cat > pyproject.toml << EOF
[build-system]
requires = ["setuptools>=42", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "my-project"
version = "0.1.0"
description = ""
authors = [
    {name = "Your Name", email = "your.email@example.com"}
]
readme = "README.md"
requires-python = ">=3.8"
dependencies = [
    "flask==2.2.3",
    "requests==2.28.2",
    "python-dotenv==1.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest==7.3.1",
]
EOF

# 3. Create and activate a virtual environment with uv
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# 4. Install dependencies with uv (much faster than pip)
uv pip install -e .         # Install main package dependencies
uv pip install -e ".[dev]"  # Install dev dependencies too

# 5. If needed, generate an updated requirements.txt for CI/CD
uv pip compile pyproject.toml -o requirements.txt
```
The migration to a standard, PEP 621 compliant `pyproject.toml` file makes your project more maintainable and compatible with all modern Python tools, while using uv gives you significant performance benefits.
Benefits you’ll immediately get:
- Dependencies and metadata declared in one standard file
- Dependency groups for dev vs. production dependencies
- Reproducible installs via compiled, pinned requirements (`uv pip compile`)
- Dramatically faster dependency installation
Hands-on: Same Task, Different Tools
Let’s compare how to accomplish the same task—adding a dependency and running a script—using different package managers:
Traditional Approach (pip + venv)
```bash
# Create and activate virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install packages
pip install requests
pip freeze > requirements.txt

# Run script
python -c "import requests; print(requests.get('https://httpbin.org/json').json())"
```
Using Poetry
```bash
# Create project and add dependency
mkdir poetry-demo && cd poetry-demo
poetry init --no-interaction
poetry add requests

# Run script
poetry run python -c "import requests; print(requests.get('https://httpbin.org/json').json())"

# Or open a shell in the virtual environment
poetry shell
python -c "import requests; print(requests.get('https://httpbin.org/json').json())"
```
Using PDM
```bash
# Create project and add dependency
mkdir pdm-demo && cd pdm-demo
pdm init
pdm add requests

# Run script
pdm run python -c "import requests; print(requests.get('https://httpbin.org/json').json())"
```
Using uv
```bash
# Install uv if you haven't already
pip install uv

# Create virtual environment and install packages
mkdir uv-demo && cd uv-demo
uv venv
uv pip install requests

# Activate and run
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
python -c "import requests; print(requests.get('https://httpbin.org/json').json())"
```
Try these examples yourself to get a feel for each tool’s workflow and see which one fits your style best. Notice how the modern tools handle environment creation automatically and simplify dependency management.
Common Dependency Issues and Solutions
Let’s tackle some real-world dependency problems you might encounter:
Issue 1: Dependency Conflicts
Problem: You get an error like “Cannot install X and Y because these package versions have conflicting dependencies.”
Solution:
```bash
# Using Poetry, inspect the dependency tree
poetry show --tree

# Using pip, install a dependency tree visualizer
pip install pipdeptree
pipdeptree -r -p problematic_package

# uv needs no special flag: when resolution fails, its error message
# already prints the chain of conflicting requirements
```
What to look for: Find packages that require incompatible versions of the same dependency, then either:
- Update your direct dependencies to versions with compatible requirements
- Add version constraints to force compatibility
- Use `pip-compile` or similar tools to pin specific working versions
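As a toy model of what these tools are doing when they flag a conflict, here is a sketch that spots packages pinned to different exact versions across requirement lists (the input data and function name are made up for illustration):

```python
import re


def find_conflicts(requirement_sets):
    """Report packages pinned to different exact versions across several
    requirement lists. Toy model: only '==' pins are considered."""
    pins = {}
    for consumer, reqs in requirement_sets.items():
        for req in reqs:
            match = re.match(r"([A-Za-z0-9_.-]+)==([\w.]+)", req)
            if match:
                name, version = match.groups()
                # Record which consumer asked for which exact version
                pins.setdefault(name, {}).setdefault(version, []).append(consumer)
    # A package pinned to more than one distinct version is a conflict
    return {name: vers for name, vers in pins.items() if len(vers) > 1}


conflicts = find_conflicts({
    "app": ["requests==2.28.1"],
    "lib": ["requests==2.31.0", "click==8.1.3"],
})
print(conflicts)
```

Real resolvers work with full version ranges rather than exact pins, but the principle is the same: intersect every consumer's constraints and report when the intersection is empty.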
Issue 2: Different Behavior in Development vs. Production
Problem: Your code works locally but fails in production with import errors.
Solution: Use lock files to ensure identical dependencies:
```bash
# Poetry: Make sure you commit poetry.lock
poetry lock
git add poetry.lock
git commit -m "Update dependency lock file"

# In production
git pull                    # Get the updated lock file
poetry install --only main  # Skip dev dependencies (Poetry >= 1.2; formerly --no-dev)
```
Issue 3: Slow Package Installation
Problem: Installing packages takes too long, especially in CI/CD pipelines.
Solution: Use uv or a binary cache:
```bash
# Install uv for faster package installation
pip install uv
uv pip install -r requirements.txt

# Or use pip's binary cache
pip install --cache-dir=.pip-cache -r requirements.txt
```
Issue 4: Platform-Specific Packages
Problem: Packages work on your machine but fail on another platform.
Solution: Specify platform-specific dependencies:
```toml
# In pyproject.toml with Poetry
[tool.poetry.dependencies]
python = "^3.8"
package_name = {version = "^1.0", platform = "linux"}
windows_only_package = {version = "^1.0", markers = "sys_platform == 'win32'"}
```
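If you are on standard PEP 621 metadata rather than Poetry's own table, the same effect is achieved with PEP 508 environment markers appended to the requirement strings. A sketch (the package names are just examples of platform-specific libraries):

```toml
[project]
dependencies = [
    "pywin32>=306; sys_platform == 'win32'",
    "uvloop>=0.17; sys_platform != 'win32'",
]
```

Because markers are part of the requirement string itself, they work identically in `requirements.txt` files, dependency groups, and extras.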
These hands-on examples should help you diagnose and solve real dependency problems in your projects.