Python Monorepos and uv: When Simple Imports Become Complex Puzzles

Alex Pill · 6 min read
Tags: python, uv, monorepo, debugging, fastapi, cli, project-structure

"It should be simple," I thought, staring at my screen. "Just from lib import monitoring_types in my FastAPI application. How hard can it be?"

Famous last words. What followed was a deep dive into Python packaging, workspace configurations, and the subtle art of making different parts of a monorepo actually talk to each other.

The Project: Page Monitoring Made Modular

Let me set the scene. I was building a page monitoring system - think of it as a personal watchdog for websites that alerts you when content changes. The architecture seemed straightforward:

  • lib/: A shared library containing all the core monitoring logic, data types, and utilities
  • api/: A FastAPI server exposing monitoring functionality via REST endpoints
  • cli/: A command-line interface for direct monitoring operations

Classic separation of concerns. The library handles the business logic, while the API and CLI are just different interfaces to the same functionality. In Node.js land, this would be a standard Lerna or Nx monorepo setup. In Python with uv... well, that's where things got interesting.

The First Attempt: "It Should Just Work"

My initial approach was optimistic. I structured the project with three separate packages, each with its own pyproject.toml. The lib would be installed as a dependency in both the API and CLI packages. Simple, clean, logical.

page-monitoring/
├── lib/           # Core monitoring library
├── api/           # FastAPI server  
├── cli/           # Command-line interface
└── pyproject.toml # Root workspace config

My first uv run api/src/main.py gave me a reality check:

ModuleNotFoundError: No module named 'lib'
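Before touching any configuration, it helps to check what the interpreter can actually see. A small diagnostic sketch (the can_import helper is mine, not part of the project):

```python
import importlib.util
import sys

def can_import(name: str) -> bool:
    """Return True if a top-level module resolves on the current sys.path."""
    return importlib.util.find_spec(name) is not None

# Show where Python is looking -- with uv, this should include the
# workspace's .venv site-packages once lib is installed as a member.
for path in sys.path:
    print(path)
print("lib importable:", can_import("lib"))
```

If can_import("lib") prints False, the problem is environment resolution, not your code.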

The Workspace Wrestling Match

This is where my developer debugging instincts kicked in. The error was clear - the API couldn't find the lib package. But why? The dependency was declared in api/pyproject.toml:

dependencies = [
    "fastapi[standard]>=0.116.1",
    "lib",
]

The issue wasn't the declaration - it was making uv understand that lib was a workspace member, not a PyPI package. This required the [tool.uv.sources] section:

[tool.uv.sources]
lib = { workspace = true }
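Put together, the consuming package needs both pieces: the dependency declaration and the workspace source. A sketch of what the full api/pyproject.toml might look like (any names and versions beyond the ones quoted above are illustrative):

```toml
[project]
name = "api"
version = "0.1.0"
dependencies = [
    "fastapi[standard]>=0.116.1",
    "lib",
]

# Tell uv to resolve "lib" from the workspace instead of PyPI.
[tool.uv.sources]
lib = { workspace = true }
```

Without the [tool.uv.sources] entry, uv treats "lib" as a PyPI name and resolution either fails or, worse, pulls in an unrelated package.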

But even with workspace sources configured, I kept running into context issues: running uv run from the wrong directory, commands not finding the right Python environment, imports working in development but failing when packaged.

The Command Orchestration Problem

Here's where the real pain point emerged. What I wanted was simple developer ergonomics:

# From the project root, just run:
uv run cli --help                    # Use the CLI tool
uv run fastapi dev api/app          # Start the dev server  

But what I was getting instead was a constant dance of directory changes:

cd api && uv run fastapi dev app/main.py
cd ../cli && uv run python -m src.main --help
cd ..

The cognitive overhead was killing my flow. Every command required context switching - not just mentally, but literally navigating directories. For a developer used to npm run dev or cargo run working from anywhere in the project, this felt like a step backwards.

The Import Puzzle: Development vs Distribution

One of the most confusing aspects was the difference between how imports work during development versus when packages are actually built and distributed. During development with editable installs, Python's import system works one way. When you build wheels and install them, it works differently.

The solution involved carefully structuring the [tool.hatch.build.targets.wheel] configurations to ensure that packages were built correctly, while maintaining the workspace relationships that make development possible.

For the lib package, this meant:

[tool.hatch.build.targets.wheel]
packages = ["src/lib"]

While for the consuming packages (API and CLI):

[tool.hatch.build.targets.wheel]
packages = ["src"]
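Concretely, packages = ["src/lib"] means the built wheel ships a top-level lib package taken from lib/src/lib/, which is exactly what lets from lib import monitoring_types resolve after installation. As a hypothetical sketch of what such a module might contain (the class and field names here are mine; the real monitoring_types will differ):

```python
# lib/src/lib/monitoring_types.py -- hypothetical contents
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class PageSnapshot:
    """One observation of a monitored page."""
    url: str
    content_hash: str
    fetched_at: datetime

@dataclass
class ChangeEvent:
    """Emitted when two snapshots of the same URL differ."""
    url: str
    previous_hash: str
    current_hash: str

def detect_change(old: PageSnapshot, new: PageSnapshot) -> Optional[ChangeEvent]:
    """Return a ChangeEvent if the page content changed, else None."""
    if old.url == new.url and old.content_hash != new.content_hash:
        return ChangeEvent(new.url, old.content_hash, new.content_hash)
    return None
```

The point is the path mapping, not the types: the directory named in packages becomes the importable package, so the consuming packages' packages = ["src"] produces a different top-level name than the lib package's packages = ["src/lib"].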

The Debugging Mindset Applied

What saved me through this process was treating it like debugging a complex software issue:

  1. Isolate the problem: Start with the simplest possible import case
  2. Understand the tool: Read uv documentation thoroughly, not just skim it
  3. Test assumptions: Verify each step of the workspace resolution
  4. Document the solution: Future me will thank present me

The breakthrough came when I realized I was thinking about Python packages the way I think about JavaScript modules - but they're fundamentally different beasts. Python's import system, combined with modern packaging tools like uv, requires a more explicit approach to declaring relationships between packages.

The Working Solution

The breakthrough came when I properly configured the root pyproject.toml to expose the sub-packages' scripts at the workspace level:

Root pyproject.toml:

[project]
name = "page-monitoring"
version = "0.1.0"
dependencies = ["cli", "api"]
 
[tool.uv.workspace]
members = ["lib", "cli", "api"]
 
[tool.uv.sources]
cli = { workspace = true }
api = { workspace = true }  
lib = { workspace = true }
 
[project.scripts]
cli = "cli.src.main:cli"
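For the cli = "cli.src.main:cli" entry point to work, there has to be a callable named cli in cli/src/main.py. A minimal argparse-based sketch of that module (the real project may use click or typer; the monitor subcommand mirrors the usage shown below):

```python
# cli/src/main.py -- hypothetical entry point behind "cli.src.main:cli"
import argparse

def cli(argv=None) -> int:
    parser = argparse.ArgumentParser(prog="cli", description="Page monitoring CLI")
    subcommands = parser.add_subparsers(dest="command")

    monitor = subcommands.add_parser("monitor", help="Watch a URL for changes")
    monitor.add_argument("url", help="Page to monitor")

    args = parser.parse_args(argv)
    if args.command == "monitor":
        # In the real project this would call into the shared lib package.
        print(f"monitoring {args.url}")
    else:
        parser.print_help()
    return 0
```

When you run uv run cli, uv generates a console script that imports this module and calls cli() for you.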

API pyproject.toml:

dependencies = [
    "fastapi[standard]>=0.116.1",
    "lib",
]
 
[tool.uv.sources]
lib = { workspace = true }

The key insight was that by declaring the workspace packages as dependencies in the root project and properly configuring the script entry points, uv could resolve everything from the top level.

Commands That Actually Work

With the workspace properly configured, the dream workflow finally became reality:

# Use the CLI tool from project root
uv run cli --help
uv run cli monitor https://example.com
 
# Start the FastAPI dev server from project root  
uv run fastapi dev api/app
 
# Install all workspace dependencies
uv sync
 
# The old way that I wanted to avoid:
# cd api && uv run fastapi dev app/main.py
# cd ../cli && uv run python -m src.main --help

The magic was in the [project.scripts] section combined with workspace dependencies. By making the root project depend on cli and api, and exposing their entry points as scripts, uv could resolve everything in the workspace context.

The AI Advantage

Throughout this process, I found AI tools incredibly helpful for understanding the nuances of uv's workspace system. Being able to paste error messages and get contextualized explanations of Python packaging concepts accelerated the debugging process significantly.

AI doesn't replace the systematic debugging approach, but it can quickly surface relevant documentation sections, suggest configuration patterns, and help interpret cryptic error messages.

Lessons for Future Monorepos

If I were starting a similar project today, here's what I'd do differently:

  1. Configure workspace dependencies at the root level - make your root project depend on the sub-packages
  2. Expose scripts in the root pyproject.toml for the commands you use most
  3. Test the command ergonomics early - if you're cd-ing around constantly, something's wrong
  4. Document the working commands immediately when you find them

The Python packaging ecosystem is powerful but complex. Tools like uv are making it better, but they still require understanding the underlying concepts to use effectively.

The Developer Parallel

This experience reinforced something I've learned throughout my career: the debugging skills we develop for code apply everywhere. Whether you're chasing a race condition in a distributed system, diagnosing hardware failures, or figuring out why your monorepo imports don't work, the methodology remains consistent.

Break down the problem, test your assumptions, understand your tools, and document your solutions. The domain changes, but the approach stays the same.


The page monitoring system is now running smoothly, with a clean separation between the core library, API server, and CLI interface. The imports work, the commands run, and future contributors (including future me) have a clear path to understanding the project structure.

Sometimes the most valuable debugging sessions aren't about fixing bugs in our code - they're about understanding the tools and systems that make our code possible in the first place.

Working on a Python monorepo or struggling with uv workspace configurations? I'd love to hear about your experiences and solutions. Drop me a line and let's share our debugging adventures.

This article was written with AI assistance.