# Contributing
We have issues labeled as Good First Issue and Help Wanted, which are good opportunities for new contributors.
## Setup
Rust and a C compiler are required to build uv.
On Ubuntu and other Debian-based distributions, you can install a C compiler with:
```shell
sudo apt install build-essential
```
On Fedora-based distributions, you can install a C compiler with:
```shell
sudo dnf install gcc
```
## Testing
For running tests, we recommend `nextest`.

If tests fail due to a mismatch in the JSON Schema, run:

```shell
cargo dev generate-json-schema
```
### Python
Testing uv requires multiple specific Python versions; they can be installed with:
```shell
cargo run python install
```
The storage directory can be configured with `UV_PYTHON_INSTALL_DIR`. (It must be an absolute path.)
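For example, to keep the test interpreters inside your checkout (the `.uv-pythons` directory name here is just an illustration), you could derive an absolute path from `$PWD`:

```shell
# Store managed interpreters in a project-local directory; $PWD keeps the
# path absolute, as uv requires. (.uv-pythons is an arbitrary name.)
export UV_PYTHON_INSTALL_DIR="$PWD/.uv-pythons"

# Then install the test interpreters as above:
#   cargo run python install
```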
### Snapshot testing
uv uses `insta` for snapshot testing. It's recommended (but not necessary) to use
`cargo-insta` for a better snapshot review experience. See the
installation guide for more information.
In tests, you can use the `uv_snapshot!` macro to simplify creating snapshots for uv commands. For
example:
```rust
#[test]
fn test_add() {
    let context = TestContext::new("3.12");
    uv_snapshot!(context.filters(), context.add().arg("requests"), @"");
}
```
To run and review a specific snapshot test:
```shell
cargo test --package <package> --test <test> -- <test_name> -- --exact
cargo insta review
```
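If you'd rather accept snapshot changes without the interactive review step, `insta` also honors the `INSTA_UPDATE` environment variable; a sketch:

```shell
# INSTA_UPDATE=always tells insta to write updated snapshot values directly,
# skipping the .snap.new/review workflow. Use with care: it accepts every
# change unconditionally.
export INSTA_UPDATE=always

# Then re-run the test, e.g.:
#   cargo test --package <package> --test <test> -- <test_name> -- --exact
```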
### Local testing
You can invoke your development version of uv with `cargo run -- <args>`. For example:
```shell
cargo run -- venv
cargo run -- pip install requests
```
### Running inside a Docker container
Source distributions can run arbitrary code on build and can make unwanted modifications to your system ("Someone's Been Messing With My Subnormals!" on Blogspot, "nvidia-pyindex" on PyPI), which can even occur when just resolving requirements. To prevent this, there's a Docker container you can run commands in:
```shell
docker build -t uv-builder -f crates/uv-dev/builder.dockerfile --load .
# Build for musl to avoid glibc errors; this might not be required with your OS version
cargo build --target x86_64-unknown-linux-musl --profile profiling
docker run --rm -it -v $(pwd):/app uv-builder /app/target/x86_64-unknown-linux-musl/profiling/uv-dev resolve-many --cache-dir /app/cache-docker /app/scripts/popular_packages/pypi_10k_most_dependents.txt
```
We recommend using this container if you don't trust the dependency tree of the package(s) you are trying to resolve or install.
## Profiling and Benchmarking
Please refer to Ruff's Profiling Guide; it applies to uv, too.
We provide diverse sets of requirements for testing and benchmarking the resolver in
`scripts/requirements` and for the installer in `scripts/requirements/compiled`.
You can use `scripts/benchmark` to benchmark predefined workloads between uv versions and with other
tools, e.g., from the `scripts/benchmark` directory:
```shell
uv run resolver \
    --uv-pip \
    --poetry \
    --benchmark \
    resolve-cold \
    ../scripts/requirements/trio.in
```
### Analyzing concurrency
You can use tracing-durations-export to
visualize parallel requests and find any spots where uv is CPU-bound. Example usage, with uv and
uv-dev respectively:
```shell
RUST_LOG=uv=info TRACING_DURATIONS_FILE=target/traces/jupyter.ndjson cargo run --features tracing-durations-export --profile profiling -- pip compile scripts/requirements/jupyter.in
RUST_LOG=uv=info TRACING_DURATIONS_FILE=target/traces/jupyter.ndjson cargo run --features tracing-durations-export --bin uv-dev --profile profiling -- resolve jupyter
```
### Trace-level logging
You can enable trace-level logging using the `RUST_LOG` environment variable, e.g.:

```shell
RUST_LOG=trace uv
```
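`RUST_LOG` accepts comma-separated `tracing` filter directives, so trace output can also be scoped to individual crates rather than enabled globally; the crate names below are illustrative assumptions:

```shell
# A directive list: info-level logs for the uv crate, trace-level for the
# resolver only (crate names are assumptions for illustration).
export RUST_LOG="uv=info,uv_resolver=trace"

# Then run any uv command, e.g.:
#   cargo run -- pip compile scripts/requirements/jupyter.in
```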
## Documentation
To preview any changes to the documentation locally:
- Install the Rust toolchain.
- Run `cargo dev generate-all` to update any auto-generated documentation.
- Run the development server with:

  ```shell
  # For contributors.
  uvx --with-requirements docs/requirements.txt -- mkdocs serve -f mkdocs.public.yml

  # For members of the Astral org, which has access to MkDocs Insiders via sponsorship.
  uvx --with-requirements docs/requirements-insiders.txt -- mkdocs serve -f mkdocs.insiders.yml
  ```
The documentation should then be available locally at http://127.0.0.1:8000/uv/.
To update the documentation dependencies, edit `docs/requirements.in` and
`docs/requirements-insiders.in`, then run:

```shell
uv pip compile docs/requirements.in -o docs/requirements.txt --universal -p 3.12
uv pip compile docs/requirements-insiders.in -o docs/requirements-insiders.txt --universal -p 3.12
```
Documentation is deployed automatically on release by publishing to the Astral documentation repository, which itself deploys via Cloudflare Pages.
After making changes to the documentation, format the markdown files with:
```shell
npx prettier --prose-wrap always --write "**/*.md"
```
Note that the command above requires Node.js and npm to be installed on your system. As an alternative, you can run this command using Docker:
```shell
docker run --rm -v .:/src/ -w /src/ node:alpine npx prettier --prose-wrap always --write "**/*.md"
```
## Releases
Releases can only be performed by Astral team members.
Changelog entries and version bumps are automated. First, run:
```shell
./scripts/release.sh
```
Then, editorialize the `CHANGELOG.md` file to ensure entries are consistently styled.
Then, open a pull request, e.g., `Bump version to ...`.
Binary builds will automatically be tested for the release.
After merging the pull request, run the
release workflow with the version
tag. Do not include a leading `v`. The release will automatically be created on GitHub after
everything else publishes.