## Summary
Allows, e.g., `UV_SYSTEM_PYTHON=false uv pip install --python
.venv/bin/python`.
This was intended to work after fixing
https://github.com/astral-sh/uv/issues/3000, but I think I misdiagnosed
the scope when closing that issue, and the linked PR there only fixed
some _other_ problems around index URLs.
The only thing we really lose here is that we no longer error when
`--break-system-packages` is provided without `--system`, but we can
enforce that elsewhere if we want.
Closes https://github.com/astral-sh/uv/issues/3829.
## Summary
This PR makes a variety of invalid states unrepresentable by changing
`Preference` to require a `PackageName` and `Version`, rather than
accepting a generic `Requirement`. There should be no meaningful
behavior changes.
## Summary
We actually _already_ ignore these (preferences only apply to versions,
not URLs); it just happens later on. This PR thus just avoids crashing.
The behavior is unchanged.
Closes #3822.
Closes #3784
The cache did not use an absolute path. I'm not sure this is actually a
new bug, as this code wasn't touched in #3266 but perhaps there was a
slight difference in the paths we were passing around. Note that just
canonicalizing the path as soon as we see it doesn't work, because then
we jump out of the virtual environment into the system interpreter.
## Test plan
```
❯ uv venv
Using Python 3.12.3 interpreter at: /opt/homebrew/opt/python@3.12/bin/python3.12
Creating virtualenv at: .venv
Activate with: source .venv/bin/activate
❯ uv pip install anyio
Resolved 3 packages in 81ms
Installed 3 packages in 4ms
+ anyio==4.3.0
+ idna==3.7
+ sniffio==1.3.1
❯ mkdir uv-issue-3784 && cd uv-issue-3784
❯ uv venv
Using Python 3.12.3 interpreter at: /opt/homebrew/opt/python@3.12/bin/python3.12
Creating virtualenv at: .venv
Activate with: source .venv/bin/activate
❯ gcm
Switched to branch 'main'
Your branch is up to date with 'origin/main'.
❯ cargo run -q -- pip list -v -p .venv
DEBUG Checking for Python interpreter in directory `.venv`
TRACE Cached interpreter info for Python 3.12.3, skipping probing: .venv/bin/python3
DEBUG Using Python 3.12.3 environment at .venv/bin/python3
Package Version
------- -------
anyio 4.3.0
idna 3.7
sniffio 1.3.1
❯ cd uv-issue-3784
❯ cargo run -q -- pip list -v -p .venv
DEBUG Checking for Python interpreter in directory `.venv`
TRACE Cached interpreter info for Python 3.12.3, skipping probing: .venv/bin/python3
DEBUG Using Python 3.12.3 environment at /Users/zb/workspace/uv/.venv/bin/python3
Package Version
------- -------
anyio 4.3.0
idna 3.7
sniffio 1.3.1
❯ cd ..
❯ gco zb/fix-relative-venv
Switched to branch 'zb/fix-relative-venv'
❯ cargo run -q -- pip list -v -p .venv
DEBUG Checking for Python interpreter in directory `.venv`
TRACE Cached interpreter info for Python 3.12.3, skipping probing: .venv/bin/python3
DEBUG Using Python 3.12.3 environment at .venv/bin/python3
Package Version
------- -------
anyio 4.3.0
idna 3.7
sniffio 1.3.1
❯ cd uv-issue-3784
❯ cargo run -q -- pip list -v -p .venv
DEBUG Checking for Python interpreter in directory `.venv`
TRACE Cached interpreter info for Python 3.12.3, skipping probing: .venv/bin/python3
DEBUG Using Python 3.12.3 environment at .venv/bin/python3
```
## Summary
Related to https://github.com/astral-sh/uv/issues/3818. We should
_always_ include the package name if we know it's not a file path, even
if it starts with an environment variable.
## Summary
I haven't tested on Windows yet, but the idea here is that we should use
a portable representation when printing paths.
I decided to limit the scope here to paths that we write to output
files.
Closes https://github.com/astral-sh/uv/issues/3800.
## Summary
It turns out that in the
[spec](https://packaging.python.org/en/latest/specifications/binary-distribution-format/#file-name-convention),
if a wheel filename includes a build tag, then we need to use it to
break ties. This PR implements that behavior. (Previously, we dropped
the build tag entirely.)
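As a rough illustration of the tie-breaking rule (hypothetical types, not uv's internals), a build tag compares as its leading integer followed by the remaining string:
```rust
// Minimal sketch: per the wheel filename spec, a build tag sorts as its
// numeric prefix followed by the rest of the string, so "70" beats "63".
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
struct BuildTag(u64, String);

fn parse_build_tag(tag: &str) -> Option<BuildTag> {
    // The spec requires a build tag to start with a digit.
    let digits: String = tag.chars().take_while(|c| c.is_ascii_digit()).collect();
    let number: u64 = digits.parse().ok()?;
    Some(BuildTag(number, tag[digits.len()..].to_string()))
}

fn main() {
    let older = parse_build_tag("63").unwrap();
    let newer = parse_build_tag("70").unwrap();
    // Given otherwise equally compatible wheels, prefer the higher build tag.
    assert!(newer > older);
}
```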
Closes #3779.
## Test Plan
Run: `cargo run pip install -i https://pypi.anaconda.org/intel/simple
mkl_fft==1.3.8 --python-platform linux --python-version 3.10`. This now
resolves without error. Previously, we selected build tag 63 of
`mkl_fft==1.3.8`, which led to an incompatibility with NumPy. Now, we
select build tag 70.
When parsing requirements from any source, directly parse the URL parts
(and reject unsupported URLs) instead of parsing URL parts at a later
stage. This removes a bunch of error branches and concludes the work of
parsing URL parts once and passing them around everywhere.
Many usages of the assembled `VerbatimUrl` remain, but these can be
removed incrementally.
Please review commit-by-commit.
## Summary
This seems to be one of the most consistent benchmark cases we have in
terms of standard deviation:
```
➜ hyperfine "target/profiling/main pip compile scripts/requirements/airflow.in" --runs 200
Benchmark 1: target/profiling/main pip compile scripts/requirements/airflow.in
Time (mean ± σ): 292.6 ms ± 6.6 ms [User: 414.1 ms, System: 194.2 ms]
Range (min … max): 282.7 ms … 320.1 ms 200 runs
```
For smaller benchmarks, scispacy and dtlssocket seem to be a bit more
consistent than our current jupyter benchmark, but it hasn't given us
any problems so I'll leave it for now.
e.g. in `uv pip install anyio -v` this message is just noise
```
DEBUG Requirement satisfied: anyio
DEBUG Requirement satisfied: idna>=2.8
DEBUG Requirement satisfied: sniffio>=1.1
DEBUG All editables satisfied:
```
```
DEBUG Acquired lock for `.venv`
```
instead of
```
DEBUG Trying to lock if free: .venv/.lock
```
At trace level, this includes the pre-lock message as well
```
TRACE Checking lock for `.venv`
DEBUG Acquired lock for `.venv`
```
We'll still display the lock file path when something goes wrong.
The venv subcommand requires a system interpreter. The tests' Python
path discovery would previously allow a venv interpreter, failing the
venv tests that no longer have a system interpreter.
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
Allows requesting additional transitive dependencies when running a
tool.
e.g. `uv tool run -v --with anyio ruff check example.py` (Why would you
want anyio with ruff? Who knows 😄)
The motivation for doing this now is that I think the first
implementation of `uv tool install` might just shim into `uv tool run`
with pinned dependencies? Regardless, this is something we need in the
long run, and it's a trivial addition right now.
## Summary
We now show yanks as part of the resolution diagnostics, so they now
appear for `sync`, `install`, `compile`, and any other operations.
Further, they'll also appear for cached packages (but not packages that
are _already_ installed).
Closes https://github.com/astral-sh/uv/issues/3768.
Closes #3766.
We usually infer that the package the tool is pulled from has the same
name as the tool itself, but that's not always the case. This allows users to
provide a custom package.
## Summary
This PR removes most of the code in `project/mod.rs` in favor of the
routines exposed in `pip/operations.rs`.
I think we can do a lot more to add more abstraction here and reduce the
verbosity, but for now it deduplicates a _ton_ of logic. The remaining
logic is just instantiating settings etc.
e.g. this error message is not great
```
❯ uv venv --python 3.12.2
× No interpreter found for Python 3.12.2 in provided path, search path, managed toolchains, or parent interpreter
```
e.g.
```
❯ echo "anyio" | cargo run -q -- pip compile - --python 3.9 -v
DEBUG Searching for interpreter that fulfills Python @ 3.9
DEBUG Found a virtual environment at: /Users/zb/workspace/uv/.venv
DEBUG Using Python 3.9.18 interpreter at bin/cpython-3.9.18-macos-aarch64-none/install/bin/python3 for builds
```
e.g. instead of
```
❯ uv venv --python pypy@3.10
× No interpreter found for pypy 3.10 in search path
```
we say
```
❯ uv venv --python pypy@3.10
× No interpreter found for PyPy 3.10 in search path
```
Closes #2222
Closes https://github.com/astral-sh/uv/issues/2058
Replaces https://github.com/astral-sh/uv/pull/2338
See also https://github.com/astral-sh/uv/issues/2649
We use an environment variable (`UV_INTERNAL__PARENT_INTERPRETER`) to
track the invoking interpreter when `python -m uv` is used. The parent
interpreter is preferred over all other sources (though it will be
skipped if it does not meet a `--python` request or if `--system` is
used and it belongs to a virtual environment). We warn if `--system` is
not provided and this interpreter would mutate system packages, but
allow it.
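A minimal sketch of how such an override might be consulted before other discovery sources (the variable name comes from this description; the surrounding code is illustrative only):
```rust
use std::env;
use std::path::PathBuf;

// Hypothetical helper: read the parent-interpreter override, if set.
fn parent_interpreter() -> Option<PathBuf> {
    env::var_os("UV_INTERNAL__PARENT_INTERPRETER").map(PathBuf::from)
}

fn main() {
    match parent_interpreter() {
        // Prefer the invoking interpreter when `python -m uv` recorded it...
        Some(python) => println!("using parent interpreter at {}", python.display()),
        // ...otherwise fall through to the usual discovery sources.
        None => println!("falling back to normal interpreter discovery"),
    }
}
```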
Previously, we enforced `SystemPython` outside of the interpreter
discovery exclusively with source selection. Now, we perform additional
filtering of interpreters depending on if they are a virtual
environment. This should not change any existing behavior, but will make
it much easier to have consistent behavior in ambiguous cases like
https://github.com/astral-sh/uv/pull/3736#discussion_r1610072262 where a
source could provide either a system interpreter or virtual environment
interpreter.
## Summary
This PR takes the functions used in `pip install`, moves them into a
common module, and then replaces all the `pip sync` logic with calls
into those functions. The net effect is that `pip install` and `pip
sync` share far more code and demonstrate much more consistent behavior.
Closes https://github.com/astral-sh/uv/issues/3555.
## Summary
This PR adds editables using a new source type (`editable+...`), and
then extracts the editables from the lockfile in `uv sync`.
Closes https://github.com/astral-sh/uv/issues/3695.
Otherwise `uv venv --python 3.12` can prefer `.venv/bin/python` over the
system Python (which is always used if you don't provide a `--python`
flag). I would find this confusing as a user.
Updates our executable name searches to support implementation names,
i.e. `cpython` and `pypy`, and adds support for PyPy.
We might want to _not_ support searching for `cpython` because that's
non-standard?
Adds `--offline` support to `uv tool run` and `uv run` because I needed
it on the airplane today.
I think we should move `--offline` to the global settings like
`--native-tls`.
## Summary
Closes https://github.com/astral-sh/uv/issues/3715.
## Test Plan
```
❯ echo "/../test" | cargo run pip compile -
error: Couldn't parse requirement in `-` at position 0
Caused by: path could not be normalized: /../test
/../test
^^^^^^^^
❯ echo "-e /../test" | cargo run pip compile -
error: Invalid URL in `-`: `/../test`
Caused by: path could not be normalized: /../test
Caused by: cannot normalize a relative path beyond the base directory
```
This is mostly a shorter version of `uv run` that infers a requirement
name from the command. The main goal here is to do the smallest amount
of work necessary to get #3560 started.
Closes #3613
e.g.
```shell
$ uv tool run -- ruff check
warning: `uv tool run` is experimental and may change without warning.
Resolved 1 package in 34ms
Installed 1 package in 2ms
+ ruff==0.4.4
error: Failed to parse example.py:1:5: Expected an expression
example.py:1:5: E999 SyntaxError: Expected an expression
Found 1 error.
```
Updates our Python interpreter discovery to conform to the rules
described in #2386, please see that issue for a full description of the
behavior. Briefly, we now will search for interpreters that satisfy a
requested version without stopping at the first Python executable.
Additionally, if retrieving information about an interpreter fails we
will continue to search for a working interpreter. We also add the
plumbing necessary to request Python implementations other than CPython,
though we do not add support for other implementations at this time.
A major internal goal of this work is to prepare for user-facing managed
toolchains i.e. fetching a requested version during `uv run`. These APIs
are not introduced, but there is some managed toolchain handling as
required for our test suite.
Some noteworthy implementation changes:
- The `uv_interpreter::find_python` module has been removed in favor of
a `uv_interpreter::discovery` module.
- There are new types to help structure interpreter requests and track
sources
- Executable discovery is implemented as a big lazy iterator and is a
central authority for source precedence
- `uv_interpreter::Error` variants were split into scoped types in each
module
- There's much more unit test coverage, but not for Windows yet
Remaining work:
- [x] Write new test cases
- [x] Determine correct behavior around executables in the current
directory
- _Future_: Combine `PythonVersion` and `VersionRequest`
- _Future_: Consider splitting `ManagedToolchain` into local and remote
variants
- _Future_: Add Windows unit test coverage
- _Future_: Explore behavior around implementation precedence (i.e.
CPython over PyPy)
Refactors split into:
- #3329
- #3330
- #3331
- #3332
Closes #2386
Instead of saying
> we can conclude that you require==0a0.dev0 and
pandas-stubs==2.0.3.230814 are incompatible.
we'll say
> we can conclude that your requirements and pandas-stubs==2.0.3.230814
are incompatible.
Closes #3710
I'm not sure how to get unit test coverage for this; I might look into
that. Ideally we'd skip this branch entirely?
I don't really understand why this only happens on Windows clippy and
not on Linux too, but as usual, boxing the error variant fixes it.
Fixup for #3585
Add minimal support for workspace discovery, only used for determining
paths in the bluejay commands.
We can now discover the workspace structure, namely that the
`pyproject.toml` of a package belongs to a workspace `pyproject.toml`
with members and exclusions. The globbing logic is inspired by Cargo. We
don't resolve `workspace = true` metadata declarations yet.
Pubgrub stores incompatibilities as (package name, version range)
tuples, meaning it needs to clone the package name for each
incompatibility, and each non-borrowed operation on incompatibilities.
https://github.com/astral-sh/uv/pull/3673 made me realize that
`PubGrubPackage` has gotten large (expensive to copy), so like `Version`
and other structs, I've added an `Arc` wrapper around it.
It's a pity clippy forbids `.deref()`; it's less opaque than `&**` and
has IDE support (clicking on `.deref()` jumps to the right impl).
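As a rough sketch of the pattern (names follow the description above; the fields are illustrative), wrapping the large enum in an `Arc` makes cloning a pointer copy:
```rust
use std::sync::Arc;

#[derive(Debug, PartialEq, Eq, Hash)]
enum PubGrubPackageInner {
    Root,
    Package { name: String, extra: Option<String> },
}

// The wrapper clones cheaply: only the reference count is bumped.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
struct PubGrubPackage(Arc<PubGrubPackageInner>);

fn main() {
    let package = PubGrubPackage(Arc::new(PubGrubPackageInner::Package {
        name: "numpy".to_string(),
        extra: None,
    }));
    let copy = package.clone();
    assert_eq!(package, copy);
}
```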
## Benchmarks
It looks like this matters most for complex resolutions, I assume
because they carry larger `PubGrubPackageInner::Package` and
`PubGrubPackageInner::Extra` types.
```bash
hyperfine --warmup 5 "./uv-main pip compile -q ./scripts/requirements/jupyter.in" "./uv-branch pip compile -q ./scripts/requirements/jupyter.in"
hyperfine --warmup 5 "./uv-main pip compile -q ./scripts/requirements/airflow.in" "./uv-branch pip compile -q ./scripts/requirements/airflow.in"
hyperfine --warmup 5 "./uv-main pip compile -q ./scripts/requirements/boto3.in" "./uv-branch pip compile -q ./scripts/requirements/boto3.in"
```
```
Benchmark 1: ./uv-main pip compile -q ./scripts/requirements/jupyter.in
Time (mean ± σ): 18.2 ms ± 1.6 ms [User: 14.4 ms, System: 26.0 ms]
Range (min … max): 15.8 ms … 22.5 ms 181 runs
Benchmark 2: ./uv-branch pip compile -q ./scripts/requirements/jupyter.in
Time (mean ± σ): 17.8 ms ± 1.4 ms [User: 14.4 ms, System: 25.3 ms]
Range (min … max): 15.4 ms … 23.1 ms 159 runs
Summary
./uv-branch pip compile -q ./scripts/requirements/jupyter.in ran
1.02 ± 0.12 times faster than ./uv-main pip compile -q ./scripts/requirements/jupyter.in
```
```
Benchmark 1: ./uv-main pip compile -q ./scripts/requirements/airflow.in
Time (mean ± σ): 153.7 ms ± 3.5 ms [User: 165.2 ms, System: 157.6 ms]
Range (min … max): 150.4 ms … 163.0 ms 19 runs
Benchmark 2: ./uv-branch pip compile -q ./scripts/requirements/airflow.in
Time (mean ± σ): 123.9 ms ± 4.6 ms [User: 152.4 ms, System: 133.8 ms]
Range (min … max): 118.4 ms … 138.1 ms 24 runs
Summary
./uv-branch pip compile -q ./scripts/requirements/airflow.in ran
1.24 ± 0.05 times faster than ./uv-main pip compile -q ./scripts/requirements/airflow.in
```
```
Benchmark 1: ./uv-main pip compile -q ./scripts/requirements/boto3.in
Time (mean ± σ): 327.0 ms ± 3.8 ms [User: 344.5 ms, System: 71.6 ms]
Range (min … max): 322.7 ms … 334.6 ms 10 runs
Benchmark 2: ./uv-branch pip compile -q ./scripts/requirements/boto3.in
Time (mean ± σ): 311.2 ms ± 3.1 ms [User: 339.3 ms, System: 63.1 ms]
Range (min … max): 307.8 ms … 317.0 ms 10 runs
Summary
./uv-branch pip compile -q ./scripts/requirements/boto3.in ran
1.05 ± 0.02 times faster than ./uv-main pip compile -q ./scripts/requirements/boto3.in
```
This is bare-bones support for editables in `uv sync` as basis for
workspace support, notably without lockfile integration. It leverages
the existing `ResolvedEditables` infrastructure.
## Summary
This PR falls back to writing an unnamed requirement if it appears to be
a relative URL. pip is way more flexible when providing an unnamed
requirement than when providing a PEP 508 requirement. For example,
_only_ this works:
```
black @ file:///Users/crmarsh/workspace/uv/scripts/packages/black_editable
```
Any other form will fail.
Meanwhile, _all_ of these work:
```
file:///Users/crmarsh/workspace/uv/scripts/packages/black_editable
scripts/packages/black_editable
./scripts/packages/black_editable
file:./scripts/packages/black_editable
file:scripts/packages/black_editable
```
Closes https://github.com/astral-sh/uv/issues/3180.
Since we're adding a `Option<MarkerTree>` to `PubGrubPackage`, and since
we just make `PubGrubPackage` implement `Ord`, it follows that we want
`MarkerTree` to also implement `Ord`.
This makes use of the newly added `Ord` impl on `PubGrubPackage` to make
the output of `format_terms` independent of hashmap iteration order.
This was already collecting the terms into an intermediate `Vec`, so
sorting probably isn't going to add any significant overhead here.
(Plus, this is only running when formatting an error message after a
solution could not be found, so an extra sort doesn't seem like a big
deal here.)
Note that some tests are updated in this commit as a result of this
change. As far as I can tell, the semantic meaning of the output remains
the same. But the order of the listed packages does not.
The specific thing motivating this change is that, in a subsequent change, I added
`Option<MarkerTree>` to `PubGrubPackage::Package`, and this caused
similar changes in test output. So I backtracked and isolated this
change from the addition of `Option<MarkerTree>`.
It turns out that we use PubGrubPackage as the key in hashmaps in a fair
few places. And when we iterate over hashmaps, the order is unspecified.
This can in turn result in changes in output as a result of changes in
the PubGrubPackage definition, purely as a function of its changing
hash. This is confusing as there should be no semantic difference.
Thus, this is a precursor to introducing some more determinism to places
I found in the error reporting whose output depended on hashmap
iteration order.
It looks like the last vestiges of `Derivative` were removed in commit
7eaed07f6c, but the `derive(Derivative)` attribute that it rendered
superfluous wasn't removed.
This is split out from workspaces support, which needs editables in the
bluejay commands. It consists mainly of refactorings:
* Move the `editable` module one level up.
* Introduce a `BuiltEditableMetadata` type for `(LocalEditable,
Metadata23, Requirements)`.
* Add editables to `InstalledPackagesProvider` so we can use
`EmptyInstalledPackages` for them.
## Summary
If you have (e.g.) `extra-index-url` in your configuration file _and_
provide `--extra-index-url` on the command-line, we now merge the
options rather than ignoring those in the configuration file. As such,
merging the CLI and the persistent configuration is now semantically
identical to how we merge (project persistent configuration) with (user
persistent configuration).
Closes https://github.com/astral-sh/uv/issues/3541.
## Summary
The main motivation here is that the `.filename()` method that we
implement on `Url` will do URL decoding for the last segment, which we
were missing here.
The errors are a bit awkward, because in
`crates/uv-resolver/src/lock.rs`, we wrap in `failed to extract filename
from URL: {url}`, so in theory we want the underlying errors to _omit_
the URL? But sometimes they use `#[error(transparent)]`?
## Summary
Uncertain about this, but we don't actually need the full
`SourceDistFilename`, only the name and version -- and we often have
that information already (as in the lockfile routines). So by flattening
the fields onto `RegistrySourceDist`, we can avoid re-parsing for
information we already have.
Prompted by
https://github.com/astral-sh/uv/pull/3657#discussion_r1606041239
There's still some level of discomfort here, as the `tool` module needs
to import the `project` module to manage an environment. We should
probably move most of the basic operations in the `project` module root
into some sort of shared module for behind the scenes operations?
Regardless, this change should simplify that future move.
## Summary
Restore API-compatibility with pre-1.1.0 versions of the `zip` crate,
and pin the dependency to the 0.6 series, due to concerns discussed in
https://github.com/astral-sh/uv/issues/3642.
## Test Plan
```
cargo run -p uv-dev -- fetch-python
cargo test
```
## Summary
This PR adds lossless deserialization for `GitSourceDist` distributions
in the lockfile. Specifically, we now properly preserve the requested
revision, the subdirectory, and the precise Git commit SHA.
## Test Plan
`cargo test`
## Summary
This PR introduces parallelism to the resolver. Specifically, we can
perform PubGrub resolution on a separate thread, while keeping all I/O
on the tokio thread. We already have the infrastructure set up for this
with the channel and `OnceMap`, which makes this change relatively
simple. The big change needed to make this possible is removing the
lifetimes on some of the types that need to be shared between the
resolver and pubgrub thread.
A related PR, https://github.com/astral-sh/uv/pull/1163, found that
adding `yield_now` calls improved throughput. With optimal scheduling we
might be able to get away with everything on the same thread here.
However, in the ideal pipeline with perfect prefetching, the resolution
and prefetching can run completely in parallel without depending on one
another. While this would be very difficult to achieve, even with our
current prefetching pattern we see a consistent performance improvement
from parallelism.
This does also require reverting a few of the changes from
https://github.com/astral-sh/uv/pull/3413, but not all of them. The
sharing is isolated to the resolver task.
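A rough sketch of the shape of that split (plain threads and a std channel here; the real code keeps I/O on tokio and uses its own types):
```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical request the solver sends to the I/O side.
enum Request {
    FetchMetadata(String),
}

fn main() {
    let (requests, inbox) = mpsc::channel::<Request>();

    // "PubGrub thread": CPU-bound resolution, free to block on the channel.
    let solver = thread::spawn(move || {
        requests
            .send(Request::FetchMetadata("anyio".to_string()))
            .unwrap();
        // ...resolution would continue here, consuming responses.
    });

    // "I/O side": in uv this is the tokio runtime servicing fetches.
    while let Ok(Request::FetchMetadata(name)) = inbox.recv() {
        println!("fetching metadata for {name}");
    }
    solver.join().unwrap();
}
```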
## Test Plan
On smaller tasks performance is mixed with ~2% improvements/regressions
on both sides. However, on medium-large resolution tasks we see the
benefits of parallelism, with improvements anywhere from 10-50%.
```
./scripts/requirements/jupyter.in
Benchmark 1: ./target/profiling/baseline (resolve-warm)
Time (mean ± σ): 29.2 ms ± 1.8 ms [User: 20.3 ms, System: 29.8 ms]
Range (min … max): 26.4 ms … 36.0 ms 91 runs
Benchmark 2: ./target/profiling/parallel (resolve-warm)
Time (mean ± σ): 25.5 ms ± 1.0 ms [User: 19.5 ms, System: 25.5 ms]
Range (min … max): 23.6 ms … 27.8 ms 99 runs
Summary
./target/profiling/parallel (resolve-warm) ran
1.15 ± 0.08 times faster than ./target/profiling/baseline (resolve-warm)
```
```
./scripts/requirements/boto3.in
Benchmark 1: ./target/profiling/baseline (resolve-warm)
Time (mean ± σ): 487.1 ms ± 6.2 ms [User: 464.6 ms, System: 61.6 ms]
Range (min … max): 480.0 ms … 497.3 ms 10 runs
Benchmark 2: ./target/profiling/parallel (resolve-warm)
Time (mean ± σ): 430.8 ms ± 9.3 ms [User: 529.0 ms, System: 77.2 ms]
Range (min … max): 417.1 ms … 442.5 ms 10 runs
Summary
./target/profiling/parallel (resolve-warm) ran
1.13 ± 0.03 times faster than ./target/profiling/baseline (resolve-warm)
```
```
./scripts/requirements/airflow.in
Benchmark 1: ./target/profiling/baseline (resolve-warm)
Time (mean ± σ): 478.1 ms ± 18.8 ms [User: 482.6 ms, System: 205.0 ms]
Range (min … max): 454.7 ms … 508.9 ms 10 runs
Benchmark 2: ./target/profiling/parallel (resolve-warm)
Time (mean ± σ): 308.7 ms ± 11.7 ms [User: 428.5 ms, System: 209.5 ms]
Range (min … max): 287.8 ms … 323.1 ms 10 runs
Summary
./target/profiling/parallel (resolve-warm) ran
1.55 ± 0.08 times faster than ./target/profiling/baseline (resolve-warm)
```
## Summary
Fixes a small discrepancy between the pip compile outputs for
`annotation-style=split` and `annotation-style=line` commands.
### Problem
Consider the following `pyproject.toml` file.
```sh
$ cat pyproject.toml
[project]
name = "uv_test"
dynamic = ["version"]
dependencies = ["click"]
```
Running `uv pip compile` with `annotation-style=split` on uv 0.1.44
yields the following:
```sh
❯ uv pip compile pyproject.toml --annotation-style=split
Resolved 1 package in 2ms
# This file was autogenerated by uv via the following command:
# uv pip compile pyproject.toml --annotation-style=split
click==8.1.7
# via uv-test (pyproject.toml)
```
However, running `uv pip compile` with `annotation-style=line` doesn't
include source info for root-level dependencies.
```sh
❯ uv pip compile pyproject.toml --annotation-style=line
Resolved 1 package in 1ms
# This file was autogenerated by uv via the following command:
# uv pip compile pyproject.toml --annotation-style=line
click==8.1.7
```
With this PR:
```sh
❯ ../target/debug/uv pip compile --annotation-style=line pyproject.toml
Resolved 1 package in 6ms
# This file was autogenerated by uv via the following command:
# uv pip compile --annotation-style=line pyproject.toml
click==8.1.7 # via uv-test (pyproject.toml)
```
This also now matches `pip-tools` output:
```sh
❯ pip-compile --annotation-style=line pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile --annotation-style=line pyproject.toml
#
click==8.1.7 # via uv_test (pyproject.toml)
```
## Test Plan
`cargo test`
## Summary
If a user includes markers after an editable, we now ignore them (rather
than including them in the parsed URL). This matches pip's behavior. In
the future, we could further improve by respecting them, but that
_would_ be a deviation from pip.
For example, given:
```
-e ./scripts/packages/black_editable ; python_version >= "3.9" and python_ver
```
We now split at the first whitespace (just before the `;`), parse
everything before, and throw out everything after.
This logic also extends to extras. So given:
```
-e ./scripts/packages/black_editable[dev, colorama]
```
We'll now parse this as the URL
`./scripts/packages/black_editable[dev,`, and throw out ` colorama]`.
Instead, you need to do:
```
-e ./scripts/packages/black_editable[dev,colorama]
```
(I.e., remove the space.)
This _also_ matches pip's behavior. I could "fix" this but I'm unsure if
I should -- it means requirements files will be parseable by uv that
won't work with pip. Open to input. My gut reaction is that we _should_
properly support `-e ./scripts/packages/black_editable[dev, colorama]`
even if pip would reject it, but `requirements.txt` is
implementation-defined so it'd be a "deviation".
Closes https://github.com/astral-sh/uv/issues/3604.
Following from #3595, we'd like wheels to make their way into the lock
file even if the current environment selects an sdist. With #3595, this
didn't happen:
```
$ cargo run -p uv -- pip compile -p3.10 <(echo psycopg2) --unstable-uv-lock-file
$ cat uv.lock
version = 1
[[distribution]]
name = "psycopg2"
version = "2.9.9"
source = "registry+https://pypi.org/simple"
[distribution.sdist]
url = "dc6acaf46d76fce95daac5e0f0301b/psycopg2-2.9.9.tar.gz"
hash = "sha256:d1454bde93fb1e224166811694d600e746430c006fbb031ea06ecc2ea41bf156"
```
The above example uses `psycopg2`, which has an sdist and wheels only on
Windows. Since I ran the above on Linux, an sdist was selected. But no
wheels appeared in the lock file.
With this PR, wheels are now correctly plumbed through:
```
$ cargo run -p uv -- pip compile -p3.10 <(echo psycopg2) --unstable-uv-lock-file
$ cat uv.lock
version = 1
[[distribution]]
name = "psycopg2"
version = "2.9.9"
source = "registry+https://pypi.org/simple"
[distribution.sdist]
url = "dc6acaf46d76fce95daac5e0f0301b/psycopg2-2.9.9.tar.gz"
hash = "sha256:d1454bde93fb1e224166811694d600e746430c006fbb031ea06ecc2ea41bf156"
[[distribution.wheel]]
url = "2767d96391f5cde90b82cb3e8c2a12/psycopg2-2.9.9-cp310-cp310-win32.whl"
hash = "sha256:38a8dcc6856f569068b47de286b472b7c473ac7977243593a288ebce0dc89516"
[[distribution.wheel]]
url = "6572dec6831f85491a5e4dda606a98/psycopg2-2.9.9-cp310-cp310-win_amd64.whl"
hash = "sha256:426f9f29bde126913a20a96ff8ce7d73fd8a216cfb323b1f04da402d452853c3"
[[distribution.wheel]]
url = "1fc5b9d33c858a602868a592cdc1b0/psycopg2-2.9.9-cp311-cp311-win32.whl"
hash = "sha256:ade01303ccf7ae12c356a5e10911c9e1c51136003a9a1d92f7aa9d010fb98372"
[[distribution.wheel]]
url = "5133dd3183e671b278ce248810b7f7/psycopg2-2.9.9-cp311-cp311-win_amd64.whl"
hash = "sha256:121081ea2e76729acfb0673ff33755e8703d45e926e416cb59bae3a86c6a4981"
[[distribution.wheel]]
url = "f74ffe6b6fe119ccb8a6546c3fb893/psycopg2-2.9.9-cp312-cp312-win32.whl"
hash = "sha256:d735786acc7dd25815e89cc4ad529a43af779db2e25aa7c626de864127e5a024"
[[distribution.wheel]]
url = "c4a26e1918ab7ee854fb5247f16c40/psycopg2-2.9.9-cp312-cp312-win_amd64.whl"
hash = "sha256:a7653d00b732afb6fc597e29c50ad28087dcb4fbfb28e86092277a559ae4e693"
[[distribution.wheel]]
url = "ffeb9ac356ce0d6c4f2f34e396dbc0/psycopg2-2.9.9-cp37-cp37m-win32.whl"
hash = "sha256:5e0d98cade4f0e0304d7d6f25bbfbc5bd186e07b38eac65379309c4ca3193efa"
[[distribution.wheel]]
url = "0a39176d36fd7105774e57996f63cd/psycopg2-2.9.9-cp37-cp37m-win_amd64.whl"
hash = "sha256:7e2dacf8b009a1c1e843b5213a87f7c544b2b042476ed7755be813eaf4e8347a"
[[distribution.wheel]]
url = "86b90d30c4420cc3c0f6da2b8f3a9a/psycopg2-2.9.9-cp38-cp38-win32.whl"
hash = "sha256:ff432630e510709564c01dafdbe996cb552e0b9f3f065eb89bdce5bd31fabf4c"
[[distribution.wheel]]
url = "c439b378ef79997a935f10374f3c0d/psycopg2-2.9.9-cp38-cp38-win_amd64.whl"
hash = "sha256:bac58c024c9922c23550af2a581998624d6e02350f4ae9c5f0bc642c633a2d5e"
[[distribution.wheel]]
url = "5080c0e61ad5f08b9503e508aac116/psycopg2-2.9.9-cp39-cp39-win32.whl"
hash = "sha256:c92811b2d4c9b6ea0285942b2e7cac98a59e166d59c588fe5cfe1eda58e72d59"
[[distribution.wheel]]
url = "ec73fe66d4d65f5bbe54efb191d9e6/psycopg2-2.9.9-cp39-cp39-win_amd64.whl"
hash = "sha256:de80739447af31525feddeb8effd640782cf5998e1a4e9192ebdf829717e3913"
```
Ref #3351
Our current flow of data from "simple registry package" to "final
resolved distribution" goes through a number of types:
* `SimpleMetadata` is the API response from a registry that includes all
published versions for a package. Each version has an assortment of
metadata
associated with it.
* `VersionFiles` is the aforementioned metadata. It is split in two: a
group of files for source distributions and a group of files for wheels.
* `PrioritizedDist` collects a subset of the files from `VersionFiles`
to form a selection of the "best" sdist and the "best" wheel for the
current environment.
* `CompatibleDist` is created from a borrowed `PrioritizedDist` that,
perhaps among other things, encapsulates the decision of whether to pick
an sdist or a wheel. (This decision depends both on compatibility and
the action being performed. e.g., When doing installation, a
`CompatibleDist` will sometimes select an sdist over a wheel.)
* `ResolvedDistRef` is like a `ResolvedDist`, but borrows a `Dist`.
* `ResolvedDist` is the almost-final-form of a distribution in a
resolution and is created from a `ResolvedDistRef`.
* `AnnotatedResolvedDist` is a new data type that is the actual final
form of a distribution that a universal lock file cares about. It
bundles a `ResolvedDist` with some metadata needed to generate a lock
file.
One of the requirements of a universal lock file is that we include all
wheels (and maybe all source distributions? but at least one if it's
present) associated with a distribution. But the above flow of data (in
the step from `VersionFiles` to `PrioritizedDist`) drops all wheels
except for the best one.
To remedy this, in this PR, we rejigger `PrioritizedDist`,
`CompatibleDist` and `ResolvedDistRef` so that all wheel data is
preserved. And when a `ResolvedDistRef` is finally turned into a
`ResolvedDist`, we copy all of the wheel data. And finally, we adjust
the `Lock` constructor to read this new data and include it in the lock
file. To make this work, we also modify `RegistryBuiltDist` so that it
can contain one or more wheels instead of just one.
One shortcoming here (called out in the code as a FIXME) is that if a
source distribution is selected as the "best" thing to use (perhaps
there are no compatible wheels), then the wheels won't end up in the
lock file. I plan to fix this in a follow-up PR.
We also aren't totally consistent on source distribution naming.
Sometimes we use `sdist`. Sometimes `source`. Sometimes `source_dist`.
I think it'd be nice to just use `sdist` everywhere, but I do prefer
the type names to be `SourceDist`. And sometimes you want function
names to match the type names (i.e., `from_source_dist`), which in turn
leads to an appearance of inconsistency. I'm open to ideas.
Closes #3351
## Summary
In `ResolutionGraph::from_state`, we have mechanisms to grab the hashes
and metadata for all distributions -- but we then throw that information
away. This PR preserves it on a new `AnnotatedDist` (yikes, open to
suggestions) that wraps `ResolvedDist` and includes (1) the hashes
(computed or from the registry) and (2) the `Metadata23`, which lets us
extract the version.
Closes https://github.com/astral-sh/uv/issues/3356.
Closes https://github.com/astral-sh/uv/issues/3357.
## Summary
Splits this into two loops that each handle independent cases, to make
the code a little easier to reason about. No behavioral or logic changes
-- just splitting the `match` across two loops.
## Summary
Closes
https://github.com/astral-sh/uv/issues/3578#issuecomment-2110675382.
## Test Plan
Verified that in the OpenSUSE test, we create both, and they're
symlinks:
```text
INFO: Creating virtual environment with `venv`...
INFO: Installing into `venv` virtual environment...
DEBUG Found a virtualenv named .venv at: /tmp/tmp4nape29h/.venv
DEBUG Cached interpreter info for Python 3.10.14, skipping probing: .venv/bin/python
DEBUG Using Python 3.10.14 environment at .venv/bin/python
DEBUG Trying to lock if free: .venv/.lock
purelib: "/tmp/tmp4nape29h/.venv/lib/python3.10/site-packages"
platlib: "/tmp/tmp4nape29h/.venv/lib64/python3.10/site-packages"
is_same_file(purelib, platlib): Ok(true)
```
## Summary
Increment the removed file counts in filters
in install_registry_source_dist_cached test, to make it work again on
Gentoo. The tested counts were updated
in 9a92a3ad37, but the filters were not.
That said, the respective count increased in Gentoo as well, so adjust
both input and output strings. I'm updating Windows as guesswork,
though I suspect that filter may not be necessary anymore, given that CI
was passing.
## Test Plan
`cargo test` on Gentoo :-).
## Summary
Fixes a typo in a comment
## Test Plan
I assume there's no need to test comment changes, other than having a
human check they make sense. That's what this PR is for 😉
## Summary
Uses the editable handling from `pip sync`, and improves the
abstractions such that we can pass those resolved editables into the
resolver.
---------
Co-authored-by: konstin <konstin@mailbox.org>
## Summary
It's confusing that we use `constraints` here because constraints mean
something else for us (e.g., `--constraint constraints.txt`). These are
really the dependencies of a given `PubGrubPackage` -- the type is even
called `PubGrubDependencies`.
Windows does not support cloning whole directories so clone each file
instead.
Closes #3547
## Test Plan
Ran `uv pip install setuptools --link-mode=clone` manually
## Summary
I don't love this, but it turns out that setuptools is not robust to
parallel builds: https://github.com/pypa/setuptools/issues/3119. As a
result, if you run uv from multiple processes, and they each attempt to
build the same source distribution, you can hit failures.
This PR applies an advisory lock to the source distribution directory.
We apply it unconditionally, even if we ultimately find something in the
cache and _don't_ do a build, which helps ensure that we only build the
distribution once (and wait for that build to complete) rather than
kicking off builds from each thread.
Closes https://github.com/astral-sh/uv/issues/3512.
## Test Plan
Ran:
```sh
#!/bin/bash
make_venv(){
target/debug/uv venv $1
source $1/bin/activate
target/debug/uv pip install opentracing --no-deps --verbose
}
for i in {1..8}
do
make_venv ./$1/$i &
done
```
## Summary
I think this is overall good change because it explicitly encodes (in
the type system) something that was previously implicit. I'm not a huge
fan of the names here, open to input.
It covers some of https://github.com/astral-sh/uv/issues/3506 but I
don't think it _closes_ it.
## Summary
Just fix typos.
While `alpha-numeric` is not really a misspelling:
- it is missing from mainstream curated dictionaries, all of them
suggest `alphanumeric`;
- it is less used than `alphanumeric` (more than ⨉10 less) according to
the Google [Ngram
Viewer](https://books.google.com/ngrams/graph?content=alpha-numeric%2Calphanumeric&year_start=1900&year_end=2019&corpus=en-2019);
- it is [missing from
SCOWL](http://app.aspell.net/lookup?dict=en_US-large;words=alpha-numeric).
## Test Plan
CI jobs.
## Summary
runpy.run_path was added in Python 2.7 and 3.2, and every Python that
is not EOL supports it.
It is arguably nicer to read and the path is only given once in the
command.
At least right now, runpy - unlike exec with S102 - is not flagged by
any bandit-derived ruff check.
(I guess because it loads from a file instead of a simple string...)
Because of the import, it is also not a one-liner anymore. (But that
could be fixed with an __import__('runpy').run_path...)
## Test Plan
```
import runpy
runpy.run_path('/path/to/venv/bin/activate_this.py')
```
## Summary
If you run the script included in the linked issue, then `uv cache
clean`, we hit permissions errors on certain directories created by
`setuptools`. The permissions on those directories look like:
```
❯ sudo ls -l /Users/crmarsh/Library/Caches/uv/built-wheels-v3/pypi/opentracing/2.4.0/M-fYsaHAaQQvedmPMUl9D/opentracing-2.4.0.tar.gz/build/bdist.macosx-14.2-arm64/wheel/opentracing
Password:
total 0
drwxr-xr-x 3 crmarsh staff 96 May 11 12:51 harness
```
This PR adds logic to make those directories readable by the current
user.
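A hedged, Unix-only sketch of that kind of fix (a recursive walk that bumps directory permissions; not the actual cache code):
```rust
use std::fs;
use std::io;
use std::os::unix::fs::PermissionsExt;
use std::path::Path;

// Ensure the owner can read and traverse every directory under `path`.
fn make_readable(path: &Path) -> io::Result<()> {
    for entry in fs::read_dir(path)? {
        let entry = entry?;
        let metadata = entry.metadata()?;
        if metadata.is_dir() {
            let mut permissions = metadata.permissions();
            permissions.set_mode(permissions.mode() | 0o700);
            fs::set_permissions(entry.path(), permissions)?;
            make_readable(&entry.path())?;
        }
    }
    Ok(())
}

fn main() -> io::Result<()> {
    make_readable(Path::new("."))
}
```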
Closes https://github.com/astral-sh/uv/issues/3515.
## Summary
pip passes these as positional arguments, and at least one build backend
relies on that. My personal opinion is that it's a spec violation, and
the build backend should be updated, but I'd prefer to favor
compatibility over strictness here.
Closes https://github.com/astral-sh/uv/issues/3509.
## Test Plan
`cargo run pip install cryptacular==1.6.2`
## Summary
This PR consolidates the concurrency limits used throughout `uv` and
exposes two limits, `UV_CONCURRENT_DOWNLOADS` and
`UV_CONCURRENT_BUILDS`, as environment variables.
Currently, `uv` has a number of concurrent streams that it buffers using
relatively arbitrary limits for backpressure. However, many of these
limits are conflated. We run a relatively small number of tasks overall
and should start most things as soon as possible. What we really want to
limit are three separate operations:
- File I/O. This is managed by tokio's blocking pool and we should not
really have to worry about it.
- Network I/O.
- Python build processes.
Because the current limits span a broad range of tasks, it's possible
that a limit meant for network I/O is occupied by tasks performing
builds, reading from the file system, or even waiting on a `OnceMap`. We
also don't limit build processes that end up being required to perform a
download. While this may not pose a performance problem because our
limits are relatively high, it does mean that the limits do not do what
we want, making it tricky to expose them to users
(https://github.com/astral-sh/uv/issues/1205,
https://github.com/astral-sh/uv/issues/3311).
After this change, the limits on network I/O and build processes are
centralized and managed by semaphores. All other tasks are unbuffered
(note that these tasks are still bounded, so backpressure should not be
a problem).
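A small sketch of how the limits might be read (the defaults here are made up; in an async codebase the values would then size something like `tokio::sync::Semaphore`):
```rust
use std::env;

fn concurrency_limit(var: &str, default: usize) -> usize {
    env::var(var)
        .ok()
        .and_then(|value| value.parse().ok())
        .filter(|&n| n > 0)
        .unwrap_or(default)
}

fn main() {
    // Variable names come from the description above; defaults are illustrative.
    let downloads = concurrency_limit("UV_CONCURRENT_DOWNLOADS", 50);
    let builds = concurrency_limit("UV_CONCURRENT_BUILDS", 4);
    println!("network permits: {downloads}, build permits: {builds}");
}
```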
This only makes hashes optional for wheels/sdists that come from
registries or direct URLs. For wheels/sdists that come from other
sources, a hash should not be present.
For path dependencies, a hash should not be present because the state of
the path dependency is not intended to be tracked in the lock file. This
is consistent with how other tools deal with path dependencies, and if
it were otherwise, the hash would, I believe, need to be updated for every
change to the path dependency.
For git dependencies (source dists only), a hash should not be present
because the lock will contain the specific commit revision hash. This is
functionally equivalent to a hash, and so a hash is redundant.
As part of this change, we validate the presence or absence of a hash
based on the dependency source. We also add our first regression tests.
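A sketch of the validation rule (the enum is a stand-in, not uv's source type):
```rust
enum Source {
    Registry,
    DirectUrl,
    Path,
    Git,
}

fn hash_expected(source: &Source) -> bool {
    match source {
        // Registry and direct-URL distributions may carry a content hash.
        Source::Registry | Source::DirectUrl => true,
        // Path deps aren't tracked by content; git deps are pinned by commit.
        Source::Path | Source::Git => false,
    }
}

fn main() {
    assert!(hash_expected(&Source::Registry));
    assert!(!hash_expected(&Source::Git));
}
```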
## Summary
Likely necessary to resolve https://github.com/astral-sh/uv/issues/2500.
Made this a separate PR in an attempt to make the changes as small as
possible; let me know if it's preferred to keep them as a single PR.
## Test Plan
- edited the test in `interpreter.rs`
- tested manually via `println!`
```
$ cargo run --quiet pip show test
["/Users/chankang/Library/Caches/uv/.tmpKzNEPN", "/Users/chankang/.pyenv/versions/3.12.2/lib/python312.zip", "/Users/chankang/.pyenv/versions/3.12.2/lib/python3.12", "/Users/chankang/.pyenv/versions/3.12.2/lib/python3.12/lib-dynload", "/Users/chankang/repos/uv/.venv/lib/python3.12/site-packages"]
warning: Package(s) not found for: test
chankang@chans-Air ~/repos/uv - (syspath)
$ git diff
diff --git a/crates/uv-interpreter/src/environment.rs b/crates/uv-interpreter/src/environment.rs
index 33b785ce..8ebf0864 100644
--- a/crates/uv-interpreter/src/environment.rs
+++ b/crates/uv-interpreter/src/environment.rs
@@ -106,6 +106,7 @@ impl PythonEnvironment {
/// Some distributions also create symbolic links from `purelib` to `platlib`; in such cases, we
/// still deduplicate the entries, returning a single path.
pub fn site_packages(&self) -> impl Iterator<Item = &Path> {
+ println!("{:?}", self.interpreter.sys_path());
if let Some(target) = self.interpreter.target() {
Either::Left(std::iter::once(target.root()))
} else {
chankang@chans-Air ~/repos/uv - (syspath)
$ python -c "import sys; print(sys.path)"
['', '/Users/chankang/.pyenv/versions/3.12.2/lib/python312.zip', '/Users/chankang/.pyenv/versions/3.12.2/lib/python3.12', '/Users/chankang/.pyenv/versions/3.12.2/lib/python3.12/lib-dynload', '/Users/chankang/.pyenv/versions/3.12.2/lib/python3.12/site-packages']
chankang@chans-Air ~/repos/uv - (syspath)
```
This still keeps the resolver state on the stack, but it organizes it
into a more structured representation. This is a precursor to
implementing resolver forking, where we will ultimately put this state
on the heap. The idea is that this will let us maintain multiple
independent resolver states that will all produce their own resolution
(and potentially other forked states).
Closes #3354
## Summary
I've started to refer to this as the "project" API in various places, it
seems less duplicative than the "workspace" API which is a little
different.
Now that the type is fully encapsulated, we can pretty easily
migrate to using an Arc inside of a MarkerEnvironment.
It looks like the pyo3 macros can't deal with an Arc, so we
write out the getter methods by hand.
We now use the getters and setters everywhere.
There were some places where we wanted to build a `MarkerEnvironment`
out of whole cloth, usually in tests. To facilitate those use cases, we
add a `MarkerEnvironmentBuilder` that provides a convenient constructor.
It's basically like a `MarkerEnvironment::new`, but with named
parameters. That's useful here because there are so many fields (and
many of them have the same type).
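A toy version of that builder idea (the real field set is much larger; this only shows the shape):
```rust
#[derive(Debug, Default)]
struct MarkerEnvironmentBuilder {
    python_full_version: String,
    sys_platform: String,
}

#[derive(Debug)]
struct MarkerEnvironment {
    python_full_version: String,
    sys_platform: String,
}

impl MarkerEnvironmentBuilder {
    fn python_full_version(mut self, value: &str) -> Self {
        self.python_full_version = value.to_string();
        self
    }
    fn sys_platform(mut self, value: &str) -> Self {
        self.sys_platform = value.to_string();
        self
    }
    fn build(self) -> MarkerEnvironment {
        MarkerEnvironment {
            python_full_version: self.python_full_version,
            sys_platform: self.sys_platform,
        }
    }
}

fn main() {
    // Named setters avoid a long positional constructor.
    let env = MarkerEnvironmentBuilder::default()
        .python_full_version("3.12.3")
        .sys_platform("linux")
        .build();
    println!("{env:?}");
}
```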
This test was failing on master. I guess we don't test
this crate with the pyo3 feature enabled? I think this
regression was due to a recent change in the error reporting
of the pep440 crate.
## Summary
Ensures that we track the origins for requirements regardless of whether
they come from `pyproject.toml` or `setup.py` or `setup.cfg`.
Closes #3480.
This commit touches a lot of code, but the conceptual change here is
pretty simple: make it so we can run the resolver without providing a
`MarkerEnvironment`. This also indicates that the resolver should run in
universal mode. That is, the effect of a missing marker environment is
that all marker expressions that reference the marker environment are
evaluated to `true`. That is, they are ignored. (The only markers we
evaluate in that context are extras, which are the only markers that
aren't dependent on the environment.)
One interesting change here is that a `Resolver` no longer needs an
`Interpreter`. Previously, it had only been using it to construct a
`PythonRequirement`, by filling in the installed version from the
`Interpreter` state. But we now construct a `PythonRequirement`
explicitly since its `target` Python version should no longer be tied to
the `MarkerEnvironment`. (Currently, the marker environment is mutated
such that its `python_full_version` is derived from multiple sources,
including the CLI, which I found a touch confusing.)
The change in behavior can now be observed through the
`--unstable-uv-lock-file` flag. First, without it:
```
$ cat requirements.in
anyio>=4.3.0 ; sys_platform == "linux"
anyio<4 ; sys_platform == "darwin"
$ cargo run -qp uv -- pip compile -p3.10 requirements.in
anyio==4.3.0
exceptiongroup==1.2.1
# via anyio
idna==3.7
# via anyio
sniffio==1.3.1
# via anyio
typing-extensions==4.11.0
# via anyio
```
And now with it:
```
$ cargo run -qp uv -- pip compile -p3.10 requirements.in --unstable-uv-lock-file
x No solution found when resolving dependencies:
`-> Because you require anyio>=4.3.0 and anyio<4, we can conclude that the requirements are unsatisfiable.
```
This is expected at this point because the marker expressions are being
explicitly ignored, *and* there is no forking done yet to account for
the conflict.
We provide a new API on a `Requirement` that specifically
ignores the marker environment and only evaluates a requirement's
marker expression with respect to extras. Any marker expressions
that reference the marker environment automatically evaluate to
true.
Instead of duplicating the evaluation code, we just make a marker
environment optional on the lower level APIs. In theory, we could
just write a separate evaluation routine that ignores everything
except extras, but the evaluator has a fair bit of other stuff in it
(such as emitting warnings) that would be good to keep DRY IMO.
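A simplified sketch of that rule (stand-in types): with no marker environment, environment-dependent expressions evaluate to true, while extras are still checked.
```rust
enum Marker {
    SysPlatform(String),
    Extra(String),
}

struct MarkerEnvironment {
    sys_platform: String,
}

fn evaluate(marker: &Marker, env: Option<&MarkerEnvironment>, extras: &[String]) -> bool {
    match marker {
        // Extras don't depend on the environment, so always evaluate them.
        Marker::Extra(extra) => extras.contains(extra),
        // Environment markers are treated as true when no environment is given.
        Marker::SysPlatform(platform) => {
            env.map_or(true, |env| env.sys_platform == *platform)
        }
    }
}

fn main() {
    let marker = Marker::SysPlatform("linux".to_string());
    assert!(evaluate(&marker, None, &[])); // universal resolution ignores it
    let env = MarkerEnvironment { sys_platform: "darwin".to_string() };
    assert!(!evaluate(&marker, Some(&env), &[]));
}
```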
This doc test seems to fail due to the recent change making
`Requirement` generic on its URL type. While the type parameter
was given a default of `VerbatimUrl`, this default doesn't always
apply. For example, the `FromStr` impl on `Requirement` is still
generic on any URL type, and so callers must indicate the type
of the URL to return. (An alternative would be to define the
`FromStr` impl for just the default URL type.)
## Summary
If a requirement is omitted due to a marker expression, we shouldn't
include it as the "source" of a package in the output.
For example, if your constraints include `pathspec ; python_version <
'3.12'`, and you're on Python 3.12, we should _not_ include the
constraint file as a "source" in the output annotations.
## Summary
Unfortunately, the `-I` flag was added in Python 3.4. So if we query a
Python version prior to 3.4 (e.g., Python 2.7), we can't run our script
at all, and lose the ability to match against our structured error.
This PR adds an additional check against the stderr output for these
cases.
Closes https://github.com/astral-sh/uv/issues/3474.
## Test Plan
Installed Python 2.7, and verified that it was skipped (and that we
instead found my `python3`).
## Summary
Fixes https://github.com/astral-sh/uv/issues/1343. This is kinda a first
draft at the moment, but does at least mostly work locally (barring some
bits of the test suite that seem to not work for me in general).
## Test Plan
Mostly running the existing tests and checking the revised output is
sane
## Outstanding issues
Most of these come down to "AFAIK, the existing tools don't support
these patterns, but `uv` does" and so I'm not sure there's an existing
good answer here! Most of the answers so far are "whatever was easiest
to build"
- [x] ~~Is "-r pyproject.toml" correct? Should it show something else or
get skipped entirely~~ No it wasn't. Fixed in
3044fa8b86
- [ ] If the requirements file is stdin, that just gets skipped. Should
it be recorded?
- [ ] Overrides get shown as "--override<override.txt>". Correct?
- [x] ~~Some of the tests (e.g.
`dependency_excludes_non_contiguous_range_of_compatible_versions`) make
assumptions about the order of package versions being outputted, which
this PR breaks. I'm not sure if the text is fairly arbitrary and can be
replaced or whether the behaviour needs fixing?~~ - fixed by removing
the custom pubgrub PartialEq/Hash
- [ ] Are all the `TrackedFromStr` et al changes needed, or is there an
easier way? I don't think so, I think it's necessary to track these sort
of things fairly comprehensively to make this feature work, and this
sort of invasive change feels necessary, but happy to be proved wrong
there :)
- [x] ~~If you have a requirement coming in from two or more different
requirements files only one turns up. I've got a closed-source example
for this (can go into more detail if needed), mostly consisting of a
complicated set of common deps creating a larger set. It's a rarer case,
but worth considering.~~ 042432b200
- [ ] Doesn't add annotations for `setup.py` yet
- This is pretty hard, as the correct location to insert the path is
`crates/pypi-types/src/metadata.rs`'s `parse_pkg_info`, which as it's
based off a source distribution has entirely thrown away such matters as
"where did this package requirement get built from". Could add "`built
package name`" as a dep, but that's a little odd.
## Summary
This PR takes a different approach to `--with` for `uv run`. Now,
instead of merging the requirements and re-resolving, we have two
phases: (1) sync the workspace requirements to the workspace
environment; then (2) sync the ephemeral `--with` requirements to an
ephemeral environment. The two environments are then layered by setting
the `PATH` and `PYTHONPATH` variables appropriately.
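A rough sketch of the layering (the paths are made up; the real code computes them from the two environments):
```rust
use std::env;
use std::process::Command;

fn main() -> std::io::Result<()> {
    // Hypothetical locations for the two environments.
    let workspace_bin = "/project/.venv/bin";
    let ephemeral_bin = "/tmp/uv-ephemeral/bin";
    let ephemeral_site = "/tmp/uv-ephemeral/lib/python3.12/site-packages";

    // Ephemeral env first on PATH, workspace env next, then the inherited PATH.
    let path = format!(
        "{ephemeral_bin}:{workspace_bin}:{}",
        env::var("PATH").unwrap_or_default()
    );

    let status = Command::new("python3")
        .args(["-c", "import sys; print(sys.executable)"])
        .env("PATH", path)
        .env("PYTHONPATH", ephemeral_site)
        .status()?;
    println!("exited with {status}");
    Ok(())
}
```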
I think this approach simplifies a few things:
1. Once we have a lockfile, the semantics are much clearer, and we can
actually reuse it for the workspace. If we had to add arbitrary
dependencies via `--with`, then it's not really clear how the lockfile
would/should behave.
2. Caching becomes simpler, because we can just cache the ephemeral
environment based on the requirements.
The current version of this PR loses a few behaviors though that I need
to restore:
- `--python` support -- but I'm not yet sure how this is supposed to
behave within projects? It's also left unclear in `uv sync` and `uv
lock`.
- The "reuse the workspace environment if it already satisfies the
ephemeral requirements" behavior.
Closes #3411.
## Summary
This is the universal environment variable used to determine the macOS
deployment target. We now respect it in `--python-platform` -- so we
default to 12.0, but users can override it as needed.
## Summary
We already _don't_ discover a `pyproject.toml` in `~/.config/uv` -- it
must be `uv.toml`. This PR makes the same change for `--config-file` --
it _has_ to be a `uv.toml`.
I think this is reasonable and more consistent, though I'm not sure. A
`pyproject.toml` "means" something -- it defines a project itself, in
which case we should be using project configuration. But creating a
`pyproject.toml` outside the project and passing it via `--config-file`
seems like an anti-pattern.
## Summary
This PR follows Cargo's strategy for merging configuration, albeit in a
more limited way (we don't support as many configuration locations).
Specifically, we merge the user configuration with the workspace
configuration if both are present. The workspace configuration has
priority, such that we take values from the workspace configuration and
ignore those in the user configuration if both are specified for a given
setting -- with the exception of arrays and maps, which are
concatenated.
For now, if a user provides a configuration file with `--config-file`,
we _don't_ merge in the user settings.
See:
https://doc.rust-lang.org/cargo/reference/config.html#hierarchical-structure.
Closes #3420.
## Summary
This is annoying both locally and in CI. If anyone wants to fuss with the
filters to fix it, that's fine too, but IMO it's better to disable than
leave it enabled on macOS for now.
When using `tool.uv.sources`, we warn when requirements are missing a
bound, i.e. at least a lower version constraint.
When using a library, the symbols you import were introduced in
different versions, creating an implicit lower bound. This warning makes
this explicit. This is crucial to prevent backtracking resolvers from
selecting an ancient version that is not compatible (or worse, doesn't
build), and it's a performance optimization on top.
This feature is gated to `tool.uv.sources` (as it should have been to
begin with for #3263/#3443) to not unnecessarily break legacy workflows.
It is also helpful specifically when using a `tool.uv.sources` section
that contains constraints that are not published to pypi, e.g. for
workspace dependencies. We can adjust those later to e.g. not constrain
workspace dependencies with `publish = false`, but I think it's the
right setting to start with.
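A toy version of the check (string matching stands in for real PEP 440 specifier parsing):
```rust
fn has_lower_bound(specifiers: &[&str]) -> bool {
    specifiers.iter().any(|spec| {
        // `==`, `~=`, `>=`, and `>` all bound the version from below.
        spec.starts_with("==") || spec.starts_with("~=") || spec.starts_with(">")
    })
}

fn main() {
    assert!(has_lower_bound(&[">=1.2", "<2"]));
    if !has_lower_bound(&["<2.0"]) {
        eprintln!("warning: requirement has no lower bound");
    }
}
```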
## Summary
These aren't intended for production use; instead, I'm just trying to
frame out the overall data flows and code-sharing for these commands. We
now have `uv sync` (sync the environment to match the lockfile, without
refreshing or resolving) and `uv lock` (generate the lockfile). Both
_require_ a virtual environment to exist (something we should change).
`uv sync`, `uv run`, and `uv lock` all share code for the underlying
subroutines (resolution and installation), so the commands themselves
are relatively small (~100 lines) and mostly consist of reading
arguments and such.
`uv lock` and `uv sync` don't actually really work yet, because we have
no way to include the project itself in the lockfile (that's a TODO in
the lockfile implementation).
Closes https://github.com/astral-sh/uv/issues/3432.
We would previously show the parsed version when erroring due to
trailing content after a valid version, which can look different than
the input. E.g. when encountering `0.1-bulbasaur`, we would display:
```
after parsing '0.1b0', found 'ulbasaur', which is not part of a valid version
```
By storing the input string instead of the parsed version, we now show:
```
after parsing '0.1-b', found 'ulbasaur', which is not part of a valid version
```
It turns out setuptools often uses Metadata-Version 2.1 in their
PKG-INFO:
4e766834d7/setuptools/dist.py (L64)
`Metadata23` requires Metadata-Version of at least 2.2.
This means that uv doesn't actually recognise legacy editable
installations from the most common way you'd actually get legacy
editable installations (works great for most legacy editables I make at
work though!)
Anyway, over here we only need the version and don't care about anything
else. Rather than make a `Metadata21`, I just add a version field to
`Metadata10`. The one slightly tricky thing is that only
Metadata-Version 1.2 and greater guarantee that the [version number is
PEP 440 compatible](https://peps.python.org/pep-0345/#version), so I
store the version in `Metadata10` as a `String` and only parse to
`Version` at time of use.
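Roughly, the shape of the change looks like this (field names are
illustrative): the raw string is only turned into a PEP 440 `Version` at the
point of use, so pre-PEP-440 values remain representable:
```rust
struct Metadata10 {
    name: String,
    // Metadata-Version < 1.2 doesn't guarantee PEP 440 syntax, so keep the
    // raw string and let callers parse (and handle failure) when needed.
    version: Option<String>,
}

fn main() {
    let metadata = Metadata10 {
        name: "example-package".to_string(),
        version: Some("0.1-bulbasaur".to_string()), // not PEP 440, still representable
    };
    if let Some(raw) = &metadata.version {
        // A caller needing a structured version would attempt the parse here.
        println!("{} {}", metadata.name, raw);
    }
}
```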
Also did you know that back in 2004, paramiko had a pokemon based
versioning system?
Pubgrub got a new feature where all unavailability is expressed through a
custom type, instead of the reasonless `UnavailableDependencies` and our
custom `String` type previously
(https://github.com/pubgrub-rs/pubgrub/pull/208). This PR introduces an
`UnavailableReason` that tracks either an entire package being unusable, or a
specific version being unusable. The error messages now also track this
difference properly.
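Schematically (names and payloads simplified, not the exact uv types), the
new reason type distinguishes the two cases so the error messages can too:
```rust
enum UnavailableReason {
    /// Every version of the package is unusable (e.g. not found in the index).
    Package(String),
    /// Only one specific version is unusable (e.g. its distribution failed to build).
    Version(String),
}

fn describe(reason: &UnavailableReason) -> String {
    match reason {
        UnavailableReason::Package(why) => format!("all versions are unavailable: {why}"),
        UnavailableReason::Version(why) => format!("this version is unavailable: {why}"),
    }
}

fn main() {
    println!("{}", describe(&UnavailableReason::Package("not found in the index".into())));
    println!("{}", describe(&UnavailableReason::Version("failed to build".into())));
}
```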
The pubgrub commit is our main rebased onto the merged
https://github.com/pubgrub-rs/pubgrub/pull/208; I'll push
`konsti/main-rebase-generic-reason` to `main` after checking for rebase
problems.
## Summary
It's not clear to me that this should exist at all, but it's causing
errors in projects that don't use `tool.uv.sources`, so we should
definitely remove it for now.
## Summary
We already have a global `--isolated`, which means "ignore any on-disk
configuration". I think we should reuse this for the "ignore the
workspace" setting in `uv run`, rather than `--no-workspace`.
I've also merged the existing `--isolated` and `--no-workspace`
behaviors in `uv run` into a single flag. We may not need separate flags
for this, since the current intent seems to be "ignore the workspace
environment"? Though we could always re-add later.
Closes https://github.com/astral-sh/uv/issues/3421.
Resolves [#3419](https://github.com/astral-sh/uv/issues/3419)
## Summary
Add compat args to the `pip install` command and hint the user to create a
venv when the `--user` arg is provided.
## Test Plan
Tested it locally.
```bash
cargo run pip install --user flask
Compiling uv v0.1.39 (/home/ahmedilyas/uv/crates/uv)
Finished `dev` profile [unoptimized + debuginfo] target(s) in 8.96s
Running `target/debug/uv pip install --user flask`
error: pip install's `--user` is unsupported (use a virtual environment instead).
```
This change allows switching out the url type for requirements. The
original idea was to allow different types for different requirement
origins, so that core metadata reads can ban non-pep 508 requirements
while we only allow them for requirements.txt. This didn't work out
because we expect `&Requirement`s from all sources to match.
I also tried to split `pep508_rs` into a PEP 508 compliant crate and
into our extensions, but they are too tightly coupled.
I think this change is an improvement still as it reduces the hardcoded
dependence on `VerbatimUrl`.
We now correctly emit relative paths in `uv pip compile` with
`tool.uv.sources` path inputs.
`tool.uv.sources` is mainly intended to be used with the uv lock file
over requirements.txt, but it's good to have basic `uv pip` support
working.
Fixes #3366
## Summary
All of the resolver code is run on the main thread, so a lot of the
`Send` bounds and uses of `DashMap` and `Arc` are unnecessary. We could
also switch to using single-threaded versions of `Mutex` and `Notify` in
some places, but there isn't really a crate that provides those I would
be comfortable with using.
The `Arc` in `OnceMap` can't easily be removed because of the uv-auth
code which uses the
[reqwest-middleware](https://docs.rs/reqwest-middleware/latest/reqwest_middleware/trait.Middleware.html)
crate, which seems to add unnecessary `Send` bounds because of
`async-trait`. We could duplicate the code and create a `OnceMapLocal`
variant, but I don't feel that's worth it.
## Summary
Users often find themselves dropped into environments that contain
`.egg-info` packages. While we won't support installing these, it's not
hard to support identifying them (e.g., in `pip freeze`) and
_uninstalling_ them.
Closes https://github.com/astral-sh/uv/issues/2841.
Closes #2928.
Closes #3341.
## Test Plan
Ran `cargo run pip freeze --python
/opt/homebrew/Caskroom/miniforge/base/envs/TEST/bin/python`, with an
environment that includes `pip` as an `.egg-info`
(`/opt/homebrew/Caskroom/miniforge/base/envs/TEST/lib/python3.12/site-packages/pip-24.0-py3.12.egg-info`):
```
cffi @ file:///Users/runner/miniforge3/conda-bld/cffi_1696001825047/work
pip==24.0
pycparser @ file:///home/conda/feedstock_root/build_artifacts/pycparser_1711811537435/work
setuptools==69.5.1
wheel==0.43.0
```
Then ran `cargo run pip uninstall`, verified that `pip` was uninstalled,
and no longer listed in `pip freeze`.
Fixes #3371
It seems like uv doesn't proactively enforce 3.8+ and in most cases just
issues a warning. This PR keeps that property, only adding the new check
when it is known to fail. I checked the imports in this file and the
other ones seem fine.
## Summary
Refreshes some of the activation scripts, and fixes some bugs in
`activate_this.py` that were likely the result of some erroneous
copy-pasting.
Closes https://github.com/astral-sh/uv/issues/3346.
## Test Plan
```
❯ python
Python 3.12.0 (main, Feb 28 2024, 09:44:16) [Clang 15.0.0 (clang-1500.1.0.2.5)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import httpx
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'httpx'
>>> activator = '.venv/bin/activate_this.py'
>>> with open(activator) as f:
... exec(f.read(), {'__file__': activator})
...
>>> import httpx
```
Only allow using `tool.uv.sources` in preview mode, since the design isn't
finalized yet.
Not sure what to label this; do we want a preview section and label for
the release notes?
## Summary
We need to partition the editable and non-editable requirements. As-is,
`editable = true` requirements were still being installed as
non-editable.
## Summary
We were writing the build dependencies into the `--target` directory,
which both made builds fail and led to them leaking into the user's
directory.
Closes https://github.com/astral-sh/uv/issues/3349.
## Introduction
PEP 621 is limited. Specifically, it lacks
* Relative path support
* Editable support
* Workspace support
* Index pinning or any sort of index specification
The semantics of URLs are a custom extension: PEP 440 does not specify
how to use git references or subdirectories; instead, pip has a custom,
stringly-typed format. We need to somehow support these while still staying
compatible with PEP 621.
## `tool.uv.sources`
Drawing inspiration from cargo, poetry and rye, we add `tool.uv.sources`
or (for now stub only) `tool.uv.workspace`:
```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
"tqdm >=4.66.2,<5",
"torch ==2.2.2",
"transformers[torch] >=4.39.3,<5",
"importlib_metadata >=7.1.0,<8; python_version < '3.10'",
"mollymawk ==0.1.0"
]
[tool.uv.sources]
tqdm = { git = "https://github.com/tqdm/tqdm", rev = "cc372d09dcd5a5eabdc6ed4cf365bdb0be004d44" }
importlib_metadata = { url = "https://github.com/python/importlib_metadata/archive/refs/tags/v7.1.0.zip" }
torch = { index = "torch-cu118" }
mollymawk = { workspace = true }
[tool.uv.workspace]
include = [
"packages/mollymawk"
]
[tool.uv.indexes]
torch-cu118 = "https://download.pytorch.org/whl/cu118"
```
See `docs/specifying_dependencies.md` for a detailed explanation of the
format. The basic gist is that `project.dependencies` is what ends up on
PyPI, while `tool.uv.sources` are your non-published additions. We do
support the full range of PEP 508; we just hide it in the docs and
prefer the exploded table for easier readability and less confusion with
actual URL parts.
This format should eventually be able to subsume requirements.txt's
current use cases. While we will continue to support the legacy `uv pip`
interface, this is a piece of uv's own top-level interface. Together
with `uv run` and a lockfile format, you should only need to write
`pyproject.toml` and do `uv run`, which generates/uses/updates your
lockfile behind the scenes, no more pip-style requirements involved. It
also lays the groundwork for implementing index pinning.
## Changes
This PR implements:
* Reading and lowering `project.dependencies`,
`project.optional-dependencies` and `tool.uv.sources` into a new
requirements format, including:
* Git dependencies
* Url dependencies
* Path dependencies, including relative and editable
* `pip install` integration
* Error reporting for invalid `tool.uv.sources`
* Json schema integration (works in pycharm, see below)
* Draft user-level docs (see `docs/specifying_dependencies.md`)
It does not implement:
* No `pip compile` testing, deprioritizing towards our own lockfile
* Index pinning (stub definitions only)
* Development dependencies
* Workspace support (stub definitions only)
* Overrides in pyproject.toml
* Patching/replacing dependencies
One technically breaking change is that we now require a user-provided
pyproject.toml to be valid with respect to PEP 621. Included files still fall
back to PEP 517. That means passing a `pyproject.toml` directly (e.g. `pip
install -r pyproject.toml`) requires it to be valid, while `pip install -r
requirements.txt` with `-e .` as content falls back to PEP 517 as before.
## Implementation
The `pep508` requirement is replaced by a new `UvRequirement` (name up
for bikeshedding, not particularly attached to the uv prefix). The
still-existing `pep508_rs::Requirement` type uses a URL format copied from
pip's requirements.txt and doesn't appropriately capture all the features we
want/need to support. The bulk of the diff is changing the requirement
type throughout the codebase.
We still use `VerbatimUrl` in many places, where we would expect a
parsed/decomposed url type, specifically:
* Reading core metadata except top level pyproject.toml files, we fail a
step later instead if the url isn't supported.
* Allowed `Urls`.
* `PackageId` with a custom `CanonicalUrl` comparison, instead of
canonicalizing urls eagerly.
* `PubGrubPackage`: We eventually convert the `VerbatimUrl` back to a
`Dist` (`Dist::from_url`), instead of remembering the url.
* Source dist types: We use verbatim url even though we know and require
that these are supported urls we can and have parsed.
I tried to improve the situation by replacing `VerbatimUrl`, but
these changes would require massive, invasive changes (see e.g.
https://github.com/astral-sh/uv/pull/3253). A main problem is the
`VersionOrUrl` ref and applying overrides, which assume the same
requirement/URL type everywhere. In its current form, this PR increases
this tech debt.
I've tried to split off PRs and commits, but the main refactoring is
still a single monolith commit to make it compile and the tests pass.
## Demo
Adding
d1ae3b85d5/pyproject.json
as json schema (v7) to pycharm for `pyproject.toml`, you can try the IDE
support already:

[dove.webm](https://github.com/astral-sh/uv/assets/6826232/c293c272-c80b-459d-8c95-8c46a8d198a1)
In *some* places in our crates, `serde` (and `rkyv`) are optional
dependencies. I believe this was done out of reasons of "good sense,"
that is, it follows a Rust ecosystem pattern where serde integration
tends to be an opt-in crate feature. (And similarly for `rkyv`.)
However, ultimately, `uv` itself requires `serde` and `rkyv` to
function. Since our crates are strictly internal, there are limited
consumers for our crates without `serde` (and `rkyv`) enabled. I think
one possibility is that optional `serde` (and `rkyv`) integration means
that someone can do this:
`cargo test -p pep440_rs`
And this will run tests _without_ `serde` or `rkyv` enabled. That in
turn could lead to faster iteration time by reducing compile times. But,
I'm not sure this is worth supporting. The iterative compilation times
of
individual crates are probably fast enough in debug mode, even with
`serde` and `rkyv` enabled. Namely, `serde` and `rkyv` themselves
shouldn't need to be re-compiled in most cases. On `main`:
```
from-scratch: `cargo test -p pep440_rs --lib` 0.685
incremental: `cargo test -p pep440_rs --lib` 0.278s
from-scratch: `cargo test -p pep440_rs --features serde,rkyv --lib` 3.948s
incremental: `cargo test -p pep440_rs --features serde,rkyv --lib` 0.321s
```
So while a from-scratch build does take significantly longer, an
incremental build is about the same.
The benefit of doing this change is two-fold:
1. It brings our crates into alignment with "reality." In particular,
some crates were _implicitly_ relying on `serde` being enabled
without explicitly declaring it. This technically means that our
`Cargo.toml`s were wrong in some cases, but it is hard to observe it
because of feature unification in a Cargo workspace.
2. We no longer need to deal with the cognitive burden of writing
`#[cfg_attr(feature = "serde", ...)]` everywhere.
This PR principally adds a routine for converting a `Lock` to a
`Resolution`, where a `Resolution` is a map of package names pinned to
a specific version.
I'm not sure that a `Resolution` is ultimately what we want here (we
might need more stuff), but this was the quickest route I could find to
plug a `Lock` into our existing `uv pip install` infrastructure.
This commit also does a little refactoring of the `Lock` types. The
main thing is to permit extra state on some of the types (like a
`by_id` map on `Lock` for quick lookups of distributions) that aren't
included in the serialization format of a `Lock`. We achieve this
by defining separate `Wire` types that are automatically converted
to-and-from via `serde`.
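As a rough sketch of the `Wire` pattern, assuming serde's `from` container
attribute (type and field names here are illustrative): the on-disk type
mirrors the serialization format, while the in-memory type carries derived
state (like a lookup map) that never appears in the TOML:
```rust
use std::collections::HashMap;

use serde::Deserialize;

#[derive(Deserialize)]
struct WireLock {
    version: u32,
    distribution: Vec<WireDistribution>,
}

#[derive(Deserialize)]
struct WireDistribution {
    name: String,
    version: String,
}

#[derive(Deserialize)]
#[serde(from = "WireLock")]
struct Lock {
    version: u32,
    distributions: Vec<WireDistribution>,
    // Derived on conversion; not part of the serialization format.
    by_name: HashMap<String, usize>,
}

impl From<WireLock> for Lock {
    fn from(wire: WireLock) -> Self {
        let by_name = wire
            .distribution
            .iter()
            .enumerate()
            .map(|(index, dist)| (dist.name.clone(), index))
            .collect();
        Lock {
            version: wire.version,
            distributions: wire.distribution,
            by_name,
        }
    }
}
```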
Note that like with the lock file format types themselves, we leave a
few `todo!()` expressions around. The main idea is to get something
minimally working without spending too much effort here. (A fair bit
of refactoring will be required to generate a lock file, and it's
not clear how much this code will wind up needing to change anyway.)
In particular, we only handle the case of installing wheels from a
registry.
A demonstration of the full flow:
```
$ requirements.in
anyio
$ cargo run -p uv -- pip compile -p3.10 requirements.in --unstable-uv-lock-file
$ uv venv
$ cargo run -p uv -- pip install --unstable-uv-lock-file anyio -r requirements.in
Installed 5 packages in 7ms
+ anyio==4.3.0
+ exceptiongroup==1.2.1
+ idna==3.7
+ sniffio==1.3.1
+ typing-extensions==4.11.0
```
In order to install from a lock file, we start from the root and do a
breadth first traversal over its dependencies. We aren't yet filtering
on marker expressions (since they aren't in the lock file yet), but we
should be able to add that in the future. In so doing, the traversal
should select only the subset of distributions relevant for the current
platform.
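The traversal itself is nothing fancy; an illustrative (non-uv) version over a
name-to-dependencies map looks like this, with the marker filtering slotting
in where noted once markers land in the lockfile:
```rust
use std::collections::{HashMap, HashSet, VecDeque};

fn reachable<'a>(root: &'a str, dependencies: &HashMap<&'a str, Vec<&'a str>>) -> Vec<&'a str> {
    let mut seen: HashSet<&str> = HashSet::new();
    let mut queue: VecDeque<&str> = VecDeque::new();
    seen.insert(root);
    queue.push_back(root);
    let mut order = Vec::new();
    while let Some(package) = queue.pop_front() {
        order.push(package);
        for &dep in dependencies.get(package).into_iter().flatten() {
            // Marker filtering would happen here, skipping dependencies that
            // don't apply to the current platform.
            if seen.insert(dep) {
                queue.push_back(dep);
            }
        }
    }
    order
}

fn main() {
    let dependencies = HashMap::from([
        ("root", vec!["anyio"]),
        ("anyio", vec!["idna", "sniffio", "exceptiongroup", "typing-extensions"]),
    ]);
    println!("{:?}", reachable("root", &dependencies));
}
```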
Split out of #3266
The "selector" concept doesn't seem well enough defined as-is. For
example, `PythonVersion` belongs there but isn't present. Going for
smaller modules instead.
Split out of https://github.com/astral-sh/uv/pull/3266
If `UV_BOOTSTRAP_DIR` and `CARGO_MANIFEST_DIR` are both unset, we
currently panic. This isn't good once we start to use managed toolchains
in production. We'll need to change this more later once the toolchain
directory is more user-facing.
Moves all of `uv-toolchain` into `uv-interpreter`. We may split these
out in the future, but the refactoring I want to do for interpreter
discovery is easier if I don't have to deal with entanglement. Includes
some restructuring of `uv-interpreter`.
Part of #2386
Another split out from https://github.com/astral-sh/uv/pull/3263. This
abstracts the copy-and-pasted check for whether an installed distribution
satisfies a requirement, used by both plan.rs and site_packages.rs, into a
shared module. It's less useful here than with the new requirement type, but
it helps with reducing the https://github.com/astral-sh/uv/pull/3263 diff size.
Previously, a noop `uv pip install` would only show "Audited {}
package(s)" but no details, not even with `-vv`. Now it debug logs which
requirements were met, and it also debug logs which requirement was
missing and triggered the full routine, allowing us to investigate caching
behaviour.
First `uv pip install -v jupyter`:
```
DEBUG At least one requirement is not satisfied: jupyter
```
Second `uv pip install -v jupyter`:
```
DEBUG Found a virtualenv named .venv at: /home/konsti/projects/uv-main/.venv
DEBUG Cached interpreter info for Python 3.12.1, skipping probing: .venv/bin/python
DEBUG Using Python 3.12.1 environment at .venv/bin/python
DEBUG Trying to lock if free: .venv/.lock
DEBUG Requirement satisfied: anyio
DEBUG Requirement satisfied: anyio>=3.1.0
DEBUG Requirement satisfied: argon2-cffi-bindings
DEBUG Requirement satisfied: argon2-cffi>=21.1
DEBUG Requirement satisfied: arrow>=0.15.0
DEBUG Requirement satisfied: asttokens>=2.1.0
DEBUG Requirement satisfied: async-lru>=1.0.0
DEBUG Requirement satisfied: attrs>=22.2.0
DEBUG Requirement satisfied: babel>=2.10
...
DEBUG Requirement satisfied: webencodings
DEBUG Requirement satisfied: webencodings>=0.4
DEBUG Requirement satisfied: websocket-client>=1.7
DEBUG Requirement satisfied: widgetsnbextension~=4.0.10
DEBUG All editables satisfied:
Audited 1 package in 12ms
```
This will clash with the `tool.uv.sources` PR, i'll rebase it on top.
## Summary
Hi! Added `UV_NO_BUILD_ISOLATION` as a boolean environment variable for
the `--no-build-isolation` command-line option.
Closes https://github.com/astral-sh/uv/issues/3309
## Test Plan
Added a new test, `respect_no_build_isolation_env_var`, to check that the
behaviour is the same as if the `--no-build-isolation`
command-line option is set.
This is meant to be a base on which to build. There are some parts
which are implicitly incomplete and others which are explicitly
incomplete. The latter are indicated by TODO comments.
Here is a non-exhaustive list of incomplete things. In many cases, these
are incomplete simply because the data isn't present in a
`ResolutionGraph`. Future work will need to refactor our resolver so
that this data is correctly passed down.
* Not all wheels are included. Only the "selected" wheel for the current
distribution is included.
* Marker expressions are always absent.
* We don't emit hashes for certain kinds of distributions (direct
URLs, git, and path).
* We don't capture git information from a dependency specification.
Right now, we just always emit "default branch."
There are perhaps also other changes we might want to make to the format
of a more cosmetic nature. Right now, all arrays are encoded using
whatever the `toml` crate decides to do. But we might want to exert more
control over this. For example, by using inline tables or squashing more
things into strings (like I did for `Source` and `Hash`). I think the
main trade-off here is that table arrays are somewhat difficult to read
(especially without indentation), whereas squashing things down into a
more condensed format potentially makes future compatible additions
harder.
I also went lighter on the documentation here than I would
normally. That's primarily because I think this code is going to
go through some evolution and I didn't want to spend too much time
documenting something that is likely to change.
Finally, here's an example of the lock file format in TOML for the
`anyio` dependency. I generated it with the following command:
```
cargo run -p uv -- pip compile -p3.10 ~/astral/tmp/reqs/anyio.in --unstable-uv-lock-file
```
And that writes out a `uv.lock` file:
```toml
version = 1
[[distribution]]
name = "anyio"
version = "4.3.0"
source = "registry+https://pypi.org/simple"
[[distribution.wheel]]
url = "2f20c40b45242c0b33774da0e2e34f/anyio-4.3.0-py3-none-any.whl"
hash = "sha256:048e05d0f6caeed70d731f3db756d35dcc1f35747c8c403364a8332c630441b8"
[[distribution.dependencies]]
name = "exceptiongroup"
version = "1.2.1"
source = "registry+https://pypi.org/simple"
[[distribution.dependencies]]
name = "idna"
version = "3.7"
source = "registry+https://pypi.org/simple"
[[distribution.dependencies]]
name = "sniffio"
version = "1.3.1"
source = "registry+https://pypi.org/simple"
[[distribution.dependencies]]
name = "typing-extensions"
version = "4.11.0"
source = "registry+https://pypi.org/simple"
[[distribution]]
name = "exceptiongroup"
version = "1.2.1"
source = "registry+https://pypi.org/simple"
[[distribution.wheel]]
url = "79fe92dd414cadab75055b0ae00b33/exceptiongroup-1.2.1-py3-none-any.whl"
hash = "sha256:5258b9ed329c5bbdd31a309f53cbfb0b155341807f6ff7606a1e801a891b29ad"
[[distribution]]
name = "idna"
version = "3.7"
source = "registry+https://pypi.org/simple"
[[distribution.wheel]]
url = "741d8c8280948df2ea0eda2c8b79e8/idna-3.7-py3-none-any.whl"
hash = "sha256:82fee1fc78add43492d3a1898bfa6d8a904cc97d8427f683ed8e798d07761aa0"
[[distribution]]
name = "sniffio"
version = "1.3.1"
source = "registry+https://pypi.org/simple"
[[distribution.wheel]]
url = "75a9c94214239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl"
hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2"
[[distribution]]
name = "typing-extensions"
version = "4.11.0"
source = "registry+https://pypi.org/simple"
[[distribution.wheel]]
url = "936e2092671b43269810cd589ceaf5/typing_extensions-4.11.0-py3-none-any.whl"
hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"
```
Resolves https://github.com/astral-sh/uv/issues/3313
## Summary
Add a new environment variable, `UV_LINK_MODE`, as an alias for the CLI
argument `--link-mode`.
Updated the README's environment variables section.
## Test Plan
Tested manually using `UV_LINK_MODE=hardlink cargo run -p uv pip install
flask`.
When running
```
set UV_CACHE_DIR=%LOCALAPPDATA%\uv\cache-foo && uv venv venv
```
in windows CMD, the error would be just
```
error: The system cannot find the path specified. (os error 3)
```
The problem is that the first action in the cache dir is adding the tag,
and the `cachedir` crate is using `std::fs` instead of `fs_err`. I've
copied the two functions we use from the crate and changed the import
from `std::fs` to `fs_err`.
The new error is
```
error: failed to open file `C:\Users\Konstantin\AppData\Local\uv\cache-foo \CACHEDIR.TAG`
Caused by: The system cannot find the path specified. (os error 3)
```
which correctly explains the problem.
Closes #3280
## Summary
It seems like Azure might return a 401 when you request a package that
doesn't exist (even with valid credentials)? But I admittedly haven't
tested this. (We already skip 403, and this seems similar?)
Closes https://github.com/astral-sh/uv/issues/3291.
## Summary
This index strategy resolves every package to the latest possible
version across indexes. If a version is in multiple indexes, the first
available index is selected.
Implements #3137
This closely matches pip.
## Test Plan
Good question. I'm hesitant to use my certifi example here, since that
would inevitably break when torch removes this package. Please comment!
## Summary
The approach taken here is to model `--target` as an install scheme in
which all the directories are just subdirectories of the `--target`.
From there, everything else... just works? Like, upgrade, uninstalls,
editables, etc. all "just work".
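A rough sketch of the idea (directory names here are illustrative; the real
layout differs per platform): every scheme directory is derived from the
`--target` path, so the existing install/uninstall/upgrade machinery doesn't
need to know anything special about it:
```rust
use std::path::{Path, PathBuf};

struct Scheme {
    purelib: PathBuf,
    platlib: PathBuf,
    scripts: PathBuf,
    data: PathBuf,
    include: PathBuf,
}

fn target_scheme(target: &Path) -> Scheme {
    Scheme {
        // Packages land directly in the target directory...
        purelib: target.to_path_buf(),
        platlib: target.to_path_buf(),
        // ...while everything else goes into subdirectories of it.
        scripts: target.join("bin"),
        data: target.join("data"),
        include: target.join("include"),
    }
}

fn main() {
    let scheme = target_scheme(Path::new("./vendor"));
    println!("scripts: {}", scheme.scripts.display());
}
```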
Closes #1517.
## Summary
Based on PEP 508, the `platform_machine` marker should be the same as the
`platform.machine()` output:
https://peps.python.org/pep-0508/#environment-markers
From my macOS M2
```python
In [1]: import platform
In [2]: platform.machine()
Out[2]: 'arm64'
```
In napari we also use `arm64` in requirements and it works as expected:
9fcf63e69a/pyproject.toml (L120)
I rebased https://github.com/astral-sh/uv/pull/2757 then realized that
we want to implement this for more than `uv venv`.
Closes https://github.com/astral-sh/uv/issues/2587
Closes https://github.com/astral-sh/uv/issues/2757
```
❯ cargo run -q -- pip install -p /Users/mz/bin/python3.7 anyio
warning: uv is only compatible with Python 3.8+, found Python 3.7.17.
Audited 1 package in 84ms
❯ cargo run -q -- venv -p /Users/mz/bin/python3.7
warning: uv is only compatible with Python 3.8+, found Python 3.7.17.
Using Python 3.7.17 interpreter at: /Users/mz/bin/python3.7
Creating virtualenv at: .venv
Activate with: source .venv/bin/activate
```
---------
Co-authored-by: Stevie Gayet <stegayet@users.noreply.github.com>
## Summary
Simplifies dependency errors of the form `you require package-a and you
require package-b` to `you require package-a and package-b`. Resolves
https://github.com/astral-sh/uv/issues/1009.
The only thing a `OnceMap` really needs to be able to do with the value
is to clone it. All extant uses benefited from having this done for them
by automatically wrapping values in an `Arc`. But this isn't necessarily
true for all things. For example, a value might have an `Arc` internally
to make cloning cheap in other contexts, and it doesn't make sense to
re-wrap it in an `Arc` just to use it with a `OnceMap`. Or
alternatively, cloning might just be cheap enough on its own that an
`Arc` isn't worth it.
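To illustrate the difference with a toy stand-in (not the real `OnceMap`):
the map only needs `Clone` on the value, so callers decide whether to reach
for an `Arc` or rely on an already-cheap clone:
```rust
use std::collections::HashMap;
use std::sync::Arc;

struct MapLike<K, V: Clone> {
    inner: HashMap<K, V>,
}

impl<K: std::hash::Hash + Eq, V: Clone> MapLike<K, V> {
    fn get(&self, key: &K) -> Option<V> {
        self.inner.get(key).cloned()
    }
}

fn main() {
    // A cheaply clonable value needs no `Arc` wrapper.
    let cheap = MapLike { inner: HashMap::from([("count", 1_u32)]) };
    // An expensive value can still be wrapped in `Arc` explicitly at the call site.
    let shared = MapLike { inner: HashMap::from([("blob", Arc::new(vec![0_u8; 1024]))]) };
    assert_eq!(cheap.get(&"count"), Some(1));
    assert_eq!(shared.get(&"blob").map(|value| value.len()), Some(1024));
}
```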
Previously, this would use the "system" temporary directory.
Because we rename the resulting directory to its final destination,
and because renaming is implemented via hardlinking, and because
hardlinking doesn't work across mount points, and because /tmp is
commonly on a different mount point than where `uv` is checked out,
we "fix" this by putting the temporary directory somewhere close to
the final destination of the fetched artifact.
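A sketch of the workaround, assuming the `tempfile` crate: create the scratch
directory inside the destination's parent so the final rename never crosses a
filesystem boundary:
```rust
use std::{fs, io, path::Path};

fn fetch_into(destination: &Path) -> io::Result<()> {
    let parent = destination.parent().unwrap_or(Path::new("."));
    // `tempdir_in` keeps the temporary directory on the same mount point as
    // the final destination.
    let staging = tempfile::tempdir_in(parent)?;
    // ... download and unpack the artifact into `staging` here ...
    fs::rename(staging.into_path(), destination)?;
    Ok(())
}
```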
There are alternatives we might consider pursuing. For example,
if the `rename` fails, then we should probably do a recursive
directory copy. But this is a quick fix for now, and it is also
consistent with the colocation of other temporary directories in `uv`.
The main downside of this change is that if a user does ^C while
`uv-dev fetch-python` is running, then there is no mechanism for
cleaning up temporary directories.
## Summary
I initially implemented this by allowing arbitrary
`x86_64-manylinux_x_y`, but it makes the Clap, Serde, and Schemars
definitions more complicated, _and_ makes it harder for the user.
Ultimately, manylinux itself only provides images for 2_17 and 2_28, so
this seems like it should be sufficient.
Closes https://github.com/astral-sh/uv/issues/3222.
## Summary
We now recursively expand any self-dependencies via extras, which lets
us detect conflicts sooner and avoid building unnecessary versions of
packages that are excluded via the extra.
Closes https://github.com/astral-sh/uv/issues/3135.
Previously, uv-auth would fail to compile due to a missing process
feature. I chose to make all the tokio features we use top-level features,
so we can share the tokio cache between all test invocations.
## Summary
See the diff in the tests. If you have a constraint with an extra, we
should respect it, but we shouldn't _add_ the extra to the requirements.
## Summary
macOS 11 has been EOL for 7 months, and it seems like it's common to
publish ARM-only wheels at 12.0+. I think this is a better default for
ARM Macs.
Closes https://github.com/astral-sh/uv/issues/3227.
## Test Plan
`cargo run pip install scikit-learn==1.3.2 --python-platform macos
--no-build`
## Summary
The CLI gives `<COLOR>` as the value placeholder, which is misleading. I
changed this to `<COLOR_CHOICE>` to make it obvious that you don't supply a
color but a `ColorChoice` value.
## Test Plan
Compiled and verified that the `--help` text was correct.
## Summary
This is mainly a cleanup PR to leverage uv-version in uv-virtualenv
instead of passing it via `uv`.
In #1852 I introduced the ability to pass extra cfg to `gourgeist` for
the primary purpose of passing the uv version, but since the dawn of the
uv-version crate dynamically passing more values to pyvenv.cfg is no
longer needed.
## Test Plan
Existing `uv` tests should still verify `uv = <version>` exists in the
venv and make sure no regressions were introduced.
## Summary
The function `find_in_directory` already joins `uv.toml`, so the initial join
in `user` is redundant.
Also a small comment fix.
## Test Plan
Not really sure how to test this. Please suggest if any tests need to be
added.
## Summary
Avoid removing quotes from markers, e.g. `numpy (>=1.19) ;
python_version >= "3.7"` should not be rewritten. Fixes
https://github.com/astral-sh/uv/issues/2551.
This PR also makes fixups a bit more flexible internally for fixes that
aren't simple to implement with a pure regex replacement, like this one.
https://github.com/astral-sh/uv/pull/1529 fixed a similar problem but
the current regex is still not smart enough to avoid all markers
completely (like `python_version`).
## Test Plan
Added a few unit tests.
Fixes the failure to lookup credentials in
https://github.com/astral-sh/uv/issues/3205
The issue is that we seed the cache with the index URL which includes a
username but no password. We did not ensure that a password was present
in the cached credentials before attempting a request with them. Now,
the cache will not return credentials when a username is provided and
the cached credentials have no password — the cached credentials are
useless in that case.
Tested with a Google Artifact Registry and keyring
```
RUST_LOG=uv=trace cargo run -q -- pip install requests --index-url https://oauth2accesstoken@us-central1-python.pkg.dev/<project>/pypi/simple/ --no-cache --keyring-provider subprocess -v
```
## Summary
I found some of these too bare (e.g., when they _just_ show a package
name with no other information). For me, this makes it easier to
differentiate error message copy from data. But open to other opinions.
Take a look at the fixture changes and LMK!
## Summary
pip supports providing a `--platform` to `pip install`, which can be
used to seed an environment (e.g., for use in a container or otherwise).
This PR adds `--python-platform` to our commands to support a similar
workflow. It has some caveats, which are documented on the CLI.
Closes #2079.
Since we're now using read timeouts and not total timeouts, we can use a
lower threshold; a single read shouldn't take 5 minutes (or even 10s).
The 10s value is somewhat arbitrary.
Like #3144, this is a breaking change in some sense.
Adds hidden `--preview` / `--no-preview` flags with `UV_PREVIEW`
environment variable support. Copies the `PreviewMode` type from Ruff.
Does a little bit of extra work to port `uv run` to the new settings
model.
Note we allow `uv run` invocations without preview and only use its
presence to toggle an experimental warning.
## Test plan
```
❯ cargo run -q -- run --no-workspace -- python --version
warning: `uv run` is experimental and may change without warning.
Python 3.12.2
❯ cargo run -q -- run --no-workspace --preview -- python --version
Python 3.12.2
❯ UV_PREVIEW=1 cargo run -q -- run --no-workspace -- python --version
Python 3.12.2
```
In #2976 I made some changes that led to regressions:
- We stopped tracking URLs that we had not seen credentials for in the
cache
- This means the cache no longer returns a value to indicate we've seen
a realm before
- We stopped seeding the cache with URLs
- Combined with the above, this means we no longer had a list of
locations that we would never attempt to fetch credentials for
- We added caching of credentials found on requests
- Previously the cache was only populated from the seed or credentials
found in the netrc or keyring
- This meant that the cache was populated for locations that we
previously did not cache, i.e. GitHub artifacts(?)
Unfortunately this unveiled problems with the granularity of our cache.
We cache credentials per realm (roughly the hostname) but some realms
have mixed authentication modes i.e. different credentials per URL or
URLs that do not require credentials. Applying credentials to a URL that
does not require it can lead to a failed request, as seen in #3123 where
GitHub throws a 401 when receiving credentials.
To resolve this, the cache is expanded to support caching at two
levels:
- URL, cached URL must be a prefix of the request URL
- Realm, exact match required
When we don't have URL-level credentials cached, we attempt the request
without authentication first. On failure, we'll search for realm-level
credentials or fetch credentials from external services. This avoids
providing credentials to new URLs unless we know we need them.
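Schematically (this is not uv-auth's real code, just the shape of the lookup
order), the two-level cache and request flow look like this:
```rust
use std::collections::HashMap;

#[derive(Clone)]
struct Credentials {
    username: String,
    password: Option<String>,
}

struct CredentialsCache {
    // URL-level entries: the cached URL must be a prefix of the request URL.
    by_url_prefix: Vec<(String, Credentials)>,
    // Realm-level entries: exact match on the realm (roughly, the host).
    by_realm: HashMap<String, Credentials>,
}

impl CredentialsCache {
    fn get_url(&self, request_url: &str) -> Option<Credentials> {
        self.by_url_prefix
            .iter()
            .find(|(prefix, _)| request_url.starts_with(prefix))
            .map(|(_, credentials)| credentials.clone())
    }

    fn get_realm(&self, realm: &str) -> Option<Credentials> {
        self.by_realm.get(realm).cloned()
    }
}

// Request flow, roughly:
// 1. If `get_url` hits, attach those credentials immediately.
// 2. Otherwise, send the request unauthenticated.
// 3. Only on failure, consult `get_realm` or external sources (netrc, keyring),
//    so credentials are never sent speculatively to new URLs.
```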
Closes https://github.com/astral-sh/uv/issues/3123
## Summary
This PR avoids: (1) using the lookahead resolver when `--no-deps` is
specified (we'll never use those requirements), and (2) including any
transitive requirements when searching for allowed URLs, etc., when
`--no-deps` is specified.
Closes https://github.com/astral-sh/uv/issues/3183.
Previously, we had `pypi_types::DirectUrl` (the PyPA spec's
direct_url.json format) and `distribution_types::DirectUrl` (an enum of
all the URL types we support). This led me to confusion, so I'm
renaming the latter to the more appropriate `ParsedUrl`.
Add a dedicated error type for direct url parsing. This change is broken
out from the new uv requirement type, which uses direct url parsing
internally.
## Summary
This PR adds a `UV_CONSTRAINT` environment variable, analogous to
`PIP_CONSTRAINT`, to allow providing a constraint file via an environment
variable. Implementing this will simplify the adoption of uv in the testing
procedures of projects that I'm involved in (testing using tox).
This was my motivation for opening #1841, which was closed in favor of
#1789, which in turn was closed without implementing this feature.
In this implementation, I have used a space as the separator, analogous to
`pip`. This introduces an obvious problem if the path contains a space.
Another option could be to use a standard separator (`:` on UNIX-like
systems, `;` on Windows). Which one do you prefer?
## Test Plan
This is my first contribution and my first Rust coding experience. It would
be nice if someone could point out how I should implement testing for this.
I can't get this to reproduce on GitHub Actions -- maybe the builds
there differ, or maybe the builds changed since we added this fix? I'll
check locally, but regardless, this is a typo.
Closes #3158.
## Summary
No behavior changes, but the idea here is that we move the argument
normalization code (e.g., create an `Upgrade` struct from `--upgrade`
and `--upgrade-package`) into the `settings.rs` file, where we build the
common settings structs.
This reduces a lot of the logic and duplication across commands in
`main.rs`.
## Summary
pip prefers somewhat-constrained over unconstrained packages... but only
if they're at equal depths in the tree. We don't have a way to track the
latter property yet (I've added a TODO), so for now, we should remove
this constraint -- it seems to be counter-productive.
I've filed https://github.com/astral-sh/uv/issues/3149 as a follow-up.
Closes https://github.com/astral-sh/uv/issues/3143
## Test Plan
- `git clone https://github.com/drivendataorg/zamba.git`
- `echo "-e .[tests]" > req.in`
- `cargo run venv && cargo run pip compile req.in --refresh -n
--python-platform linux --python-version 3.8`
I added distributions to these projects so the commit changed.
We could pin but we want to test for resolution... so we don't. These
are pretty static so this should be rare.
## Summary
This leverages the new `read_timeout` property, which ensures that (like
pip) our timeout is not applied to the _entire_ request, but rather, to
each individual read operation.
Closes: #1921.
See: #1912.
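For reference, a minimal sketch of the client setup, assuming a reqwest
version that exposes `read_timeout` (0.12+); the exact duration is
illustrative:
```rust
use std::time::Duration;

fn build_client() -> reqwest::Result<reqwest::Client> {
    reqwest::Client::builder()
        // Applies per read: a stalled connection fails quickly, while a large
        // download that keeps making progress is unaffected.
        .read_timeout(Duration::from_secs(30))
        .build()
}
```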
This means that a bare `uv run` invocation drops you into a REPL.
This behavior is internally controversial, and may best be served by a
dedicated `uv repl` command. I would imagine it's important to fail if
no command is given in _some_ circumstances, but those may be resolved
by _not_ doing this if we do not detect a TTY.
Regardless, I'm interested in giving this a try for a bit during this
experimental phase.
In addition to the requested requirements, we include requirements from
a `pyproject.toml` file if it exists and install the current directory.
Closes https://github.com/astral-sh/uv/issues/3104
Holy cow does installation / resolution take a ton of options. We
side-step most of them here.
If the current environment satisfies the requirements, it is used.
Otherwise, we create a new environment with the requested dependencies.
## Summary
Following up
- https://github.com/astral-sh/uv/pull/3113
- https://github.com/astral-sh/uv/pull/3115
It looks like the `uv pip compile` command was missed for `UV_SYSTEM_PYTHON`
handling because these two PRs landed close in time, resulting in:
```bash
$ uv --version
uv 0.1.34 (9259eceeb 2024-04-19)
$ UV_SYSTEM_PYTHON=1 uv pip compile --upgrade requirements.in -o requirements.txt
error: invalid value '1' for '--system'
[possible values: true, false]
For more information, try '--help'.
```
Signed-off-by: Jack Cherng <jfcherng@gmail.com>
## Summary
I've wanted to try this for a long time, so decided to give it a shot.
The basic idea is that you can provide a target triple (e.g.,
`--platform x86_64-pc-windows-msvc`) and resolve against that platform,
rather than the currently-running platform. It's functionally similar to
`--python-version`, though a bit simpler since there's no need to engage
with `Requires-Python`.
Our infrastructure is well-setup for this and so, in the end, it's
actually pretty straightforward: for each triple, we just need to
override the markers and platform tags.
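Conceptually (simplified, with only a few triples shown and not uv's actual
types), each supported triple just maps to a fixed set of marker values
instead of probing the running interpreter; note that CPython reports `AMD64`
for `platform_machine` on 64-bit Windows:
```rust
enum TargetTriple {
    X8664PcWindowsMsvc,
    Aarch64AppleDarwin,
    X8664UnknownLinuxGnu,
}

impl TargetTriple {
    fn platform_system(&self) -> &'static str {
        match self {
            TargetTriple::X8664PcWindowsMsvc => "Windows",
            TargetTriple::Aarch64AppleDarwin => "Darwin",
            TargetTriple::X8664UnknownLinuxGnu => "Linux",
        }
    }

    fn platform_machine(&self) -> &'static str {
        match self {
            // CPython reports `AMD64` on 64-bit Windows.
            TargetTriple::X8664PcWindowsMsvc => "AMD64",
            TargetTriple::Aarch64AppleDarwin => "arm64",
            TargetTriple::X8664UnknownLinuxGnu => "x86_64",
        }
    }
}

fn main() {
    let target = TargetTriple::Aarch64AppleDarwin;
    println!("{} / {}", target.platform_system(), target.platform_machine());
}
```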
## Summary
We weren't setting a priority for editables, so they were being visited
last.
I think there's still a problem whereby we're not aggressive enough in
visiting recursive extras (and, in fact, that's making it really hard to
write a test -- I wrote a test, but the most-reduced case still fails,
and I'd need to add a layer of indirection to make it
fail-on-main-but-pass-on-this-branch), but that problem likely already
existed on main prior to #3087, so I just want to get this quick fix out
now.
Closes https://github.com/astral-sh/uv/issues/3127.
## Test Plan
- `git clone https://github.com/cda-tum/mqt-core.git`
- `cargo run venv`
- `cargo run pip install 'scikit-build-core[pyproject]>=0.8.1'
'setuptools_scm>=7' 'pybind11>=2.12' --resolution=lowest-direct`
- `cargo run pip install --no-build-isolation
'-ve.[test,qiskit,evaluation,coverage]' --resolution=lowest-direct`
Given requirements like:
```
black==23.1.0
black[colorama]
```
The resolver will (on `main`) add a dependency on Black, and then try to
use the most recent version of Black to satisfy `black[colorama]`. For
sake of example, assume `black==24.0.0` is the most recent version. Once
the selects this most recent version, it'll fetch the metadata, then
return the dependencies for `black==24.0.0` with the `colorama` extra
enabled. Finally, it will tack on `black==24.0.0` (a dependency on the
base package). The resolver will then detect a conflict between
`black==23.1.0` and `black==24.0.0`, and throw out
`black[colorama]==24.0.0`, and try the next most-recent version.
This is both wasteful and can cause problems, since we're fetching
metadata for versions that will _never_ satisfy the resolver. In the
`apache-airflow[all]` case, I also ran into an issue whereby we were
attempting to build very old versions of `apache-airflow` due to
`apache-airflow[pandas]`, which in turn led to resolution failures.
The solution proposed here is that we create a new proxy package with
exactly two dependencies: one on `black` and one on `black[colorama]`.
Both of these packages must be at the same version as the proxy package,
so the resolver knows much _earlier_ that (in the above example) the
extra variant _must_ match `23.1.0`.
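As a schematic (types and names simplified, not uv's actual resolver code),
the proxy's only dependencies are the base package and the extra variant,
both pinned to the proxy's own version:
```rust
enum ResolverPackage {
    /// The base package, e.g. `black`.
    Base { name: String },
    /// The package with an extra enabled, e.g. `black[colorama]`.
    Extra { name: String, extra: String },
}

/// Dependencies of the proxy for `name[extra]` at `version`: both edges pin
/// the same version, so the resolver learns early that they must agree.
fn proxy_dependencies(name: &str, extra: &str, version: &str) -> Vec<(ResolverPackage, String)> {
    vec![
        (
            ResolverPackage::Base { name: name.to_string() },
            format!("=={version}"),
        ),
        (
            ResolverPackage::Extra { name: name.to_string(), extra: extra.to_string() },
            format!("=={version}"),
        ),
    ]
}

fn main() {
    // For `black[colorama]` considered at 23.1.0, the proxy forces both
    // `black` and `black[colorama]` to 23.1.0.
    let deps = proxy_dependencies("black", "colorama", "23.1.0");
    assert_eq!(deps.len(), 2);
}
```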
## Summary
This PR adds a test that currently leads to an error, but should
successfully resolve as of https://github.com/astral-sh/uv/pull/3100.
The core idea is that if we have a pinned package, we shouldn't try to
build other versions of that package if we have an unconstrained variant
with an extra.
## Summary
This was unintended. We ended up reverting `Option<bool>` everywhere,
but I missed this once since it's in a separate file.
(If you use `Option<bool>`, Clap requires a value, like `--no-cache
true`.)
## Test Plan
`cargo run pip install flask --no-cache`
## Summary
I think these are useful to have for consistency, though the `--system`
variant requires some new threading.
Closes: https://github.com/astral-sh/uv/issues/2242.
## Summary
Right now, we only accept _exactly_ `UV_NATIVE_TLS=true` and
`UV_NATIVE_TLS=false`. `BoolishValueParser` accepts a wider range of
values:
```rust
/// True values are `y`, `yes`, `t`, `true`, `on`, and `1`.
pub(crate) const TRUE_LITERALS: [&str; 6] = ["y", "yes", "t", "true", "on", "1"];
/// False values are `n`, `no`, `f`, `false`, `off`, and `0`.
pub(crate) const FALSE_LITERALS: [&str; 6] = ["n", "no", "f", "false", "off", "0"];
```
I tend to use `0` and `1` personally so this surprised me.
## Summary
resolves https://github.com/astral-sh/uv/issues/3106
## Test Plan
Added a simple test where the password provided in `UV_INDEX_URL` is
hidden in the output as expected.
## Summary
With an alias for backwards compatibility. It's clearer and matches the
setting in the TOML configuration (where `compile` was deemed too
vague).
## Summary
Now that we can pick up configuration values from persistent files, we
need to enable users to _disable_ those values from the CLI. For
example, if a user has `emit_index_url = true` in the configuration
file, they should be able to do `--no-emit-index-url` on the
command-line. This PR adds support for such negations, following the
same patterns we use in Ruff.
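The pattern reduces to a tiny helper (a sketch mirroring the shape of Ruff's
approach rather than copying its code): only an explicitly provided flag
overrides the value from the configuration file:
```rust
fn flag(yes: bool, no: bool) -> Option<bool> {
    match (yes, no) {
        (true, false) => Some(true),
        (false, true) => Some(false),
        // Neither flag given: defer to the configuration file (or the default).
        (false, false) => None,
        // Clap's conflict handling should make this unreachable.
        (true, true) => unreachable!("mutually exclusive flags"),
    }
}

fn main() {
    let from_config = Some(true); // e.g. `emit_index_url = true` in a uv.toml
    let from_cli = flag(false, true); // the user passed `--no-emit-index-url`
    let effective = from_cli.or(from_config).unwrap_or(false);
    assert!(!effective);
}
```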
## Summary
Enables `uv` to read configuration from (e.g.)
`/Users/crmarsh/.config/uv/uv.toml` on macOS and Linux, and
`C:\Users\Charlie\AppData\Roaming\uv\uv.toml` on Windows.
This differs slightly from Ruff, which uses the `Application Support`
directory on macOS. But I've deviated here, based on the preferences
expressed in https://github.com/astral-sh/ruff/issues/10739.
## Summary
This PR adds the structs and logic necessary to respect settings from
the workspace. It's a ton of code, but it's mostly mechanical. And,
believe it or not, I pulled out a few refactors in advance to trim down
the code and complexity.
The highlights are:
- All CLI arguments are now `Option`, so that we can detect whether they
were provided (i.e., we can't let Clap fill in the defaults).
- We now have a `*Settings` struct for each command, which merges the
CLI and workspace options (e.g., `PipCompileSettings`).
I've only implemented `PipCompileSettings` for now. If approved, I'll
implement the others prior to merging, but it's very mechanical and I
both didn't want to do the conversion prior to receiving feedback _and_
realized it would make the PR harder to review.
If a virtual environment does not exist, we will create one for the
duration of the invocation.
Adds an `--isolated` flag to force this behavior (ignoring an existing
virtual environment).
Adds `uv run` which executes a command in your current virtual
environment.
This is a simple first milestone, lots of remaining work and behavior.
The command is hidden.
Fixes https://github.com/astral-sh/uv/issues/3060
## Summary
Allows passing a virtual environment (the path to the directory, rather
than the path to the Python interpreter within the directory) to the
`--python` option of the `uv pip` command.
## Test Plan
Tested manually to confirm that the expected new functionality works.
The test suite still passes after this change.
I don't know how to add tests for a new feature like this. I would be
happy to do so if someone can give me some pointers on how to do it.
## Summary
Source distributions in the .tar.bz2 format are still relatively common
within existing code bases; the most common examples are the Twisted source
distributions up to version 20.3.0. Since the ability to upgrade Twisted to a
more recent version is quite often not available for a given project, we add
support for .tar.bz2 here to still allow `uv` to be a drop-in replacement for
`pip` in these projects.
## Test Plan
The feature was tested both by adding the corresponding test coverage,
and by directly installing a package of interest under a Python version
that doesn't have the corresponding wheel:
```sh
cargo run venv -p python3.8
cargo run pip install Twisted==20.3.0 --no-cache
```
The `--no-cache` argument in the example above serves the purpose of
cleaning the cached information regarding the unsatisfiability of the
requirements, as it may have been cached during some previous attempt to
install this package with a `uv` version that didn't implement this feature
yet.
## Summary
This PR adds basic struct definitions along with a "workspace" concept
for discovering settings. (The "workspace" terminology is used to match
Ruff; I did not invent it.)
A few notes:
- We discover any `pyproject.toml` or `uv.toml` file in any parent
directory of the current working directory. (We could adjust this to
look at the directories of the input files.)
- We don't actually do anything with the configuration yet; but those
PRs are large and I want this to be reviewed in isolation.