## Summary
Passing `--upgrade` to `tool run` is confusing, because it doesn't
upgrade the installed tool. It just causes us to use an isolated tool
environment, which seems wrong.
Closes #3683
Note that our semantics do not exactly match the specification, so that we can
perform algebra on the markers. See the caveats in the documentation
(and in the discussion below).
The test output seems to depend on using Python 3.12.1 specifically.
I'm not sure how it happens, but the patch versions can get out of
sync between CI and local testing. In this case, I had a problem where
the marker expressions emitted locally were tied to Python 3.12.4, but
the tests in CI were tied to Python 3.12.1. Changing the test to require
3.12.1 specifically fixes this.
This initial set is meant to be a basic starting point where we
can test the interaction between 'uv' commands more systematically,
with a specific focus on how the lock file changes.
This adds some variations on 'uv add' and 'uv remove' specifically
for testing changes to the lock file (and not anything else).
We also rejigger 'run_and_format' so that we can use it in other
contexts, particularly for error reporting.
And we add a 'diff_lock' helper for returning the changes made to
a lock file after running a command.
This was only being used in the ecosystem tests. Since we no longer
re-resolve when `uv lock` is run and the lock file already satisfies the
`pyproject.toml`, deterministic checking is avoided by construction. It
was removed everywhere else, so we remove it here as well.
Closes https://github.com/astral-sh/uv/issues/6167
We've been seeing intermittent failures in CI, which we thought were
unexpected HTTP 401s, but which actually look like a panic when handling an
expected HTTP error. I believe the problem is that an early client error
can cause the channel to close, and we crash when we unwrap the result of `send`.
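For illustration, here's a minimal, self-contained Rust sketch (not uv's actual code) of that failure mode and the shape of the fix: once the receiver is dropped after an early error, `send` returns `Err`, and a bare `unwrap()` turns an expected error into a panic.
```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel::<u32>();

    // Simulate a consumer that hits an early (expected) error and bails out,
    // dropping its end of the channel.
    let consumer = thread::spawn(move || drop(rx));
    consumer.join().unwrap();

    // The bug: `tx.send(1).unwrap()` here would panic, because `send` returns
    // `Err` once the receiver is gone.
    //
    // The fix: treat a closed channel as a shutdown signal rather than a bug.
    if tx.send(1).is_err() {
        eprintln!("receiver hung up early; surfacing the original error instead of panicking");
    }
}
```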
## Summary
Fixes #6177
This ensures a `pyproject.toml` file without a `[project]` table is not
a fatal error for `uv venv`, which is just trying to discover/respect
the project's `requires-python` (#5592).
Similarly, any caught `WorkspaceError` is now also non-fatal and instead
prints a warning message (feedback welcome here; this felt less surprising
than e.g. a malformed `pyproject.toml` breaking `uv venv`).
## Test Plan
I added two test cases: `cargo test -p uv --test venv`
Also, existing venv tests were failing for me since I use fish and the
printed activation script was `source .venv/bin/activate.fish` (to
repro, just run the tests with `SHELL=fish`). So I added an insta filter
to normalize that.
## Summary
PR #4533 introduced (almost) spec-compliant parsing of `.egg-info`
filenames, but added the overly strict requirement that the distribution
version must be present. This causes various `uv pip` operations to fail
in environments where there are `.egg-info` files without a version
component, so this PR loosens the check by making the version component
optional and reading the version from the egg metadata when it is not present.
As an example of the issue, running `uv pip list` on my system currently
results in
```
error: Failed to read metadata from: `/usr/lib/python3.12/site-packages/PySide6.egg-info`
Caused by: The `.egg-info` filename "PySide6.egg-info" is missing a version
```
whereas regular `pip list` succeeds:
```
$ pip list | rg -S pyside
PySide6 6.7.2
```
## Test Plan
This has been tested by altering the `.egg-info` filename tests as
needed and ensuring the full test suite passes locally.
Resolves #6151
## Test Plan
Execution result of `cargo run -- help`
```bash
An extremely fast Python package manager.
Usage: uv [OPTIONS] <COMMAND>
Commands:
run Run a command or script (experimental)
init Create a new project (experimental)
add Add dependencies to the project (experimental)
remove Remove dependencies from the project (experimental)
sync Update the project's environment (experimental)
lock Update the project's lockfile (experimental)
tree Display the project's dependency tree (experimental)
tool Run and install commands provided by Python packages (experimental)
python Manage Python versions and installations (experimental)
pip Manage Python packages with a pip-compatible interface
venv Create a virtual environment
cache Manage uv's cache
version Display uv's version
generate-shell-completion Generate shell completion
help Display documentation for a command
...
```
Execution result of `cargo run -- -h` and `cargo run -- --help`
```bash
An extremely fast Python package manager.
Usage: uv [OPTIONS] <COMMAND>
Commands:
run Run a command or script (experimental)
init Create a new project (experimental)
add Add dependencies to the project (experimental)
remove Remove dependencies from the project (experimental)
sync Update the project's environment (experimental)
lock Update the project's lockfile (experimental)
tree Display the project's dependency tree (experimental)
tool Run and install commands provided by Python packages (experimental)
python Manage Python versions and installations (experimental)
pip Manage Python packages with a pip-compatible interface
venv Create a virtual environment
cache Manage uv's cache
version Display uv's version
help Display documentation for a command
...
```
## Summary
In the resolver, we use release-only semantics to normalize
`python_full_version`. So, if we see `python_full_version < '3.13'`, we
treat that as `(Unbounded, Exclude(3.13))`. `3.13b0` evaluates as `true`
to that range, so we were accepting pre-releases for these markers.
Instead, we need to exclude pre-release segments when performing these
evaluations.
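As a rough illustration (with simplified version tuples, not uv's PEP 440 types), the fix compares only the release segment when evaluating a pre-release interpreter against a release-only bound:
```rust
/// Simplified stand-in for a PEP 440 version: a `(major, minor)` release tuple
/// plus an optional pre-release number (3.13b0 is `release: (3, 13), pre: Some(0)`).
struct Version {
    release: (u64, u64),
    #[allow(dead_code)] // kept only to show what the evaluation deliberately ignores
    pre: Option<u64>,
}

/// Evaluate `python_full_version < upper` under release-only semantics.
/// Previously we compared the full version (so 3.13b0 < 3.13 was `true`);
/// the fix compares the release segment only, excluding the pre-release part.
fn satisfies_exclusive_upper_bound(version: &Version, upper: (u64, u64)) -> bool {
    version.release < upper
}

fn main() {
    // A CPython 3.13 pre-release no longer satisfies `python_full_version < '3.13'`.
    let pre_release = Version { release: (3, 13), pre: Some(0) };
    assert!(!satisfies_exclusive_upper_bound(&pre_release, (3, 13)));

    // A final 3.12.x release still does.
    let stable = Version { release: (3, 12), pre: None };
    assert!(satisfies_exclusive_upper_bound(&stable, (3, 13)));
}
```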
Closes https://github.com/astral-sh/uv/issues/6169.
## Test Plan
Hard to write a test for this because you need a pre-release Python
locally... so:
`echo "sqlalchemy==2.0.32" | cargo run pip compile - --python 3.13 -n`
Resolves #6152
## Summary
## Test Plan
Execution result of `cargo run generate-shell-completion --help`
```bash
Generate shell completion
Usage: uv generate-shell-completion <SHELL>
Arguments:
<SHELL> The shell to generate the completion script for [possible values: bash, elvish, fish, nushell, powershell, zsh]
```
Execution result of `cargo run help generate-shell-completion`
```bash
Generate shell completion
Usage: uv generate-shell-completion <SHELL>
Arguments:
<SHELL>
The shell to generate the completion script for
[possible values: bash, elvish, fish, nushell, powershell, zsh]
```
## Summary
Resolves https://github.com/astral-sh/uv/issues/4537
- First commit avoids overwriting dependencies with different markers.
- Second commit supports adding from requirements files.
## Test Plan
`cargo test`
Now that these incompatibilities are collected into a single range
(https://github.com/astral-sh/uv/pull/6154), we can simplify the range
using the known available versions to reduce verbosity.
There were different `PubGrubPackage` types so they never matched the
available versions set! Luckily, the available versions are agnostic to
the markers and optional dependencies so we can just broaden to using
`PackageName` as a lookup key.
Addresses yet another complaint in
https://github.com/astral-sh/uv/issues/5046
I need this for debugging error messages.
I used an environment variable instead of a trace log so you can do
`UV_INTERNAL__SHOW_DERIVATION_TREE=1` and run a test to see the tree in
the test snapshot without further changes.
e.g.
```rust
// Resolving should fail.
uv_snapshot!(context.filters(), context.lock().arg("--preview").current_dir(&workspace), @r###"
success: false
exit_code: 1
----- stdout -----
UV_INTERNAL__SHOW_DERIVATION_TREE
root==0a0.dev0 depends on foo*
root==0a0.dev0 depends on bar[some-extra]*
foo==0.1.0 depends on anyio==4.1.0
bar[some-extra]==0.1.0 depends on anyio==4.2.0
no versions of bar[some-extra]<0.1.0 | >0.1.0
----- stderr -----
Using Python 3.12.[X] interpreter at: [PYTHON-3.12]
× No solution found when resolving dependencies:
╰─▶ Because only bar[some-extra]==0.1.0 is available and bar[some-extra] depends on anyio==4.2.0, we can conclude that all versions of bar[some-extra] depend on anyio==4.2.0.
And because foo depends on anyio==4.1.0, we can conclude that foo and all versions of bar[some-extra] are incompatible.
And because your workspace requires bar[some-extra] and foo, we can conclude that your workspace's requirements are unsatisfiable.
"###
);
```
We have bad error messages for optional (extra) dependencies and
development dependencies in workspaces:
1. We weren't showing the full package, so we'd drop `:dev` and
`[extra]` by accident
2. We didn't include derived packages, e.g., `member[extra]`, in the
tree-processing collapse operation, so we'd include extra clauses like the
ones we removed in #6092
Also
- Reverts
f0de4f71f2
— it turns out it wasn't quite correct and it didn't seem worth using
the custom incompatibility anymore.
- Fixes a bug in the display of `package:dev` which was not showing
`:dev` for some variants (see 94d8020b58)
## Summary
Normalize all `python_version` markers to their equivalent
`python_full_version` form. This avoids false positives in forking
because we currently cannot detect any relationships between the two
forms. It also avoids subtle bugs due to the truncating semantics of
`python_version`. For example, given `requires-python = ">3.12"`, we
currently simplify the marker `python_version <= '3.12'` to `false`.
However, the version `3.12.1` will be truncated to `3.12` for
`python_version` comparisons, and thus it satisfies the Python
requirement and evaluates the marker to `true`.
It is possible to simplify back to `python_version` when writing markers
to the lockfile. However, the equivalent `python_full_version` markers
are often clearer and easier to simplify, so I lean towards leaving them
as `python_full_version`.
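To make the truncating semantics concrete, here is an illustrative sketch (not uv's actual `python_version_to_full_version`, and assuming a `3.x` major version) of how each `python_version` comparison maps to an equivalent `python_full_version` range:
```rust
// Illustrative only: rewrite a `python_version` comparison on `3.<minor>` as a
// `python_full_version` range. Pre-release edge cases are ignored here.
fn python_version_marker_to_full(op: &str, minor: u64) -> String {
    match op {
        // `python_version == '3.12'` matches every 3.12.x patch release.
        "==" => format!(
            "python_full_version >= '3.{minor}' and python_full_version < '3.{}'",
            minor + 1
        ),
        // `python_version <= '3.12'` includes 3.12.1, 3.12.4, ... so the
        // full-version bound is the *next* minor, exclusive.
        "<=" => format!("python_full_version < '3.{}'", minor + 1),
        "<" => format!("python_full_version < '3.{minor}'"),
        ">=" => format!("python_full_version >= '3.{minor}'"),
        // `python_version > '3.12'` excludes all of 3.12.x.
        ">" => format!("python_full_version >= '3.{}'", minor + 1),
        _ => unimplemented!("only comparison operators are sketched here"),
    }
}

fn main() {
    // The example from the summary: with `requires-python = ">3.12"`,
    // `python_version <= '3.12'` must *not* simplify to false, because it is
    // equivalent to `python_full_version < '3.13'`, which 3.12.1 satisfies.
    assert_eq!(
        python_version_marker_to_full("<=", 12),
        "python_full_version < '3.13'"
    );
}
```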
There are *a lot* of snapshot updates from this change. I'd like more
eyes on the transformation logic in `python_version_to_full_version` to
ensure that they are all correct.
Resolves https://github.com/astral-sh/uv/issues/6125.
While it's slightly more convenient to log this where we did before, the
message was pretty unhelpful, e.g.
```
DEBUG Interpreter meets the requested Python: `Python >=3.9`
```
What interpreter are we referring to here?
Includes the changes from https://github.com/astral-sh/uv/pull/6071 but
takes them way further.
When we have the set of available versions for a package, we can do a
much better job displaying an error.
For example:
```
❯ uv add 'httpx>999,<9999'
× No solution found when resolving dependencies:
╰─▶ Because only the following versions of httpx are available:
httpx<=999
httpx>=9999
and example==0.1.0 depends on httpx>999,<9999, we can conclude that example==0.1.0 cannot be used.
And because only example==0.1.0 is available and you require example, we can conclude that the requirements are unsatisfiable.
```
The resolver has demonstrated that the requested range cannot be used
because there are only versions in ranges _outside_ the requested range.
However, the display of the range of available versions is pretty bad!
We say there are versions of httpx available in ranges that definitely
have no versions available.
With this pull request, the error becomes:
```
❯ uv add 'httpx>999,<9999'
× No solution found when resolving dependencies:
╰─▶ Because only httpx<=1.0.0b0 is available and example depends on httpx>999,<9999, we can conclude that example's
requirements are unsatisfiable.
And because your workspace requires example, we can conclude that your workspace's requirements are unsatisfiable.
```
We achieve this by:
1. Dropping ranges disjoint with the range of available versions, e.g.,
this removes `httpx>=9999`
2. Replacing ranges that capture the _entire_ range of available
versions with the smaller range, e.g., this replaces `httpx<=999` with
`<=1.0.0b0`.
~Note that when we perform (2), we may include an additional bound that
is not relevant, e.g., we include the lower bound of `>=0.6.7`. This is
a bit extraneous, but I don't think it's confusing. We can consider some
advanced logic to avoid that later.~ (edit: I did this, it wasn't hard)
We also improve error messages when there is _only_ one version
available by showing that version instead of a range.
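Here is an illustrative sketch of those two steps using plain numeric intervals in place of uv's PubGrub version ranges; the types and numbers below are stand-ins, not the real implementation:
```rust
// Closed numeric intervals as stand-ins for version ranges.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Interval {
    lo: f64,
    hi: f64,
}

impl Interval {
    fn disjoint(self, other: Interval) -> bool {
        self.hi < other.lo || other.hi < self.lo
    }
    fn contains(self, other: Interval) -> bool {
        self.lo <= other.lo && other.hi <= self.hi
    }
}

/// Simplify the ranges reported by the resolver against the set of versions
/// that actually exist for the package.
fn simplify(reported: Vec<Interval>, available: Interval) -> Vec<Interval> {
    reported
        .into_iter()
        // Step 1: drop ranges with no available versions at all (e.g. `httpx>=9999`).
        .filter(|range| !range.disjoint(available))
        // Step 2: if a range covers every available version, report the smaller
        // available range instead (e.g. `httpx<=999` becomes `httpx<=1.0.0b0`).
        .map(|range| if range.contains(available) { available } else { range })
        .collect()
}

fn main() {
    // Available httpx versions span, say, 0.6.7 through 1.0.0b0 (0.67..=1.0 here).
    let available = Interval { lo: 0.67, hi: 1.0 };
    let reported = vec![
        Interval { lo: f64::NEG_INFINITY, hi: 999.0 }, // "httpx<=999"
        Interval { lo: 9999.0, hi: f64::INFINITY },    // "httpx>=9999"
    ];
    assert_eq!(simplify(reported, available), vec![available]);
}
```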
## Summary
Gives the caller control over how messages are reported back to the
user. Also merges the index-location validation into the lock, since
we're already iterating over the packages.
## Summary
This is no longer required since we no longer implement `Eq` on `Lock`.
It will also sometimes be "wrong" as of #6076, since we now apply
different `requires-python` filtering to different parts of the tree
during resolution.
## Summary
Using https://github.com/astral-sh/uv/issues/6064 as a motivating
example: at present, on main, we're not properly propagating the
`Requires-Python` simplifications. In that case, for example, we end up
solving for a branch with `python_version < 3.11`, and a branch `>=
3.11`, even though `Requires-Python` is `>=3.11`. Later, when we get to
the graph, we apply version simplification based on `Requires-Python`,
which causes us to _remove_ the `python_version < 3.11` markers
entirely, leaving us with duplicate dependencies for `pylint`.
This PR instead tries to ensure that we always apply this narrowing to
requirements and forks, so that we don't need to apply the same
simplification when constructing the graph at all.
Closes https://github.com/astral-sh/uv/issues/6064.
Closes #6059.
In particular, I added this as a hack to avoid a kind of
instability that was caused by our marker code not correctly
detecting markers that were always false. But that has since
been fixed.
Removing this code doesn't change any tests. Arguably it
should be possible to come up with a test that failed with
this hack inserted but succeeded without it. In particular,
with this hack, new forks were being prevented from being
added even when they ought to be added, e.g., when preferences
get updated.
## Summary
This PR changes the definition of `--locked` from:
> Produces the same `Lock`
To:
> Passes `Lock::satisfies`
This is a subtle but important difference. Previously, if
`Lock::satisfies` failed, we would run a resolution, then do
`existing_lock == lock`. If the two weren't equal, and `--locked` was
specified, we'd throw an error.
The equality check is hard to get right. For example, it means that we
can't ship #6076 without changing our marker representation, since the
deserialized lockfile "loses" some of the internal marker state that
gets accumulated during resolution.
The downside of this change is that there could be scenarios in which
`uv lock --locked` fails even though the lockfile would actually work
and the exact TOML would be unchanged. But... I think it's ok if
`--locked` fails after the user modifies something?
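A minimal sketch of the new control flow, using hypothetical stand-in types rather than uv's real `Lock` and workspace APIs:
```rust
// Hypothetical, heavily simplified stand-ins for uv's types.
struct Lock;
struct Workspace;

enum Satisfies {
    Satisfied,
    Unsatisfied(String),
}

impl Lock {
    fn satisfies(&self, _workspace: &Workspace) -> Satisfies {
        // In uv this checks members, requirements, sources, etc.
        Satisfies::Satisfied
    }
}

fn check_lock(lock: &Lock, workspace: &Workspace, locked: bool) -> Result<(), String> {
    match lock.satisfies(workspace) {
        // The lockfile still satisfies the workspace: reuse it as-is.
        Satisfies::Satisfied => Ok(()),
        // With `--locked`, any mismatch is now an error; we no longer re-resolve
        // and compare lockfiles for equality.
        Satisfies::Unsatisfied(reason) if locked => {
            Err(format!("lockfile is out of date: {reason}"))
        }
        // Without `--locked`, fall through to a fresh resolution.
        Satisfies::Unsatisfied(_) => {
            // ...re-resolve and rewrite `uv.lock` here...
            Ok(())
        }
    }
}
```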
Extends https://github.com/astral-sh/uv/pull/6092 to improve resolver
error messages for workspaces that have a single member.
As before, this requires a two-step approach of
1. Traversing the derivation tree and collapsing some members. In this
case, we drop the empty root node in favor of the project.
2. Using special-case formatting for packages. In this case, the
workspace package is referred to with "your project" instead of its
name.
An extension of #6090 that replaces #6066.
In brief,
1. Workspace member names are passed to the resolver for no solution
errors
2. There is a new derivation tree pre-processing step that trims
`NoVersion` incompatibilities for workspace members from the derivation
tree. This avoids showing redundant clauses like `Because only
bird==0.1.0 is available and bird==0.1.0 depends on anyio==4.3.0, we can
conclude that all versions of bird depend on anyio==4.3.0.`. As a minor
note, we use a custom incompatibility kind to mark these
incompatibilities at resolution-time instead of afterwards.
3. Root dependencies on workspace members say `your workspace requires
bird` rather than `you require bird`
4. Workspace member package display omits the version, e.g., `bird`
instead of `bird==0.1.0`
5. Instead of reporting a workspace member as unusable we note that its
requirements cannot be solved, e.g., `bird's requirements are
unsatisfiable` instead of `bird cannot be used`.
6. Instead of saying `your requirements are unsatisfiable` we say `your
workspace's requirements are unsatisfiable` when in a workspace, since
we're not in a "provide direct requirements" paradigm.
As an annoying but minor implementation detail, `PackageRange` now
requires access to the `PubGrubReportFormatter` so it can determine if
it is formatting a workspace member or not. We could probably improve
the abstractions in the future.
As a follow-up, we should add additional special casing for "single project"
workspaces to avoid mention of the workspace concept in simple projects.
However, it looks like this will require additional tree manipulations
so I'm going to keep it separate.
## Summary
Historically, in order to "resolve from a lockfile", we've taken the
lockfile, used it to pre-populate the in-memory metadata index, then run
a resolution. If the resolution didn't match our existing resolution, we
re-resolved from scratch.
This was an appealing approach because (in theory) it didn't require any
dedicated logic beyond pre-populating the index. However, it's proven to
be _really_ hard to get right, because it's a stricter requirement than
we need. We just need the current lockfile to _satisfy_ the requirements
provided by the user. We don't actually need a second resolution to
produce the exact same result. And it's not uncommon that this second
resolution differs, because we seed it with preferences, which
fundamentally changes its course. We've worked hard to minimize those
"instabilities", but they're still present.
The approach here is intended to be much simpler. Instead of resolving
from the lockfile, we just check if the current resolution satisfies the
state of the workspace. Specifically, we check if the lockfile (1)
contains all the relevant members, and (2) matches the metadata for all
dependencies, recursively. (We skip registry dependencies, assuming that
they're immutable.)
This may actually be too conservative, since we can have resolutions
that satisfy the requirements, even if the requirements have changed
slightly. But we want to bias towards correctness for now.
My hope is that this scheme will be more performant, simpler, and more
robust.
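As a rough sketch (with hypothetical, heavily simplified types), the satisfaction check boils down to two passes over the lockfile:
```rust
use std::collections::BTreeMap;

// Hypothetical, simplified stand-in for a locked package.
struct LockedPackage {
    requirements: Vec<String>,
    from_registry: bool,
}

fn lock_satisfies(
    lock: &BTreeMap<String, LockedPackage>,
    members: &[&str],
    current_requirements: &BTreeMap<String, Vec<String>>,
) -> bool {
    // (1) Every workspace member must be present in the lockfile.
    if !members.iter().all(|member| lock.contains_key(*member)) {
        return false;
    }
    // (2) For mutable (non-registry) packages, the locked metadata must still
    //     match what we'd read right now; registry packages are assumed immutable.
    lock.iter().all(|(name, package)| {
        package.from_registry
            || current_requirements.get(name).map(Vec::as_slice)
                == Some(package.requirements.as_slice())
    })
}

fn main() {
    let mut lock = BTreeMap::new();
    lock.insert(
        "bird".to_string(),
        LockedPackage { requirements: vec!["anyio>=4".to_string()], from_registry: false },
    );

    // Unchanged requirements: the lockfile is reused without resolving.
    let current = BTreeMap::from([("bird".to_string(), vec!["anyio>=4".to_string()])]);
    assert!(lock_satisfies(&lock, &["bird"], &current));

    // Editing the member's requirements invalidates it.
    let edited = BTreeMap::from([("bird".to_string(), vec!["anyio>=4,<5".to_string()])]);
    assert!(!lock_satisfies(&lock, &["bird"], &edited));
}
```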
Closes https://github.com/astral-sh/uv/issues/6063.
## Summary
This was added in https://github.com/astral-sh/uv/pull/5405 but is now
the cause of an instability in `github_wikidata_bot`. Specifically, on
the initial run, we fork in `pydantic==2.8.2`, via:
```
Requires-Dist: typing-extensions>=4.12.2; python_version >= '3.13'
Requires-Dist: typing-extensions>=4.6.1; python_version < '3.13'
```
In the end, we resolve a single version of `typing-extensions`
(`4.12.2`)... But we don't recognize the two resolutions as the "same
graph", because we propagate the fork markers, and so the "edges" have
different markers on them...
In the second run through, when we have the forks in advance, we don't
split on Pydantic... We just try to solve from the root with the current
forks. This is fundamentally different and I fear it will be the cause
of many instabilities. But removing this graph check fixes the proximate
issue.
I don't really understand why this was added since there was no test
coverage in the PR.
## Summary
When constructing the `Resolution`, we only propagated the fork markers
to the package node, but not the extras node. This led to cases in which
an extra could be included unconditionally or otherwise diverge from the
base package version.
Closes https://github.com/astral-sh/uv/issues/6062.
## Summary
Right now, we store the environment markers in a `BTreeSet` -- so
they're sorted, but the sort doesn't really tell us anything. I think we
should instead store them in the order in which we solved. I thought
this might fix an instability (it didn't), but I think it's still good
to ensure we solve in the same order.
I also changed from `Option<Vec>` to just `Vec`, since there was no
distinction between `None` and empty.
## Summary
We retain them if you use `--raw-sources`, but otherwise they're
removed. We still respect them in the subsequent `uv.lock` via an
in-process store.
Closes #6056.
## Summary
Now, if you resolve against a registry, then swap it out for another, we
won't reuse the lockfile. (If you don't provide any registry
configuration, then we won't enforce this, so that `uv lock --index-url
foo` and `uv lock` is stable.)
Closes https://github.com/astral-sh/uv/issues/5920.
This example came up in discussion and it was initially unclear whether
we should try to support it. Specifically, by automatically assuming
that the `datasets < 2.19` dependency had a marker corresponding to the
negation of the conjunction of the other sibling markers for that same
package. But this was deemed, I think, a little too magical.
This in turn implies that whenever there are sibling dependencies with
overlapping marker expressions, their version constraints also need to
be overlapping. Otherwise, for any marker environment that matches both
marker expressions, it would be impossible to select a single version.
The test in this case has this comment:
```
/// If a dependency requests a prerelease version with an overlapping marker expression,
/// we should prefer the prerelease version in both forks.
```
With this setup:
```
let pyproject_toml = context.temp_dir.child("pyproject.toml");
pyproject_toml.write_str(indoc! {r#"
[project]
name = "example"
version = "0.0.0"
dependencies = [
"cffi >= 1.17.0rc1 ; os_name == 'Linux'"
]
requires-python = ">=3.11"
"#})?;
let requirements_in = context.temp_dir.child("requirements.in");
requirements_in.write_str(indoc! {"
cffi
.
"})?;
```
The change in this commit _seems_ more correct than what we had,
although it does seem to contradict the comment. Namely, in the `os_name
!= "Linux"` fork, we don't prefer the pre-release version since the
`cffi >= 1.17.0rc1` bound doesn't apply.
It's not quite clear what to do in this instance.
I believe these are all changes that aren't necessarily
expected, but also seem harmless. Like the order in which
fork markers are written to the lock file. (Although one
wonders if we should fix that once and for all by defining
a complete sort function for forks.)
At a high level, this PR adds a smattering of new tests that
effectively snapshot the output of `uv lock` for a selection of
"ecosystem" projects. That is, real Python projects for which we expect
`uv` to work well with.
The main idea with these tests is to get a better idea of how changes
in `uv` impact the lock files of real world projects. For example,
we're hoping that these tests will help give us data for how #5733
differs from #5887.
This has already revealed some bugs. Namely, re-running `uv lock` for a
second time will produce a different lock file for some projects. So to
prioritize getting the tests added, for those projects, we don't do the
deterministic checking.
## Summary
Our current handling of `--find-links` merges the entries in each index.
As a result, we can end up with `AnnotatedDist` entries that reference
distributions across indexes.
I'd like to change `--find-links` such that each `--find-links` entry is
just treated as its own index (so, e.g., if `requests` exists in the
first `--find-links` entry, we don't even check the registry by
default), which would _also_ fix this problem automatically. But that's
a behavior change... So for now, in the lockfile, we filter
distributions that don't match the source index URL.
There are two cases to consider:
- There's a source distribution. Then, for the ID to reference the
`--find-links` registry, the source distribution _must_ have come from
the `--find-links` entry, so it's fine to discard any wheels from the
"wrong" registry without breaking any compatibility guarantees.
- There's no source distribution. Then the best wheel must come from the
`--find-links` registry. We might lose some platform coverage by
discarding the other wheels, but it shouldn't break any of the
"guarantees", since we have at least one wheel that fits in the version
range.
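A small illustrative sketch of the filtering step, using hypothetical types rather than uv's lockfile structures:
```rust
// Hypothetical stand-in for a locked wheel entry.
struct Wheel {
    filename: &'static str,
    index: &'static str,
}

fn matching_wheels(source_index: &str, wheels: Vec<Wheel>) -> Vec<Wheel> {
    wheels
        .into_iter()
        // Discard wheels that were merged in from a different `--find-links`
        // entry or registry; keeping them would tie the package ID to
        // distributions from the "wrong" index.
        .filter(|wheel| wheel.index == source_index)
        .collect()
}

fn main() {
    let wheels = vec![
        Wheel { filename: "example-1.0-py3-none-any.whl", index: "./find-links" },
        Wheel { filename: "example-1.0-cp312-cp312-manylinux_2_17_x86_64.whl", index: "https://pypi.org/simple" },
    ];
    let kept = matching_wheels("./find-links", wheels);
    assert_eq!(kept.len(), 1);
    assert_eq!(kept[0].filename, "example-1.0-py3-none-any.whl");
}
```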
Closes https://github.com/astral-sh/uv/issues/6015.
## Summary
Added the actual error message to the warning when uv fails to parse
`pyproject.toml`.
Resolves https://github.com/astral-sh/uv/issues/5934
## Test Plan
Took the case from the issue:
- have `pyproject.toml` which contains
```
[tool.uv]
foobar = false
```
- Run
```
$ uv venv --preview -v
```
- Expect the message that contains the actual problem in the
`pyproject.toml` like:
```
warning: Failed to parse `pyproject.toml` during settings discovery: unknown field `foobar`; skipping...
```
## Summary
A lot of the existing tests were no-ops. For convenience, we now use the
trick of: install from Test PyPI (to get an outdated "latest"), then
upgrade from PyPI.
Right now, the URL gets out-of-sync with the install path, since the
install path is canonicalized. This leads to a subtle error on Windows
(in CI) in which we don't preserve caching across resolution and
installation.
Surprisingly, this is a lockfile schema change: We can't store relative
paths in URLs, so we have to store a `filename` entry instead of the
whole URL.
Fixes #4355
## Summary
We now persist the `ResolverInstallerOptions` when writing out a tool
receipt. When upgrading, we grab the saved options, and merge with the
command-line arguments and user-level filesystem settings (CLI > receipt
> filesystem).
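A minimal sketch of that precedence rule, using plain `Option` fields as stand-ins for the real `ResolverInstallerOptions` and a `combine` helper in the spirit of uv's settings merging:
```rust
// Illustrative stand-in for a small slice of the persisted options.
#[derive(Clone, Default)]
struct Options {
    resolution: Option<String>,
    exclude_newer: Option<String>,
}

impl Options {
    // Take values from `self` first, falling back to `other` for anything unset.
    fn combine(self, other: Options) -> Options {
        Options {
            resolution: self.resolution.or(other.resolution),
            exclude_newer: self.exclude_newer.or(other.exclude_newer),
        }
    }
}

fn main() {
    let cli = Options { resolution: Some("lowest-direct".into()), ..Default::default() };
    let receipt = Options {
        resolution: Some("highest".into()),
        exclude_newer: Some("2024-08-01".into()),
    };
    let filesystem = Options::default();

    // CLI wins over the receipt, which wins over filesystem settings.
    let merged = cli.combine(receipt).combine(filesystem);
    assert_eq!(merged.resolution.as_deref(), Some("lowest-direct"));
    assert_eq!(merged.exclude_newer.as_deref(), Some("2024-08-01"));
}
```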
The loose consensus is that "fetch" doesn't have much meaning and that a
boolean flag makes more sense from the command line.
1. Adds `--allow-python-downloads` (hidden, default) and
`--no-python-downloads` to the CLI to quickly enable or disable
downloads
2. Deprecates `--python-fetch` in favor of the options from (1)
3. Removes `python-fetch` in favor of a `python-downloads` setting
4. Adds a `never` variant to the enum, allowing even explicit installs
to be disabled via the configuration file
## Test Plan
I tested this with various `pyproject.toml`-level settings and `uv venv
--preview --python 3.12.2` and `uv python install 3.12.2` with and
without the new CLI flags.
Warn when there are missing bounds on transitive dependencies with
`--resolution lowest`.
Implemented as a lazy resolution graph check. Dev deps are odd because
they are missing the edge from the root that extras have (they are
currently orphans in the resolution graph), but this is more complex to
solve properly because we can put dev dep information in a `Requirement`
so I special-cased them here.
Closes #2797
Should help with #1718
---------
Co-authored-by: Ibraheem Ahmed <ibraheem@ibraheem.ca>
## Summary
This PR rewrites the `MarkerTree` type to use algebraic decision
diagrams (ADD). This has many benefits:
- The diagram is canonical for a given marker function. It is impossible
to create two functionally equivalent marker trees that don't refer to
the same underlying ADD. This also means that any trivially true or
unsatisfiable markers are represented by the same constants.
- The diagram can handle complex operations (conjunction/disjunction) in
polynomial time, as well as constant-time negation.
- The diagram can be converted to a simplified DNF form for user-facing
output.
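To illustrate the core idea, here is a self-contained sketch of a plain boolean BDD with hash-consing, a simplified cousin of the ADD used here (no version ranges, just boolean variables; this is not uv's implementation). The point is the canonicity property: equivalent expressions always intern to the same node, and tautologies collapse to the `TRUE` terminal.
```rust
use std::collections::HashMap;

// Node ids 0 and 1 are the canonical FALSE and TRUE terminals.
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
struct NodeId(usize);
const FALSE: NodeId = NodeId(0);
const TRUE: NodeId = NodeId(1);

// A decision node: branch on `var`, taking `low` when false and `high` when true.
#[derive(Clone, Copy, PartialEq, Eq, Hash)]
struct Node { var: usize, low: NodeId, high: NodeId }

struct Bdd {
    nodes: Vec<Node>,
    table: HashMap<Node, NodeId>, // hash-consing table: structural sharing
}

impl Bdd {
    fn new() -> Self {
        let sentinel = Node { var: usize::MAX, low: FALSE, high: FALSE };
        Bdd { nodes: vec![sentinel, sentinel], table: HashMap::new() }
    }

    fn mk(&mut self, var: usize, low: NodeId, high: NodeId) -> NodeId {
        if low == high { return low; } // redundant test: no node needed
        let node = Node { var, low, high };
        if let Some(&id) = self.table.get(&node) { return id; } // reuse canonical node
        let id = NodeId(self.nodes.len());
        self.nodes.push(node);
        self.table.insert(node, id);
        id
    }

    fn var(&mut self, var: usize) -> NodeId { self.mk(var, FALSE, TRUE) }
    fn not(&mut self, f: NodeId) -> NodeId { self.ite(f, FALSE, TRUE) }
    fn and(&mut self, f: NodeId, g: NodeId) -> NodeId { self.ite(f, g, FALSE) }
    fn or(&mut self, f: NodeId, g: NodeId) -> NodeId { self.ite(f, TRUE, g) }

    // If-then-else via Shannon expansion on the smallest variable; the result
    // is canonical by construction.
    fn ite(&mut self, f: NodeId, g: NodeId, h: NodeId) -> NodeId {
        if f == TRUE { return g; }
        if f == FALSE { return h; }
        if g == h { return g; }
        let var = [f, g, h]
            .iter()
            .filter(|id| id.0 > 1)
            .map(|id| self.nodes[id.0].var)
            .min()
            .unwrap();
        let (f0, f1) = self.cofactor(f, var);
        let (g0, g1) = self.cofactor(g, var);
        let (h0, h1) = self.cofactor(h, var);
        let low = self.ite(f0, g0, h0);
        let high = self.ite(f1, g1, h1);
        self.mk(var, low, high)
    }

    // Restrict `f` to `var = false` and `var = true`, respectively.
    fn cofactor(&self, f: NodeId, var: usize) -> (NodeId, NodeId) {
        if f.0 <= 1 { return (f, f); }
        let node = self.nodes[f.0];
        if node.var == var { (node.low, node.high) } else { (f, f) }
    }
}

fn main() {
    let mut bdd = Bdd::new();
    let a = bdd.var(0); // e.g. `os_name == 'nt'`
    let b = bdd.var(1); // e.g. `python_full_version >= '3.9'`

    // De Morgan: `!(a and b)` and `!a or !b` intern to the very same node.
    let ab = bdd.and(a, b);
    let lhs = bdd.not(ab);
    let na = bdd.not(a);
    let nb = bdd.not(b);
    let rhs = bdd.or(na, nb);
    assert_eq!(lhs, rhs);

    // Tautologies collapse to the canonical TRUE terminal.
    let taut = bdd.or(a, na);
    assert_eq!(taut, TRUE);
}
```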
The new representation gives us a lot more confidence in our marker
operations and simplification, which is proving to be very important
(see https://github.com/astral-sh/uv/pull/5733 and
https://github.com/astral-sh/uv/pull/5163).
Unfortunately, it is not easy to split this PR into multiple commits
because it is a large rewrite of the `marker` module. I'd suggest
reading through the `marker/algebra.rs`, `marker/simplify.rs`, and
`marker/tree.rs` files for the new implementation, as well as the
updated snapshots to verify how the new simplification rules work in
practice. However, a few other things were changed:
- [We now use release-only comparisons for `python_full_version`, where
we previously only did for
`python_version`](https://github.com/astral-sh/uv/blob/ibraheem/canonical-markers/crates/pep508-rs/src/marker/algebra.rs#L522).
I'm unsure how marker operations should work in the presence of
pre-release versions if we decide that this is incorrect.
- [Meaningless marker expressions are now
ignored](https://github.com/astral-sh/uv/blob/ibraheem/canonical-markers/crates/pep508-rs/src/marker/parse.rs#L502).
This means that a marker such as `'x' == 'x'` will always evaluate to
`true` (as if the expression did not exist), whereas we previously
treated this as always `false`. Its negation, however, remains `false`.
- [Unsatisfiable markers are written as `python_version <
'0'`](https://github.com/astral-sh/uv/blob/ibraheem/canonical-markers/crates/pep508-rs/src/marker/tree.rs#L1329).
- The `PubGrubSpecifier` type has been moved to the new `uv-pubgrub`
crate, shared by `pep508-rs` and `uv-resolver`. `pep508-rs` also depends
on the `pubgrub` crate for the `Range` type, we probably want to move
`pubgrub::Range` into a separate crate to break this, but I don't think
that should block this PR (cc @konstin).
There is still some remaining work here that I decided to leave for now
for the sake of unblocking some of the related work on the resolver.
- We still use `Option<MarkerTree>` throughout uv, which is unnecessary
now that `MarkerTree::TRUE` is canonical.
- The `MarkerTree` type is now interned globally and can potentially
implement `Copy`. However, it's unclear if we want to add more
information to marker trees that would make it `!Copy`. For example, we
may wish to attach extra and requires-python environment information to
avoid simplifying after construction.
- We don't currently combine `python_full_version` and `python_version`
markers.
- I also have not spent too much time investigating performance and
there is probably some low-hanging fruit. Many of the test cases I did
run actually saw large performance improvements due to the markers being
simplified internally, reducing the stress on the old `normalize`
routine, especially for the extremely large markers seen in
`transformers` and other projects.
Resolves https://github.com/astral-sh/uv/issues/5660,
https://github.com/astral-sh/uv/issues/5179.
## Summary
This PR adds a `DistExtension` field to some of our distribution types,
which requires that we validate that the file type is known and
supported when parsing (rather than when attempting to unzip). It
removes a bunch of extension parsing from the code too, in favor of
doing it once upfront.
Closes https://github.com/astral-sh/uv/issues/5858.
## Summary
I think this seems reasonable... Otherwise, we might not go back to PyPI
to revalidate the list of available versions despite the user passing
`--upgrade`.
## Summary
Previously, we wouldn't respect configuration files in directories
_above_ a workspace root. But this is somewhat problematic, because any
`pyproject.toml` will define a workspace root...
Instead, I think we should _start_ the search at the workspace root, but
go above it if necessary.
Closes: #5929.
See: https://github.com/astral-sh/uv/pull/4295.
## Summary
Resolves #5188. Most of the changes involve creating a new function in
`tool/common.rs` to contain the common functionality previously found in
`tool/install.rs`.
## Test Plan
`cargo test`
```console
❯ ./target/debug/uv tool upgrade black
warning: `uv tool upgrade` is experimental and may change without warning.
Resolved 6 packages in 25ms
Uninstalled 1 package in 3ms
Installed 1 package in 19ms
- black==23.1.0
+ black==24.4.2
Installed 2 executables: black, blackd
```
e.g.
```
❯ cargo run -- venv --no-system
Blocking waiting for file lock on build directory
Compiling uv v0.2.34 (/Users/zb/workspace/uv/crates/uv)
Finished `dev` profile [unoptimized + debuginfo] target(s) in 19.85s
Running `target/debug/uv venv --no-system`
warning: The `--no-system` flag has no effect, a system Python interpreter is always used in `uv venv`
Using Python 3.12.4 interpreter at: /opt/homebrew/opt/python@3.12/bin/python3.12
Creating virtualenv at: .venv
Activate with: source .venv/bin/activate
❯ cargo run -- venv --system
Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.15s
Running `target/debug/uv venv --system`
warning: The `--system` flag has no effect, a system Python interpreter is always used in `uv venv`
Using Python 3.12.4 interpreter at: /opt/homebrew/opt/python@3.12/bin/python3.12
Creating virtualenv at: .venv
Activate with: source .venv/bin/activate
```
## Summary
This _used_ to be true but we now require fetching metadata for all
distributions even with `--no-deps` since, e.g., we validate that any
declared extras exist.
## Summary
Initially, we showed _all_ resolver and installer output in `uv run` and
`uv tool run`, which was way too much for workhorse commands. Then,
we moved to showing _no_ output by default, which was way too little --
you had no idea why anything was happening, and commands appeared to
hang.
This PR adds a more nuanced middle-ground. With `--verbose`, we continue
to show everything. But by default, in `uv run` and `uv tool run`...
- During resolution, we show any "Building" and "Build" messages, if you
need to build a source distribution. But we don't show any other output.
(This _could_ be too little for expensive resolutions; we may want to
show a spinner.)
- If there are no changes to be made after resolving, we don't show any
other output.
- If we have to install, we show the progress bars for downloads (which
disappear on completion) followed by a single summary line stating the
number of packages installed.
This feels pretty good, in my limited testing. When everything is built
/ cached, you don't get _any_ additional output. When there's work to
do, you have a sense for what's happening, and we leave you with a
single summary line ("Installed X packages") at the end.
Closes https://github.com/astral-sh/uv/issues/5758.
## Test Plan
Notice that the first `tool run` ends with an install line; the second
shows no additional output:

If you run `uv run` in a package for the first time, we _do_ tell you
that we're building / built it:

But on the second run, there's no output:

If you add a `--with`, we'll show you all the installer progress bars
(which disappear once they're done), and then a single summary line:

Currently, the entry for a package+version+source table is called
`distribution`. That is incorrect: the `sdist` and `wheel` fields inside
of that table are distributions, while the table itself is for a package. We
also align ourselves closer with PEP 751.
I went through `lock.rs` and renamed all occurrences of "distribution"
that actually referred to a "package".
This change invalidates all existing lockfiles.
Bikeshedding: Do we call it `package` or `packages`? See also
https://github.com/python/peps/pull/3877
`package` is nice because it looks like a header:
```toml
[[package]]
name = "anyio"
version = "4.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "idna" },
{ name = "sniffio" },
]
sdist = { url = "3970183622d484d08e3285104333d3/anyio-4.3.0.tar.gz", hash = "sha256:f75253795a87df48568485fd18cdd2a3fa5c4f7c5be8e5e36637733fce06fed6", size = 159642 }
wheels = [
{ url = "2f20c40b45242c0b33774da0e2e34f/anyio-4.3.0-py3-none-any.whl", hash = "sha256:048e05d0f6caeed70d731f3db756d35dcc1f35747c8c403364a8332c630441b8", size = 85584 },
]
```
`packages` is nice because the field is not a single entry, but a list.
2/3 for https://github.com/astral-sh/uv/issues/4893
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
## Summary
Whenever we call `resolve`, we immediately call `fetch` after. And in
some cases `resolve` actually calls `fetch` internally. It seems a lot
simpler to just merge these into one method that returns a `Fetch`
(which itself contains the fully-resolved URL).
Closes https://github.com/astral-sh/uv/issues/5876.
There are three options that determine resolver behavior:
* resolution mode
* prerelease mode
* exclude newer
They are different from the other top level options: If they mismatch,
we recreate the resolution. To distinguish them from the rest of the
lockfile, we group them under an `[options]` header.
1/3 for #4893
Following #5869, the documentation has some less-than-helpful
suggestions to use `uv help python` for details — we should link to the
`uv python` section instead.
## Summary
We were dropping the query and fragment in the wrong place, so the URLs
didn't match up after resolving from an existing lockfile.
Closes https://github.com/astral-sh/uv/issues/5851.
## Summary
Very subtle bug. The scenario is as follows:
- We resolve: `elmer-circuitbuilder = { git =
"https://github.com/ElmerCSC/elmer_circuitbuilder.git" }`
- The user then changes the request to: `elmer-circuitbuilder = { git =
"https://github.com/ElmerCSC/elmer_circuitbuilder.git", rev =
"44d2f4b19d6837ea990c16f494bdf7543d57483d" }`
- When we go to re-lock, we note two facts:
1. The "default branch" resolves to
`44d2f4b19d6837ea990c16f494bdf7543d57483d`.
2. The metadata for `44d2f4b19d6837ea990c16f494bdf7543d57483d` is
(whatever we grab from the lockfile).
- In the resolver, we then ask for the metadata for
`44d2f4b19d6837ea990c16f494bdf7543d57483d`. It's already in the cache,
so we return it; thus, we never add the
`44d2f4b19d6837ea990c16f494bdf7543d57483d` ->
`44d2f4b19d6837ea990c16f494bdf7543d57483d` mapping to the Git resolver,
because we never have to resolve it.
This would apply for any case in which a requested tag or branch was
replaced by its precise SHA. Replacing with a different commit is fine.
It only applied to `tool.uv.sources`, and not PEP 508 URLs, because the
underlying issue is that we aren't consistent about "automatically"
extracting the precise commit from a Git reference.
Closes https://github.com/astral-sh/uv/issues/5860.
## Summary
This is an experimental PR to replace more unsafe calls with more Rust
while still trying to keep the binary size small enough. These changes
increase the size of the trampolines to roughly 40 KB. This is an
alternate PR to https://github.com/astral-sh/uv/pull/5751.
The primary changes here include
* Switch to use rust path components for ease of path management
* Leverage `std::process::exit` for process exit and cleanup
* Use `std::io::Error::last_os_error` for IO Errors to remove
`FormatMessage` complexity
* Use `std::env::current_exe` to get the current executable instead of
`GetModuleFileNameA`
## Test Plan
Added one more existing test case to trampoline tests.
Still need to verify whether `dunce::canonicalize` is desired on
`find_python_exe`.
---------
Co-authored-by: konstin <konstin@mailbox.org>
## Summary
We need to avoid using incompatible versions for build dependencies that
are also part of the resolved
environment. This is a very subtle issue, but: when locking, we don't
enforce platform
compatibility. So, if we reuse the resolver state to install, and the
install itself has to
perform a resolution (e.g., for the build dependencies of a source
distribution), that
resolution may choose incompatible versions.
The key property here is that there's a shared package between the build
dependencies and the
project dependencies.
Closes https://github.com/astral-sh/uv/issues/5836.
This already rejects `pyproject.toml`... but because the schema
validation is relaxed (we allow unknown fields, and all fields are
optional), a `pyproject.toml` doesn't get properly rejected here.
This PR makes the schema stricter, but in a safe way (by adding the
other `tool.uv` fields, like `workspace`, as any).
Closes #5832.
## Summary
We allow the use of (e.g.) `.whl.metadata` files when `--no-binary` is
enabled, so it makes sense that we'd also allow wheels to be
downloaded for metadata extraction. So now, we validate `--no-binary` at
install time, rather than metadata-fetch time.
Closes https://github.com/astral-sh/uv/issues/5699.
## Summary
This fixes a bug introduced by
https://github.com/astral-sh/uv/pull/5232. It turns out that the
`universal_disjoint_base_or_local_requirement` test does not actually do
what it was meant to because of the incorrect python requirement. With a
valid python requirement, it fails on `main`. The problem is that we try
to exclude the original base version from the range of allowed versions
to try and prefer local versions. However, in the test, there is a
branch that depends on the non-local version, with no applicable local
in its fork. We should remove this exclusion as prioritization is
handled by the candidate resolver.
I don't think this will save any time in serialization, but it should
save us some deserialization, since we only need to parse URLs for the
packages we use...
## Summary
Okay, I tested this against...
- Our public "private" proxy
- Fury
- AWS CodeArtifact
- Azure Artifacts
It took a long time.
All of them work as expected with this approach: we omit the credentials
from the lockfile, then wire them back up when the index URL is provided
during subsequent operations.
Closes https://github.com/astral-sh/uv/issues/5119.
Part of #4454
e.g.
```
$ uv add --help
Add one or more packages to the project requirements
Usage: uv add [OPTIONS] <REQUIREMENTS>...
Arguments:
<REQUIREMENTS>... The packages to add, as PEP 508 requirements (e.g., `ruff==0.5.0`)
Options:
--dev Add the requirements as development dependencies
--optional <OPTIONAL> Add the requirements to the specified optional dependency group
--no-editable Don't add the requirements as editables
--raw-sources Add source requirements to `project.dependencies`, rather than `tool.uv.sources`
--rev <REV> Specific commit to use when adding from Git
--tag <TAG> Tag to use when adding from git
--branch <BRANCH> Branch to use when adding from git
--extra <EXTRA> Extras to activate for the dependency; may be provided more than once
--locked Assert that the `uv.lock` will remain unchanged
--frozen Add the requirements without updating the `uv.lock` file
--package <PACKAGE> Add the dependency to a specific package in the workspace
-p, --python <PYTHON> The Python interpreter into which packages should be installed. [env: UV_PYTHON=]
Index options:
-i, --index-url <INDEX_URL> The URL of the Python package index (by default: <https://pypi.org/simple>) [env: UV_INDEX_URL=]
--extra-index-url <EXTRA_INDEX_URL> Extra URLs of package indexes to use, in addition to `--index-url` [env: UV_EXTRA_INDEX_URL=]
-f, --find-links <FIND_LINKS> Locations to search for candidate distributions, in addition to those found in the registry indexes
--no-index Ignore the registry index (e.g., PyPI), instead relying on direct URL dependencies and those provided via `--find-links`
--index-strategy <INDEX_STRATEGY> The strategy to use when resolving against multiple index URLs [env: UV_INDEX_STRATEGY=] [possible values: first-index, unsafe-first-match, unsafe-best-match]
--keyring-provider <KEYRING_PROVIDER> Attempt to use `keyring` for authentication for index URLs [env: UV_KEYRING_PROVIDER=] [possible values: disabled, subprocess]
Resolver options:
-U, --upgrade Allow package upgrades, ignoring pinned versions in any existing output file
-P, --upgrade-package <UPGRADE_PACKAGE> Allow upgrades for a specific package, ignoring pinned versions in any existing output file
--resolution <RESOLUTION> The strategy to use when selecting between the different compatible versions for a given package requirement [env: UV_RESOLUTION=] [possible values: highest, lowest, lowest-direct]
--prerelease <PRERELEASE> The strategy to use when considering pre-release versions [env: UV_PRERELEASE=] [possible values: disallow, allow, if-necessary, explicit, if-necessary-or-explicit]
--exclude-newer <EXCLUDE_NEWER> Limit candidate packages to those that were uploaded prior to the given date [env: UV_EXCLUDE_NEWER=]
Installer options:
--reinstall Reinstall all packages, regardless of whether they're already installed. Implies `--refresh`
--reinstall-package <REINSTALL_PACKAGE> Reinstall a specific package, regardless of whether it's already installed. Implies `--refresh-package`
--link-mode <LINK_MODE> The method to use when installing packages from the global cache [env: UV_LINK_MODE=] [possible values: clone, copy, hardlink, symlink]
--compile-bytecode Compile Python files to bytecode after installation
Build options:
-C, --config-setting <CONFIG_SETTING> Settings to pass to the PEP 517 build backend, specified as `KEY=VALUE` pairs
--no-build Don't build source distributions
--no-build-package <NO_BUILD_PACKAGE> Don't build source distributions for a specific package
--no-binary Don't install pre-built wheels
--no-binary-package <NO_BINARY_PACKAGE> Don't install pre-built wheels for a specific package
Cache options:
-n, --no-cache Avoid reading from or writing to the cache, instead using a temporary directory for the duration of the operation [env: UV_NO_CACHE=]
--cache-dir <CACHE_DIR> Path to the cache directory [env: UV_CACHE_DIR=]
--refresh Refresh all cached data
--refresh-package <REFRESH_PACKAGE> Refresh cached data for a specific package
Python options:
--python-preference <PYTHON_PREFERENCE> Whether to prefer using Python installations that are already present on the system, or those that are downloaded and installed by uv [possible values: only-managed, managed, system, only-system]
--python-fetch <PYTHON_FETCH> Whether to automatically download Python when required [possible values: automatic, manual]
Global options:
-q, --quiet Do not print any output
-v, --verbose... Use verbose output
--color <COLOR_CHOICE> Control colors in output [default: auto] [possible values: auto, always, never]
--native-tls Whether to load TLS certificates from the platform's native certificate store [env: UV_NATIVE_TLS=]
--offline Disable network access, relying only on locally cached data and locally available files
--no-progress Hides all progress outputs when set
--config-file <CONFIG_FILE> The path to a `uv.toml` file to use for configuration [env: UV_CONFIG_FILE=]
--no-config Avoid discovering configuration files (`pyproject.toml`, `uv.toml`) in the current directory, parent directories, or user configuration directories [env: UV_NO_CONFIG=]
-h, --help Print help
-V, --version Print version
Use `uv help add` for more details.
```
## Summary
`uv tree` will now filter to the current platform by default. You can
pass `--universal` to show the entire tree.
Closes https://github.com/astral-sh/uv/issues/5760.
## Summary
It's fine for this to be in the cache, I think, since we don't
necessarily need to colocate it with the Python directory.
Closes https://github.com/astral-sh/uv/issues/5747.
## Summary
After referring to https://github.com/astral-sh/uv/pull/5637 and doing
additional testing.
In a stable state, ``only-system`` seems like the more reasonable
default; ``managed`` in preview.
```
cpython-3.11.9-windows-x86_64-none C:\Users\name\AppData\Local\Programs\Python\Python311\python.exe
cpython-3.10.14-windows-x86_64-none C:\Users\name\AppData\Roaming\uv\data\python\cpython-3.10.14-windows-x86_64-none\install\python.exe
cpython-3.10.11-windows-x86_64-none C:\Users\name\AppData\Local\Programs\Python\Python310\python.exe
cpython-3.9.19-windows-x86_64-none C:\Users\name\AppData\Roaming\uv\data\python\cpython-3.9.19-windows-x86_64-none\python.exe
```
Tested on uv 0.2.33 (built from
257007ccaf)
### Stable version
``uv venv -p 3.10`` is ``3.10.11`` (system Python)
``uv venv -p 3.9`` is ``No interpreter found`` (3.9.19 exists as managed Python)
``uv venv -p 3.9 --python-preference only-system`` is ``No interpreter found`` (fails)
``uv venv -p 3.9 --python-preference only-managed`` is ``3.9.19`` (succeeds)
The stable version does not use managed Python, only the system Python,
so its default can be considered ``only-system``.
### Preview mode
**Note:** ``3.10.14`` is managed Python, ``3.10.11`` is system Python.
``uv venv -p 3.11 --preview`` is ``3.11.9`` (system Python)
``uv venv -p 3.10 --preview`` is ``3.10.14``
``uv venv -p 3.10 --preview --python-preference only-managed`` is ``3.10.14``
``uv venv -p 3.10 --preview --python-preference managed`` is ``3.10.14``
``uv venv -p 3.10 --preview --python-preference system`` is ``3.10.11``
``uv venv -p 3.10 --preview --python-preference only-system`` is ``3.10.11``
Preview mode prioritizes managed Python and then selects system Python,
so its default can be considered ``managed``.
-----
Fixes #5754
## Test Plan
Ran the website locally.

## Summary
Right now, if you have a `requirements.txt` with a pre-release, but the
`requirements.in` does not have a pre-release marker for that dependency,
we drop the pre-release. (In the selector, we end up returning
`AllowPrerelease::IfNecessary`, the default.)
I played with a few ways of solving this... The first was to remove that
guard altogether. But if we do that,
`universal_transitive_disjoint_prerelease_requirement` fails (we use
`1.17.0rc1` in both forks, when it should only apply to one of the two).
The second was to do that, but also avoid pushing pre-releases as
preferences when we solve a fork. But then
`universal_disjoint_prereleases` fails, because we return a different
pre-release in each fork.
Finally, I settled on allowing existing pre-releases in forks if they
have no markers on them, i.e., they are "global" preferences. I believe
this is true IFF the preference came from an existing lockfile.
Closes https://github.com/astral-sh/uv/issues/5729.
## Summary
If _both_ nodes in the derivation tree are proxies, we need to remove
the _entire_ node. So, the function now returns an `Option<Tree>`
instead of taking `&mut Tree`.
Closes https://github.com/astral-sh/uv/issues/5618.
To enforce the 100 character line limit in markdown files introduced in
https://github.com/astral-sh/uv/pull/5635, and to automate the
formatting of markdown files, I've added prettier and formatted our
markdown files with it.
I've excluded the changelog and the generated references documentation
from this for having too many changes, but we can also include them.
I'm not particular on which style we use. My main motivations are
(major) not having to reflow markdown files myself anymore and (minor)
consistence between all markdown files. I've chosen prettier for similar
reason as we chose black, it's a single good style that's automated and
shared in the community. I do prefer prettier's style of not breaking
inside of a link name though.
This PR is in two parts: the first adds prettier to CI and documents
using it, while the second actually formats the docs. When merge
conflicts arise, we can drop the last commit and regenerate it with `npx
prettier --prose-wrap always --write BENCHMARKS.md CONTRIBUTING.md
README.md STYLE.md docs/*.md docs/concepts/**/*.md docs/guides/**/*.md
docs/pip/**/*.md`.
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
Partially resolves #5561. Haven't added overrides support yet, but I can
add it tomorrow if the current approach for constraints is ok.
## Test Plan
`cargo test`
Manually checked trace logs after changing the constraints.
## Summary
Gives you a nice error message if you attempt to sync with, e.g., `-p
3.8` when that version is supported by at least one workspace member,
but your project's minimum requirement is `>=3.12`
Closes https://github.com/astral-sh/uv/issues/5662.
## Summary
I think it's reasonable to only sync the affected group, e.g., `uv add`
on its own should not require syncing all extras.
Closes https://github.com/astral-sh/uv/issues/4418.
Part of #4454
e.g. for `uv help pip compile`
```
Python options:
--python <PYTHON>
The Python interpreter against which to compile the requirements.
By default, uv uses the virtual environment in the current working directory or any parent
directory, falling back to searching for a Python executable in `PATH`. The `--python`
option allows you to specify a different interpreter.
Supported formats:
- `3.10` looks for an installed Python 3.10 using `py --list-paths` on Windows, or
`python3.10` on Linux and macOS.
- `python3.10` or `python.exe` looks for a binary with the given name in `PATH`.
- `/home/ferris/.local/bin/python3.10` uses the exact Python at the given path.
-p, --python-version <PYTHON_VERSION>
The minimum Python version that should be supported by the resolved requirements (e.g., `3.8` or `3.8.17`).
If a patch version is omitted, the minimum patch version is assumed. For example, `3.8` is mapped to `3.8.0`.
--python-preference <PYTHON_PREFERENCE>
Whether to prefer using Python installations that are already present on the system, or those that are downloaded and installed by uv
Possible values:
- only-managed: Only use managed Python installations; never use system Python installations
- managed: Prefer managed Python installations over system Python installations
- system: Prefer system Python installations over managed Python installations
- only-system: Only use system Python installations; never use managed Python installations
--python-fetch <PYTHON_FETCH>
Whether to automatically download Python when required
Possible values:
- automatic: Automatically fetch managed Python installations when needed
- manual: Do not automatically fetch managed Python installations; require explicit installation
```
## Summary
In #5494, I made breaking changes to the tool receipt format. This would
break existing tools for all users. This PR makes the change
backwards-compatible by supporting deserialization for the deprecated
format.
Closes https://github.com/astral-sh/uv/issues/5680.
## Test Plan
Beyond the automated tests, you can run `cargo run tool list` on your
existing machine.
Before:
```
warning: `uv tool list` is experimental and may change without warning
warning: Ignoring malformed tool `black` (run `uv tool uninstall black` to remove)
warning: Ignoring malformed tool `poetry` (run `uv tool uninstall poetry` to remove)
warning: Ignoring malformed tool `ruff` (run `uv tool uninstall ruff` to remove)
```
After:
```
warning: `uv tool list` is experimental and may change without warning
black v0.1.0
- black
poetry v1.8.3
- poetry
ruff v0.0.60
- ruff
```
## Summary
When we add a new optional group in `uv add`, we need to update the
`pyproject.toml` before locking. Otherwise, we use the stale
`pyproject.toml` and omit the optional group.
Closes https://github.com/astral-sh/uv/issues/5687.
## Summary
Fixes a bug in #5494. The `RequirementSourceWire` representation was
ambiguous, and so the order of the fields meant that all variants were
mapped to `Registry` when deserializing. (So the snapshots were right,
but behaviors were wrong.)
## Summary
This could still be made more robust, but it's not critical, since you
can always `--force`. It's good to handle this case, though, since we
have an explicit error for it.
Closes https://github.com/astral-sh/uv/issues/5490.
It transpires that detecting the directory a script was sourced from is
non-trivial across `bash`, `ksh` and `zsh`.
The previous version was a one-liner and supported `bash` and `zsh` but
not `ksh`.
It is possible to keep the one-liner and add `ksh` support, but that is
mutually-exclusive with `zsh`.
Therefore, the only way to square this circle is to add an `if` block. A
silver lining here is that although longer, the script is probably
easier to follow as there is less code-golfing going on.
## Summary
The current receipt doesn't capture quite enough information. For
example, it doesn't differentiate between editable and non-editable
requirements. This PR instead uses the full `Requirement` type. I think
we should use a custom representation like we do in the lockfile, but
I'm just using the default representation to demonstrate the idea.
## Summary
As-is, if you have a workspace with mixed `requires-python`
requirements, resolution will _never_ succeed, since we'll use the union
as the `requires-python` bound (i.e., take the lowest value), and fail
when we see a package that only supports some narrower range.
This PR modifies the behavior to take the intersection (i.e., the
highest value), so if you have one package that supports Python 3.12 and
later, and another that supports Python 3.8 and later, we lock for
Python 3.12. If you try to sync or run with Python 3.8, we raise an
error, since the lockfile will be incompatible with that request.
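A toy illustration (plain version tuples rather than uv's specifier types) of why the union of the lower bounds was wrong and the intersection is used instead:
```rust
fn main() {
    // Lower bounds of `requires-python` for two workspace members,
    // as (major, minor) tuples: >=3.8 and >=3.12.
    let members = [(3, 8), (3, 12)];

    // Old behavior: the union of the ranges, i.e. the *lowest* lower bound.
    let union_bound = *members.iter().min().unwrap();
    // New behavior: the intersection, i.e. the *highest* lower bound.
    let intersection_bound = *members.iter().max().unwrap();

    // The union would try to lock for 3.8+ and fail on the 3.12-only member.
    assert_eq!(union_bound, (3, 8));
    // The intersection locks for 3.12+; syncing or running with an older
    // interpreter raises an error instead.
    assert_eq!(intersection_bound, (3, 12));
}
```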
Konsti has a write-up in https://github.com/astral-sh/uv/issues/5594
that outlines what could be a longer-term strategy.
Closes https://github.com/astral-sh/uv/issues/5578.
By resolving each fork from the lockfile individually and by using
preferences for the current fork, we solve the instability in #5180.
I've tested this locally and will add the packse test scenarios upstack.
Part of
https://github.com/astral-sh/uv/issues/5180#issuecomment-2247696198
The comment in the code explains the bulk of this:
```rust
// We previously computed this heuristic freshness lifetime by
// looking at the difference between the last modified header and
// the response's date header. We then asserted that the cached
// response ought to be "fresh" for 10% of that interval.
//
// It turns out that this can result in very long freshness
// lifetimes[1] that lead to uv caching too aggressively.
//
// Since PyPI sets a max-age of 600 seconds and since we're
// principally just interacting with Python package indices here,
// we just assume a freshness lifetime equal to what PyPI has.
//
// Note though that a better solution here is for the index to
// support proper HTTP caching headers (ideally Cache-Control, but
// Expires also works too, as above).
```
We also remove the `heuristic_percent` field on `CacheConfig`. Since
that's actually part of the cache itself, we bump the simple cache
version.
Finally, we add some more `trace!` calls that should hopefully make
diagnosing issues related to the freshness lifetime a bit easier in the
future.
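As a rough sketch of the resulting behavior (the helper and constant names are illustrative, not uv's actual cache layer):
```rust
use std::time::Duration;

// Fixed fallback freshness lifetime, matching PyPI's `max-age=600`.
const DEFAULT_FRESHNESS_LIFETIME: Duration = Duration::from_secs(600);

fn freshness_lifetime(max_age_header: Option<Duration>) -> Duration {
    // Prefer an explicit `Cache-Control: max-age` if the index sent one;
    // otherwise assume the PyPI-like default instead of a heuristic based
    // on `Last-Modified`.
    max_age_header.unwrap_or(DEFAULT_FRESHNESS_LIFETIME)
}

fn main() {
    assert_eq!(freshness_lifetime(None), Duration::from_secs(600));
    assert_eq!(
        freshness_lifetime(Some(Duration::from_secs(60))),
        Duration::from_secs(60)
    );
}
```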
Fixes #5351
## Summary
Given a fork like:
```
pylint < 3 ; sys_platform == 'darwin'
pylint > 2 ; sys_platform != 'darwin'
```
Solving the top branch will typically yield a solution that also
satisfies the bottom branch, due to maximum version selection (while the
inverse isn't true).
To quote an example from the docs:
```rust
// If there's no difference, prioritize forks with upper bounds. We'd prefer to solve
// `numpy <= 2` before solving `numpy >= 1`, since the resolution produced by the former
// might work for the latter, but the inverse is unlikely to be true due to maximum
// version selection. (Selecting `numpy==2.0.0` would satisfy both forks, but selecting
// the latest `numpy` would not.)
```
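A minimal sketch of that ordering, with an illustrative `Fork` type rather than uv's:
```rust
use std::cmp::Ordering;

#[derive(Debug)]
struct Fork {
    requirement: &'static str,
    has_upper_bound: bool,
}

// Forks whose requirement carries an upper bound sort first, since their
// solution is more likely to also satisfy the lower-bounded fork under
// maximum version selection.
fn fork_priority(a: &Fork, b: &Fork) -> Ordering {
    b.has_upper_bound.cmp(&a.has_upper_bound)
}

fn main() {
    let mut forks = vec![
        Fork { requirement: "pylint > 2 ; sys_platform != 'darwin'", has_upper_bound: false },
        Fork { requirement: "pylint < 3 ; sys_platform == 'darwin'", has_upper_bound: true },
    ];
    forks.sort_by(fork_priority);
    assert_eq!(forks[0].requirement, "pylint < 3 ; sys_platform == 'darwin'");
}
```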
Closes https://github.com/astral-sh/uv/issues/4926 for now.
## Summary
First part of: https://github.com/astral-sh/uv/issues/4926. We should
solve forks that _don't_ expand the world of supported versions (e.g.,
`python_version >= '3.11'` enables us to select new packages, since we
narrow the supported version range).
## Summary
`uv venv` should adopt the Python version specified by `requires-python` in
`pyproject.toml`. This allows the venv setup to be customized when syncing
from a Python project.
Closes https://github.com/astral-sh/uv/issues/5552.
It also serves as a workaround to close
https://github.com/astral-sh/uv/issues/5258.
## Test Plan
1. Run `uv venv` in a folder with a `pyproject.toml` specifying
`requires-python = "<3.10"`. Python 3.9 is selected for the venv.
2. Change to `requires-python = "<3.11"` and run `uv venv` again. Python
3.10 is selected now.
3. Switch to a folder without `pyproject.toml` then run `uv venv`.
Python 3.12 is selected now.
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
Co-authored-by: Zanie Blue <contact@zanie.dev>
Collapses the previous default into "managed" and makes the "managed"
behavior match "installed". People should use "only-managed" if they
want that behavior; it seems overly complicated otherwise.
## Summary
This PR deprecates the `--isolated` flag. The treatment varies across
the APIs:
- For non-preview APIs, we warn but treat it as equivalent to
`--no-config`.
- For preview APIs, we warn and ignore it, with two exceptions...
- For `tool run` and `run` specifically, we don't even warn, because we
can't differentiate the command-specific `--isolated` from the global
`--isolated`.
## Summary
The culmination of #4730. We now have `uv run --isolated` which always
uses a fresh environment (but includes the workspace dependencies as
needed). This enables you to test with strict isolation (e.g., `uv run
--isolated -p foo` will ensure that `foo` is unable to import anything
that isn't an actual dependency).
Closes #5430.
## Summary
This PR gets rid of the global `--isolated` flag (which serves a bunch
of independent responsibilities right now) on `uv tool run` in favor of
a dedicated `--isolated` flag, which tells uv to avoid re-using an
existing tool environment for this invocation. We'll add the same thing
to `uv run`, to avoid using the base project environment.
This will become a bit clearer in #5466, when we deprecate the
`--isolated` flag on the preview APIs.
## Summary
The idea here is that we hide all resolver output (the grayed out
resolver messages, plus the list of environment modifications) by
default in `uv run` and `uv tool run`. You can pass `--show-resolution`
to re-enable them.
Closes https://github.com/astral-sh/uv/issues/5458.
## Summary
Right now, `--isolated` is read from `uv run` and `uv init` to avoid
discovering the current workspace (or project). This PR moves that
behavior to a dedicated `--no-workspace` flag for `uv init`, and
`--no-project` for `uv run`. They could use the same flag, but
`--no-project` feels confusing for `uv init`, and `--no-workspace` seems
confusing for `uv run` (especially so once you read the documentation,
where we refer to the thing you're omitting as the project).
Closes https://github.com/astral-sh/uv/issues/5429.
This PR represents a different approach to marker propagation in an
attempt to unblock #4640. In particular, instead of propagating markers
when forks are created, we wait until resolution is complete to
propagate all markers to all dependencies in each fork. This ends up
being both more robust (we should never miss anything) and simpler to
implement because it doesn't require mutating a `PubGrubPackage` (which
was pretty annoying). I think the main downside here is that this can
sometimes add markers where they aren't needed.
This actually winds up making quite a few snapshot changes. I went
through each of them. Some of them look like legitimate bug fixes. Some
of them look like superfluous additions. And some of them look like they
would be removed if we had perfect marker normalization. But I don't
think any of the changes are _wrong_.
## Summary
If we just created an entrypoint script, we can of course set the
permissions (we just created it). However, if we're copying from the
cache, we might _not_ own the file. In that case, if we need to change
the permissions (we shouldn't, since the script is likely already
executable -- we set the permissions when we unzip, but I guess they
could _not_ be properly set in the zip itself), we have to copy it.
Closes https://github.com/astral-sh/uv/issues/5581.
## Summary
Use the `windows-sys` bindings maintained by Microsoft developers; `winapi`
hasn't had any updates for more than 3 years.
## Test Plan
`cargo test`. It failed locally because I don't have Python 3.12 installed.
## Summary
`uv run --directory <path>` means that one doesn't have to change to a
project's directory to run programs from it. It makes it possible to use
projects as if they were tool installations.
To support this, the code reading `.python-version` was first updated so
that it can read such markers outside the current directory. Note the minor
change this causes (if I'm right), described in the commit.
## Test Plan
One test has been added.
## --directory
Not sure what the name of the argument should be, but it follows `uv sync`'s
`--directory` for now.
Other alternatives could be `--project`. `uv run` and `uv tool run` should
probably find common agreement on this (relevant for project-locked tools).
I implemented this same change in Rye some time ago, and there we went with
`--pyproject <path to pyproject.toml file>`. I think using a
`pyproject.toml` file path rather than a directory was probably a mistake:
an overgeneralization one doesn't need.
Every packse version update is currently causing a huge diff (the size
of the `lock_scenarios.rs` diff in this PR). By redacting the version
from the snapshots, we will only have the actual change in the diff and
not the redundant version change noise.
The second commit moves all remaining packse url arg values to
`common/mod.rs`, which acts as a single source of truth for the packse
version.
## Summary
The package was being installed as editable, but it wasn't marked as
such in `uv pip list`, as the `direct-url.json` was wrong.
Closes https://github.com/astral-sh/uv/issues/5543.
## Summary
The idea here is similar to what we do for wheels: we create the
`CachedEnvironment` in the `archive-v0` bucket, then symlink it to its
content-addressed location. This ensures that we can always recreate
these environments without concern for whether anyone else is accessing
them.
Part of the challenge here is that we want the virtual environments to
be relocatable, because we're now building them in one location but
persisting them in another. This requires that we write relative (rather
than absolute) paths to scripts and entrypoints. The main risk with
relocatable virtual environments is that the scripts and entrypoints
_themselves_ are not relocatable, because they use a relative shebang.
But that's fine for cached environments, which are never intended to
leave the cache.
Closes https://github.com/astral-sh/uv/issues/5503.
## Summary
Adds a `--relocatable` CLI arg to `uv venv`. This flag does two things:
* ensures that the associated activation scripts do not rely on a hardcoded absolute path to the virtual environment (to the extent possible; `.csh` and `.nu` left as-is)
* persists a `relocatable` flag in `pyvenv.cfg`.
The flag in `pyvenv.cfg` in turn instructs the wheel `Installer` to create
script entrypoints in a relocatable way (use the `exec` trick + `dirname $0`
on POSIX; use a relative path to `python[w].exe` on Windows).
Fixes: #3863
## Test Plan
* Relocatable console scripts covered as additional scenarios in
existing test cases.
* Integration testing of boilerplate generation in `venv`.
* Manual testing of `uv venv` with and without `--relocatable`
## Summary
This PR adds support for `uv lock` and `uv sync` in the standardized
benchmarks script.
Part of: https://github.com/astral-sh/uv/issues/5263.
## Test Plan
For example:
```sh
python scripts/bench/__main__.py --uv-project --benchmark resolve-cold ./scripts/requirements/trio.in --verbose
```
## Summary
Closes #2187.
The [xz
backdoor](https://gist.github.com/thesamesam/223949d5a074ebc3dce9ee78baad9e27)
is still fairly recent, but luckily the [Rust `xz2` crate bundles
version 5.2.5 of the C `xz`
package](https://github.com/alexcrichton/xz2-rs/tree/main/lzma-sys),
which is before the backdoor was introduced.
It's worth noting that a security risk still exists if you have a
compromised version of `xz` installed on your system, but that risk is
not introduced by `uv` or the Rust packages in general.
## Test Plan
Tried installing the package mentioned in the linked issue: `python-apt
@
https://launchpad.net/ubuntu/+archive/primary/+sourcefiles/python-apt/2.7.6/python-apt_2.7.6.tar.xz`
(Note that this will only work on Ubuntu - I tried on a Mac and while
the archive was extracted properly, the package did not install because
of some missing files)
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
## Summary
I think it makes no sense to allow `--editable` and `--exclude-editable`
at the same time.
## Test Plan
```console
$ cargo run -- pip list --editable --exclude-editable
error: the argument '--editable' cannot be used with '--exclude-editable'
Usage: uv.exe pip list --editable
For more information, try '--help'.
```
Consider the following packse scenario:
```toml
[root]
requires = [
"a>=1.0.0 ; python_version < '3.10'",
"a>=1.1.0 ; python_version >= '3.10'",
"a>=1.2.0 ; python_version >= '3.11'",
]
[packages.a.versions."1.0.0"]
[packages.a.versions."1.1.0"]
[packages.a.versions."1.2.0"]
```
On current `main`, this produces a dependency on `a` that looks like
this:
```toml
dependencies = [
{ name = "fork-overlapping-markers-basic-a", marker = "python_version < '3.10' or python_version >= '3.11'" },
]
```
But the marker expression is clearly wrong here, since it implies that
`a` isn't installed at all for Python 3.10. With this PR, the above
dependency becomes:
```toml
dependencies = [
{ name = "fork-overlapping-markers-basic-a" },
]
```
That is, it's unconditional, which I believe is correct here since there
aren't any other constraints on which version to select.
The specific bug here is that when we found overlapping dependency
specifications for the same package *within* a pre-existing fork, we
intersected all of their marker expressions instead of unioning them.
That in turn resulted in incorrect marker expressions.
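To illustrate the fix, here is a minimal sketch that models markers as sets of matching environments (purely illustrative; uv uses `MarkerTree`):
```rust
use std::collections::BTreeSet;

// Overlapping specifications for the same package within one fork must be
// unioned, not intersected.
fn combine<'a>(specs: &[BTreeSet<&'a str>]) -> BTreeSet<&'a str> {
    specs.iter().flatten().copied().collect()
}

fn main() {
    // `a ; python_version < '3.10'`, `a ; python_version >= '3.10'`, and
    // `a ; python_version >= '3.11'`, modeled over three environments.
    let specs = vec![
        BTreeSet::from(["py39"]),
        BTreeSet::from(["py310", "py311"]),
        BTreeSet::from(["py311"]),
    ];
    // The union covers every environment, i.e. the dependency is
    // unconditional; intersecting instead loses environments (here, the
    // intersection is empty).
    assert_eq!(combine(&specs), BTreeSet::from(["py39", "py310", "py311"]));
}
```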
While this doesn't fix any known bug on the issue tracker (like #4640),
it does appear to fix a couple of our snapshot tests. And fixes a basic
test case I came up with while working on #4732.
For the packse scenario test: https://github.com/astral-sh/packse/pull/206
When a fork occurs, we divide not just the dependencies that
provoked a fork into distinct groups, but we also add the
corresponding sibling dependencies to each fork. Previously,
while we track markers on the fork itself, the individual
dependencies that had markers only corresponded to markers
written from the dependency specification.
This meant that the sibling dependencies that got added to
each fork would not themselves have markers attached to them.
This in turn meant they would not have markers associated with
them in the lock file.
In many cases, this is actually okay, because the resolver will
pick a version that is "universal" across all forks in most
cases. But in some cases, this just simply isn't possible as
the marker expressions in the fork can and do influence resolution.
In which case, it is possible for the same package with different
versions to show up in the lock file unconditionally. Which is a
big no-no.
So in this commit, after we determine the forks, we intersect the
markers on each fork with each of its dependencies.
This does seem to balloon the marker expressions in some cases.
I plucked one low hanging fruit to avoid doing `x and x` in
trivial cases. (And this eliminated a portion of the snapshot
diffs.) But some pretty gnarly diffs remain.
This commit also fixes another bug: previously, when we created a fork
to capture the "remaining" universe of an incomplete set of markers, we
left out dependencies that should be included in that fork. We rectify
that here.
Fixes #5086
Partially addresses #4732
Interestingly, the empty string appears to be valid for these
types. I'm not sure if that's intended, but having a Default
impl is useful for use with `std::mem::take`.
Basically, and'ing or or'ing the same expression can be entirely
skipped. And we try harder to avoid singleton conjunctions or
disjunctions, as these are considered unequal otherwise. (Thus
defeating our attempts to avoid and'ing or or'ing a superfluous
marker.)
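A minimal sketch of the `x and x` simplification, with an illustrative marker type rather than uv's `MarkerTree`:
```rust
#[derive(Clone, Debug, PartialEq, Eq)]
enum Marker {
    Expr(String),
    And(Vec<Marker>),
}

// `x and x` collapses to `x`, returned bare rather than wrapped in a
// singleton `And` (which would compare unequal to the plain expression).
fn and(left: Marker, right: Marker) -> Marker {
    if left == right {
        return left;
    }
    Marker::And(vec![left, right])
}

fn main() {
    let x = Marker::Expr("sys_platform == 'linux'".to_string());
    assert_eq!(and(x.clone(), x.clone()), x);
    assert!(matches!(
        and(x.clone(), Marker::Expr("os_name == 'posix'".to_string())),
        Marker::And(_)
    ));
}
```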
## Summary
If you have an executable path on a network share path (like
`\\some-host\some-share\...\python.exe`), canonicalizing it adds the
`\\?\` prefix, but dunce cannot safely strip it.
This PR changes the Windows logic to avoid canonicalizing altogether. We
don't really expect symlinks on Windows, so it seems unimportant to
resolve them.
Closes: https://github.com/astral-sh/uv/issues/5440.
## Summary
After consultation with @carljm, we learned that modifying `PYTHONPATH`
is insufficient, because Python won't resolve `.pth` files (editables)
in the base environment. We also saw in
https://github.com/astral-sh/uv/issues/5459 that continuously appending
to `PYTHONPATH` can have some unintended effects.
This PR instead uses a `sitecustomize.py` in the ephemeral environment
to add the base environment's `site-packages`.
Closes https://github.com/astral-sh/uv/issues/5459.
It looks like we had a bad merge where the result caused some test
failures. This commit just updates the snapshots to the new reality. I
haven't found the root cause of the bad merge yet.
## Summary
It's hard for me to imagine a scenario in which a user passed
`--reinstall`, but wanted us to keep respecting cached data for a
package. For example, to actually "rebuild and reinstall" an editable
today, you have to pass both `--reinstall` and `--refresh`.
This PR makes `--reinstall` imply `--refresh`, so we always validate
that the cached data is fresh.
Closes https://github.com/astral-sh/uv/issues/5424.
## Summary
This PR re-introduces caching for source trees. In short, we treat the
metadata as cached unless the `pyproject.toml`, `setup.py`, or
`setup.cfg` file changes. This is a heuristic and not a good one,
especially for extension modules, but without it, we have to rebuild
every project every time (unless you have static metadata, like a
`pyproject.toml` that we can read directly).
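A rough sketch of what such a freshness check could look like, assuming file modification times as the fingerprint (illustrative only; uv's cache layer may fingerprint differently):
```rust
use std::fs;
use std::io;
use std::path::Path;
use std::time::SystemTime;

// Treat cached source-tree metadata as fresh unless one of the build files
// changed after the cache entry was written.
fn metadata_is_fresh(source_tree: &Path, cached_at: SystemTime) -> io::Result<bool> {
    for name in ["pyproject.toml", "setup.py", "setup.cfg"] {
        let path = source_tree.join(name);
        if let Ok(metadata) = fs::metadata(&path) {
            if metadata.modified()? > cached_at {
                // A build file changed after the cache entry: rebuild.
                return Ok(false);
            }
        }
    }
    Ok(true)
}

fn main() -> io::Result<()> {
    let fresh = metadata_is_fresh(Path::new("."), SystemTime::now())?;
    println!("source-tree metadata fresh: {fresh}");
    Ok(())
}
```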
Now that we support persistent configuration, users should add:
```toml
[tool.uv]
reinstall = ["foo"]
```
if they want a package to always be refreshed (ignore the cache) and
reinstalled (ignore the environment).
Closes https://github.com/astral-sh/uv/issues/5420.
Two changes split out from the instability work:
* Break `ResolutionGraph::from_state` into methods before adding new
logic to it.
* `ResolutionGraph`: Convert `NodeKey` type to `PackageRef` struct:
Another small refactoring to make subsequent changes easier.
No functional changes.
@BurntSushi I hope this doesn't interfere with your work too much; the
`PackageRef` should at least make debugging panics here easier.
In preparation for the preferences changes with forking, change the
method structure in `CandidateSelector`. Split out into its own PR to
avoid merge conflicts with main. No functional changes.
Consider these requirements from pylint 3.2.5:
```
Requires-Dist: dill >=0.3.6 ; python_version >= "3.11"
Requires-Dist: dill >=0.3.7 ; python_version >= "3.12"
```
We will split on the python version, but then we may pick a version of
`dill` that's `>=0.3.7` in both branches and also have an otherwise
identical resolution in both forks. In this case, we merge both forks
and store only their conjoined markers.
## Summary
Normalize the order of marker expressions on construction. This removes
the distinction between expressions like `os_name == 'Linux'` vs.
`'Linux' == os_name` throughout the codebase. One caveat here is that
the `in` operator does not have a direct inverse, so we introduce
`MarkerOperator::Contains` to handle that case.
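A minimal sketch of the normalization, with illustrative operator and key names rather than uv's `MarkerOperator`:
```rust
// Illustrative subset of marker operators; `<`/`>` etc. flip analogously.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Op {
    Equal,
    In,
    Contains,
}

// Flip an operator so the comparison still holds after swapping operands.
// `in` has no direct inverse, hence the extra `Contains` variant.
fn inverse(op: Op) -> Op {
    match op {
        Op::Equal => Op::Equal,
        Op::In => Op::Contains,
        Op::Contains => Op::In,
    }
}

// Normalize so the marker key is always on the left-hand side.
fn normalize<'a>(lhs: &'a str, op: Op, rhs: &'a str) -> (&'a str, Op, &'a str) {
    const KEYS: &[&str] = &["os_name", "sys_platform", "python_version"];
    if KEYS.contains(&rhs) && !KEYS.contains(&lhs) {
        (rhs, inverse(op), lhs)
    } else {
        (lhs, op, rhs)
    }
}

fn main() {
    // `'Linux' == os_name` becomes `os_name == 'Linux'`.
    assert_eq!(
        normalize("Linux", Op::Equal, "os_name"),
        ("os_name", Op::Equal, "Linux")
    );
    // `'linux' in sys_platform` becomes `sys_platform contains 'linux'`.
    assert_eq!(
        normalize("linux", Op::In, "sys_platform"),
        ("sys_platform", Op::Contains, "linux")
    );
}
```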
I wanted to land this smaller change before some more intrusive changes
as it simplifies the existing code quite a bit.
## Summary
Prior to this change, the resolver would panic if we ran with
`--offline` and `--no-deps` and we had cached metadata for a _package_
(i.e., the versions) but no cached metadata for the _distribution_
(i.e., the specific wheel), since we weren't validating that the
returned metadata in the `--no-deps` case was actually successful. (We
need metadata, even for `--no-deps`, so that we can validate extras.)
## Test Plan
The added test panics on the previous branch.
## Summary
Users can now run `uv cache prune --ci` (open to feedback on the name of
that flag) to remove all pre-built wheels from the cache, leaving behind
zipped, built wheels (which tend to be the most expensive assets to
re-create). This should greatly increase cache performance in CI
environments, since uploading unzipped wheels can actually hurt
performance if you're persisting the uv cache.
Closes https://github.com/astral-sh/uv/issues/5282.
The test asserts that 28 files were removed. But on my system, 27 files
are removed.
This PR is first about debugging what the difference is (since CI
presumably passes with the status quo snapshot). And then I'm thinking
the right way to fix the test failure is with a filter that replaces the
specific number of files removed (limited to what we know to be
correct) with a placeholder.
## Summary
This is surprisingly complex because we need to decide what happens if
you run `uv run` from within a hidden folder, etc. For now, I did the
simplest thing: we just ignore workspace members that are hidden
directories if they lack a `pyproject.toml`, so you can still include
hidden members; they're just ignored if they don't seem to be projects.
Closes https://github.com/astral-sh/uv/issues/5403.
## Summary
Implements #5340
## Test Plan
## Summary
This is a bit simpler than #5333, but seems to work in my testing on
macOS and Windows. It's based on implementations that I found in
[Pixi](36f1bb297d/src/cli/exec.rs (L99))
and
[Wasmer](49e60af8df/lib/wasix/src/state/builder.rs (L1058)).
Closes https://github.com/astral-sh/uv/issues/5257.
## Test Plan
On both macOS and Windows:
- `cargo run -- tool run --from jupyterlab jupyter-lab` -- hit Ctrl-C;
verify that the process exits and the terminal is left in a good state.
- `cargo run -- run python` -- hit Ctrl-C; verify that the process does
_not_ exit, but does on Ctrl-D.
## Summary
Closes https://github.com/astral-sh/uv/issues/5359.
## Test Plan
Unfortunately, the only packages I know of that use this are Ruff and
uv, and both are too heavy to install in a recurring test, so:
`uv tool install hatch==1.12.0 --with uv==0.2.27 --force
--link-mode=symlink`
> DEBUG Found `cpython-3.12.1-macos-aarch64-none` at
`/Users/zb/Library/Application
Support/uv/python/cpython-3.12.1-macos-aarch64-none/bin/python3`
(managed installations)
Instead of `<implementation> <version>`
> DEBUG Found cpython 3.12.1 at `/Users/zb/Library/Application
Support/uv/python/cpython-3.12.1-macos-aarch64-none/bin/python3`
(managed installations)
## Summary
Prefers, in order:
- The major-minor version of an interpreter discovered via `--python`.
- The `requires-python` from the workspace.
- The major-minor version of the default interpreter.
If the `--python` request is a version or a version range, we use that
without fetching an interpreter.
Closes https://github.com/astral-sh/uv/issues/5299.
The `RequirementsTxtComparator` was written assuming there is one
distribution per package name. This changed with the universal
resolution, which allows multiple versions or urls for the same package
name. The sorting we emitted for these new entries was incidental.
With this change, we properly sort these entries by name, version and
then url in universal mode.
This is an output format change for `--universal` users.
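A minimal sketch of that ordering (illustrative types; real code would compare parsed versions rather than strings):
```rust
use std::cmp::Ordering;

#[derive(Debug)]
struct Entry<'a> {
    name: &'a str,
    version: &'a str,
    url: Option<&'a str>,
}

// Sort by name, then version, then URL, so multiple entries for the same
// package name (as universal resolution allows) get a deterministic order.
fn compare(a: &Entry, b: &Entry) -> Ordering {
    a.name
        .cmp(b.name)
        .then_with(|| a.version.cmp(b.version))
        .then_with(|| a.url.cmp(&b.url))
}

fn main() {
    let mut entries = vec![
        Entry { name: "torch", version: "2.0.0+cu118", url: None },
        Entry { name: "torch", version: "2.0.0+cpu", url: None },
    ];
    entries.sort_by(compare);
    assert_eq!(entries[0].version, "2.0.0+cpu");
}
```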
## Summary
This PR avoids an `Invalid package name` error that occurs when using
`uv init .`. This is achieved by slightly reorganizing the code block to
determine the name after the path is canonicalized. The dot path is
expanded to the current directory, and the `file_name` then works as
expected.
Resolves #5329.
---------
Co-authored-by: konstin <konstin@mailbox.org>
## Summary
1. Fixes errors from https://github.com/astral-sh/uv/pull/4878
2. More cleanup. Removed the need for the `MaybeUninit` and `SizeOf`
helpers. Renamed the main entrypoint to the expected Windows default,
`mainCRTStartup`, to avoid re-declaring `/ENTRY` in `build.rs`.
3. Adds a small, basic test harness that (on Windows) will generate
both types of launchers and run them. I've been using this locally
to test changes and edge cases, but it might be useful for others. It's
based on core parts of install-wheel-rs.
## Test Plan
Tested locally on a couple of script/gui apps.
---------
Co-authored-by: konsti <konstin@mailbox.org>
## Summary
Similar to https://github.com/astral-sh/rye/pull/680, I have made a
major refactor to the `fetch-download-metadata.py` script.
Some notable changes:
- Use PEP 723 inline scripts
- Fully type annotated the script
- Implemented async HTTP fetching
- Introduced a `Finder` base class and moved the finder logic under a
`CPythonFinder` subclass, which will make it easier to add a
`PyPyFinder` later.
- Instead of fetching `xxx.sha256` for each file, the script now fetches
a single `SHA256SUMS` file containing checksums for all files in the
release.
As a result, the script now takes around 10 seconds instead of 10+
minutes.
## Plan for Future PRs
- [ ] Implement the `PyPyFinder`
- [ ] Add a GitHub Action to run `fetch-download-metadata.py` daily and
create PRs automatically
## Test Plan
```sh
cargo run -- run --isolated -- ./crates/uv-python/fetch-download-metadata.py
```
## Summary
In my setup, I have a directory of wheels symlinked from different
directories. I can point `--find-links` at it with `pip` and it works
but not `uv`.
Currently, `uv` checks whether a candidate file `is_file`, which only matches
regular files. By also checking `is_symlink`, I was able to install a
symlinked wheel. I'm not *exactly* sure where, but some other place
eventually resolves the absolute path of the wheel. (`uv`? The OS?)
## Test Plan
Manually tested - I didn't see any tests for `FlatIndexClient` in the
`uv-client` crate.
```
mkdir /tmp/a /tmp/b # Create a directory of wheels (/tmp/a) and a directory of symlinked wheels (/tmp/b)
cp test-0.0.1-py3-none-any.whl /tmp/a # Add a wheel to the directory of wheels
ln -s /tmp/a/test-0.0.1-py3-none-any.whl /tmp/b/ # Create a symlink to that wheel
uv pip install test --find-links /tmp/b # Install pointing at the symlinked wheels directory
```
## Summary
Otherwise, if the path is already a member, discovery fails.
Also adds a failing test for "adding members that are already covered by
`members`".
## Summary
Excellent find from @konstin. If we have a package that's included in
two forks at the same version, but with different URLs, we need to avoid
collapsing them in the lockfile.
Closes https://github.com/astral-sh/uv/issues/5294.
## Summary
You can still generate instabilities, but at least it's consistent
between including and excluding the extra.
For example, this resolves to 54 and then 52 packages on re-run:
```toml
[project]
name = "transformers"
version = "4.39.0.dev0"
description = "State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow"
requires-python = ">=3.9.0"
dependencies = []
[project.optional-dependencies]
flax = ["jaxlib>=0.4.1,<=0.4.13"]
onnxruntime = ["onnxruntime>=1.4.0"]
ray = ["ray[tune]>=2.7.0"]
deepspeed-testing = [
"dill<0.3.5",
"datasets!=2.5.0",
]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
I think the difference is just somewhere in PubGrub -- like, we add an
extra dependency, so the iteration order gets changed, and we end up
with a different resolution at the end.
Closes https://github.com/astral-sh/uv/issues/5285.
## Summary
Per https://github.com/astral-sh/uv/pull/5250#issuecomment-2242137762
> It would also be great to have an argument (perhaps leveraging the
global isolated option) that allows us to disable workspace discovery
when we don't want to add a project as a member.
## Test Plan
```sh
$ cargo run -- init work
$ cargo run -- init work/pkg-a --isolated
warning: `uv init` is experimental and may change without warning
Initialized project sub-c in /tmp/work
```
Looking at how to merge identical forks, I found that the `Resolution`
can be simplified by treating it as a nodes-and-edges store (plus pins;
they are separate since they are per name-version, not per
(virtual-)package-version). This should also make #5294 more apparent,
which I didn't touch here.
I additionally added some doc comments to the `Resolution` types.
Our everything-method `solve` tends to grow large, so before I add
more logic, I'm moving some code and some logging statements around to
keep it manageable.
I made minor changes to the logging; otherwise there are no logic changes,
only refactoring.
## Summary
`dearpygui==1.9.1 has no wheels are available with a matching Python
ABI` is way better than `The requested Python version (>=3.12.3) does not
satisfy Python>=3.12.3`.
Closes https://github.com/astral-sh/uv/issues/5284.
This fixes resolving packages that publish an invalid stub to PyPI, such
as tensorrt-llm.
## Summary
In https://github.com/astral-sh/uv/pull/3138, we implemented
`unsafe-best-match`. However, it seems to not quite work as expected.
When multiple indices contain the same version, it's not clear which
index the current code uses. This PR fixes that to use the first index
the package is in.
## Test Plan
```console
$ echo 'tensorrt-llm==0.11.0' | ./target/debug/uv pip compile - --extra-index-url https://pypi.nvidia.com --python-version=3.10 --index-strategy=unsafe-best-match --annotation-style=line
```
## Summary
Given `Requires-Python: ">=3.12.3"`, we were rejecting wheels like
`dearpygui-1.11.1-cp312-cp312-win_amd64.whl`, since `3.12.0` is not
included in `>=3.12.3`. We instead need to test against the major-minor
version of `Requires-Python`.
The easiest way to do this, I think, is to use the `RequiresPython`
struct, which has a single bound that we can truncate to the major-minor
version.
This also means that we now allow
`dearpygui-1.11.1-cp312-cp312-win_amd64.whl` for specifiers like
`Requires-Python: "==3.10.*"`. This is incorrect on the surface, but it
does match our semantics for `Requires-Python` elsewhere: we treat it as
a lower bound.
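A minimal sketch of the truncated comparison (illustrative types, not uv's `RequiresPython`):
```rust
// Compare a wheel's `cpXY` tag against only the major.minor of the
// `Requires-Python` lower bound, so `cp312` is accepted for `>=3.12.3`.
fn wheel_tag_compatible(tag: (u64, u64), requires_python_lower: (u64, u64, u64)) -> bool {
    let (major, minor, _patch) = requires_python_lower;
    tag >= (major, minor)
}

fn main() {
    // `dearpygui-1.11.1-cp312-cp312-win_amd64.whl` with `Requires-Python: >=3.12.3`.
    assert!(wheel_tag_compatible((3, 12), (3, 12, 3)));
    // A cp311 wheel is still rejected.
    assert!(!wheel_tag_compatible((3, 11), (3, 12, 3)));
}
```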
Closes https://github.com/astral-sh/uv/issues/5287.
Emulate `pip`'s behaviour and find `pythonw` executable by doing an
`s/python/pythonw/g` style transformation, as opposed to assuming a
constant `pythonw.exe` path.
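A minimal sketch of that transformation (illustrative, not the actual implementation):
```rust
use std::path::{Path, PathBuf};

// Derive the `pythonw` executable from the `python` executable by rewriting
// the file name, rather than assuming a fixed `pythonw.exe` path.
fn pythonw_from_python(python: &Path) -> Option<PathBuf> {
    let file_name = python.file_name()?.to_str()?;
    let replaced = file_name.replacen("python", "pythonw", 1);
    Some(python.with_file_name(replaced))
}

fn main() {
    let python = Path::new("venv/Scripts/python.exe");
    assert_eq!(
        pythonw_from_python(python),
        Some(PathBuf::from("venv/Scripts/pythonw.exe"))
    );
}
```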
See #5256 for more detail e.g. why this is a useful behaviour to
emulate.
Fixes: #5256
## Summary
This PR modifies the lockfile to include the impactful resolution
settings, like the resolution and pre-release mode. If any of those
values change, we want to ignore the existing lockfile. Otherwise,
`--resolution lowest-direct` will typically have no effect, which is
really unintuitive.
Closes https://github.com/astral-sh/uv/issues/5226.
## Summary
I don't think that "always reinstall" is tenable for `uv run`. My
perspective on this is that if you want "always reinstall", you can now
set it persistently in your `pyproject.toml` or `uv.toml`.
As a smaller change, we could instead disable this _only_ for the
Project API.
Closes https://github.com/astral-sh/uv/issues/4946.
Resolves #4467.
## Summary
This PR implements the following
1. Add `tool.uv.constraint-dependencies` to `pyproject.toml`
2. Support `tool.uv.constraint-dependencies` in `uv lock`
3. Support `tool.uv.constraint-dependencies` in `uv pip compile/install`
These are analogues of the override features implemented in #3839 and
#4369.
## Test Plan
Add test.
## Summary
resolves https://github.com/astral-sh/uv/issues/5217
## Test Plan
Existing tests pass (should be perfectly backwards compatible), plus I added
a few tests to cover the new functionality. In particular, in addition to
the simple use of `--show-version-specifiers`, its interaction with the
`--invert` and `--package` flags is tested.
## Summary
This fixes a few bugs introduced by
https://github.com/astral-sh/uv/pull/5104. I previously thought we could
track conflicting locals the same way we track conflicting URLs in
forks, but it turns out that ends up being very tricky. URL forks work
because we prioritize directly URL requirements. We can't prioritize
locals in the same way without conflicting with the URL prioritization
(this may be possible but it's not trivial), so we run into issues where
a correct resolution depends on the order in which dependencies are
traversed.
Instead, we track local versions across all forks in `Locals`. When
applying a local version, we apply all locals with markers that
intersect with the current fork. This way we end up applying some local
versions without creating a fork. For example, given:
```
// pyproject.toml
dependencies = [
"torch==2.0.0+cu118 ; platform_machine == 'x86_64'",
]
// requirements.in
torch==2.0.0
.
```
We choose `2.0.0+cu118` in all cases. However, if a disjoint fork is
created based on local versions, the resolver will choose the most
compatible local when it narrows to a specific fork. Thus we correctly
respect local versions when forking:
```
// pyproject.toml
dependencies = [
"torch==2.0.0+cu118 ; platform_machine == 'x86_64'",
"torch==2.0.0+cpu ; platform_machine != 'x86_64'"
]
// requirements.in
torch==2.0.0
.
```
We should also be able to use a similar strategy for
https://github.com/astral-sh/uv/pull/5150.
## Test Plan
This fixes https://github.com/astral-sh/uv/issues/5220 locally for me,
as well as a few other bugs that were not reported yet.
## Summary
The current code was checking every constraint against every
requirement, regardless of whether they were applicable. In general,
this isn't a big deal, because this method is only used as a fast-path
to skip resolution -- so we just had way more false-negatives than we
should've when constraints were applied. But it's clearly wrong :)
## Test Plan
- `uv venv`
- `uv pip install flask`
- `uv pip install --verbose flask -c constraints.txt` (with `numpy<1.0`)
Prior to this change, Flask was reported as not satisfied.
## Summary
Implements the `uv init` command, which initializes a project
(`pyproject.toml`, `README.md`, `src/__init__.py`) in the current
directory, or in the given path. `uv init` also does workspace
discovery.
Resolves https://github.com/astral-sh/uv/issues/1360.
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
Spotted some issues in the settings documentation, and room for small
improvements by linking to PEPs/RFCs.
Also updating the contribution documentation to mention that it's necessary
to activate the virtual environment before running `mkdocs serve`.
## Test Plan
Running documentation locally.
Fixes #5211
## Summary
Change `uv python list` to show only Pythons installed on the system when
`--python-preference only-system` is specified.
Below is an example of running the command before the change, which shows
Pythons that are not installed on the system.
#### Before
```bash
# Check system python
$ uv python --preview list --python-preference only-system
cpython-3.12.4-linux-x86_64-gnu <download available>
cpython-3.12.3-linux-x86_64-gnu /usr/bin/python3.12
cpython-3.12.3-linux-x86_64-gnu /usr/bin/python3
cpython-3.12.3-linux-x86_64-gnu /bin/python3.12
cpython-3.12.3-linux-x86_64-gnu /bin/python3
cpython-3.11.9-linux-x86_64-gnu <download available>
cpython-3.10.14-linux-x86_64-gnu <download available>
cpython-3.9.19-linux-x86_64-gnu <download available>
cpython-3.8.19-linux-x86_64-gnu <download available>
cpython-3.7.9-linux-x86_64-gnu <download available>
```
This PR changes the display to show only Pythons installed on the system.
#### After
```bash
$ cargo run python --preview list --python-preference only-system
Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.11s
Running `target/debug/uv python list --python-preference only-system --preview`
cpython-3.12.3-linux-x86_64-gnu /usr/bin/python3.12
cpython-3.12.3-linux-x86_64-gnu /usr/bin/python3
cpython-3.12.3-linux-x86_64-gnu /bin/python3.12
cpython-3.12.3-linux-x86_64-gnu /bin/python3
```
## Test Plan
- `cargo run python list --python-preference only-system` in Ubuntu
24.04 to verify the display.
- `cargo run python list` in Ubuntu 24.04 to verify that the results
before and after the change were the same.
## Summary
This ensures that we process Python installs and uninstalls as soon as
they complete, rather than waiting for them all to complete, then
processing them sequentially. In practice, it shouldn't be much of a
difference (since the processing code is fairly light), but it
strikes me as more correct.
## Summary
Addressing this [issue](https://github.com/astral-sh/uv/issues/5147) by
adding support for symbolic linking as a link mode when
installing or syncing dependencies.
## Summary
Currently, `uv` refuses to install anything on GraalPy. This is
currently blocking GraalPy testing with cibuildwheel, since manylinux
includes both `uv` and `graalpy` (but doesn't test with `uv`), whereas
cibuildwheel defaults to `uv`. See e.g.
https://github.com/pypa/cibuildwheel/actions/runs/9956369360/job/27506182952?pr=1538
where it gives
```
+ python -m build /project/sample_proj --wheel --outdir=/tmp/cibuildwheel/built_wheel --installer=uv
* Creating isolated environment: venv+uv...
* Using external uv from /usr/local/bin/uv
* Installing packages in isolated environment:
- setuptools >= 40.8.0
> /usr/local/bin/uv pip install "setuptools >= 40.8.0"
< error: Unknown implementation: `graalpy`
```
## Test Plan
I simply based the GraalPy support on PyPy and added some small tests.
I'm open to discussing how to test this. GraalPy is available for
manylinux images and with setup-python, so we should be able to add
tests against it to the CI. I locally confirmed by installing `uv` into
a GraalPy venv and then trying things like `uv pip install Pillow` and
testing those extensions.
## Summary
Closes https://github.com/astral-sh/uv/issues/5124.
## Test Plan
Ran `cargo run -- help pip compile` on my Windows machine, which failed
before but succeeds after this change.
## Summary
You can now use `uv run --locked` to assert that the lockfile doesn't
change, or `uv run --frozen` to run without attempting to update the
lockfile at all.
Closes https://github.com/astral-sh/uv/issues/5185.
As a user, you specify a list of extras. Internally, we decompose this
into one virtual package per extra. We currently leak this abstraction
by writing one entry per extra to the lockfile:
```toml
[[distribution]]
name = "foo"
version = "4.39.0.dev0"
source = { editable = "." }
dependencies = [
{ name = "pandas" },
{ name = "pandas", extra = "excel" },
{ name = "pandas", extra = "hdf5" },
{ name = "pandas", extra = "html", marker = "os_name != 'posix'" },
{ name = "pandas", extra = "output-formatting", marker = "os_name == 'posix'" },
{ name = "pandas", extra = "plot", marker = "os_name == 'posix'" },
]
```
Instead, we should merge the extras into a list of extras, creating a
more concise lockfile:
```toml
[[distribution]]
name = "foo"
version = "4.39.0.dev0"
source = { editable = "." }
dependencies = [
{ name = "pandas", extra = ["excel", "hdf5"] },
{ name = "pandas", extra = ["html"], marker = "os_name != 'posix'" },
{ name = "pandas", extra = ["output-formatting", "plot"], marker = "os_name == 'posix'" },
]
```
The base package is now implicitly included, as it is in PEP 508.
Fixes #4888
## Summary
Makes the `tools()` return value include per-tool errors. This makes it
easy to skip (rather than failing) in `uv tool list`, _and_ improves `uv
tool uninstall` to remove those invalid tools, rather than leaving them
around. (We already had that behavior for `uv tool uninstall ruff` with
an invalid `ruff`, but `uv tool uninstall --all` just left them.)
Closes https://github.com/astral-sh/uv/issues/5151.
Search for all `python3.x` minor versions in PATH, skipping those we
already know we can use.
For example, let's say `python` and `python3` are Python 3.10. When a
user requests `>= 3.11`, we still need to find a `python3.12` in PATH.
We do so with a regex matcher.
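A minimal sketch of such a matcher, assuming the `regex` crate (illustrative, not uv's discovery code):
```rust
use regex::Regex;

// Match `python3.X` executable names so that minor versions found on PATH
// can be considered even when `python`/`python3` are too old.
fn python_minor_version(file_name: &str) -> Option<u32> {
    let re = Regex::new(r"^python3\.(\d+)(?:\.exe)?$").ok()?;
    let captures = re.captures(file_name)?;
    captures[1].parse().ok()
}

fn main() {
    assert_eq!(python_minor_version("python3.12"), Some(12));
    assert_eq!(python_minor_version("python3.12.exe"), Some(12));
    assert_eq!(python_minor_version("python3"), None);
}
```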
Fixes #4709
Warn when there is a direct dependency without a lower bound and
`--resolution lowest` is set.
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
"Bare" made sense when we had a variant that seeded the environment, but
now that the crate _only_ creates a bare environment, let's drop that
terminology.
## Summary
Hashes will be validated if present, but aren't required (since, e.g.,
some registries will omit them, as will Git dependencies and such).
Closes https://github.com/astral-sh/uv/issues/5168.
## Summary
This is an alternative to `--require-hashes` which will validate a hash
if it's present, but ignore requirements that omit hashes or are absent
from the lockfile entirely.
So, e.g., transitive dependencies that are missing will _not_ error; nor
will dependencies that are included but lack a hash.
Closes https://github.com/astral-sh/uv/issues/3305.
## Summary
We only need to store one hash -- it should be the "strongest" hash. In
practice, most registries (like PyPI) only serve one, and we only
compute a SHA256 hash for direct URLs.
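A minimal sketch of picking the strongest digest, with an illustrative `Algorithm` ordering rather than uv's `HashDigest`:
```rust
// Illustrative strength ordering via variant order (weakest to strongest).
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum Algorithm {
    Md5,
    Sha256,
    Sha512,
}

// Keep only the strongest digest a registry served for a file.
fn strongest(digests: &[(Algorithm, &str)]) -> Option<(Algorithm, String)> {
    digests
        .iter()
        .max_by_key(|(algorithm, _)| *algorithm)
        .map(|(algorithm, digest)| (*algorithm, digest.to_string()))
}

fn main() {
    let digests = [
        (Algorithm::Md5, "0md5"),
        (Algorithm::Sha256, "0sha256"),
        (Algorithm::Sha512, "0sha512"),
    ];
    assert_eq!(
        strongest(&digests),
        Some((Algorithm::Sha512, "0sha512".to_string()))
    );
}
```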
Part of: https://github.com/astral-sh/uv/issues/4924
## Test Plan
I verified that changing:
```diff
diff --git a/crates/distribution-types/src/hash.rs b/crates/distribution-types/src/hash.rs
index 553a74f55..d36c62286 100644
--- a/crates/distribution-types/src/hash.rs
+++ b/crates/distribution-types/src/hash.rs
@@ -31,7 +31,7 @@ impl<'a> HashPolicy<'a> {
pub fn algorithms(&self) -> Vec<HashAlgorithm> {
match self {
Self::None => vec![],
- Self::Generate => vec![HashAlgorithm::Sha256],
+ Self::Generate => vec![HashAlgorithm::Sha256, HashAlgorithm::Sha512],
Self::Validate(hashes) => {
let mut algorithms = hashes.iter().map(HashDigest::algorithm).collect::<Vec<_>>();
algorithms.sort();
```
Then running `uv lock` with a URL gave me:
```toml
[[distribution]]
name = "iniconfig"
version = "2.0.0"
source = { url = "62565a6e1ceac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl" }
wheels = [
{ url = "62565a6e1ceac6173dc9db836a5b46/iniconfig-2.0.0-py3-none-any.whl", hash = "sha512:44cc53a6c8dd7cf4d6d52bded308bcc4b4f85fff2ed081f60f7d4beaa86a7cde6d099e3976331232d4cbd472ad5d1781064725b0999c7cd3a2a4d42df687ee81" },
]
```
## Summary
It turns out that if `path` is a symlink,
`File::create(path)?.write_all(content.as_ref())?` will overwrite the
_target_ file. That means an entrypoint named `python` would actually
overwrite the user's source Python executable, which is symlinked into
the virtual environment.
This PR replaces that code with our atomic write method.
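A minimal sketch of an atomic write (illustrative; the real helper differs, e.g. it would use a unique temporary name):
```rust
use std::fs;
use std::io::Write;
use std::path::Path;

// Write to a temporary sibling file and rename it into place. Renaming
// replaces the symlink itself instead of writing through it to its target.
fn write_atomic(path: &Path, contents: &[u8]) -> std::io::Result<()> {
    let tmp = path.with_extension("tmp");
    let mut file = fs::File::create(&tmp)?;
    file.write_all(contents)?;
    file.sync_all()?;
    fs::rename(&tmp, path)
}

fn main() -> std::io::Result<()> {
    write_atomic(Path::new("entrypoint-example"), b"#!/usr/bin/env python\n")?;
    Ok(())
}
```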
Closes https://github.com/astral-sh/uv/issues/5152.
## Test Plan
I ran through the test plan in
https://github.com/astral-sh/uv/issues/5152, but used an executable
named `bar` linked to `foo.txt` instead...
* Use a dedicated `ResolverMarkers` check in the fork state. This is
better than the `MarkerTree::And(Vec::new())` check.
* Report the timing with the correct name, universal resolution, instead of
two spaces around an empty string when there are no markers.
* On resolution error, show the split that we're in. I'm not sure how to
word this, since we're doing a universal resolution until we fork, so
the trace may contain information from requirements that are not part of
this fork.
## Summary
Resolves #5139
`PythonInstallationKey` was sorted as a string, which caused `3.8` to
appear before `3.11`. This update changes the sorting of
`PythonInstallationKey` to be in descending order by version.
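A minimal sketch of the fix (illustrative version tuples rather than uv's `PythonInstallationKey`):
```rust
use std::cmp::Reverse;

// Sort by parsed version, descending, instead of comparing the key as a
// string (which puts 3.8 before 3.11).
fn sort_versions_descending(keys: &mut [(u64, u64, u64)]) {
    keys.sort_by_key(|&version| Reverse(version));
}

fn main() {
    let mut keys: [(u64, u64, u64); 3] = [(3, 8, 19), (3, 12, 3), (3, 11, 9)];
    sort_versions_descending(&mut keys);
    assert_eq!(keys, [(3, 12, 3), (3, 11, 9), (3, 8, 19)]);
}
```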
## Test Plan
```sh
$ cargo run -- python install 3.8 3.12
$ cargo run -- tool run -v python -V
DEBUG uv 0.2.25
warning: `uv tool run` is experimental and may change without warning.
DEBUG Searching for Python interpreter in managed installations, system path, or `py` launcher
DEBUG Searching for managed installations at `C:\Users\xx\AppData\Roaming\uv\data\python`
DEBUG Found managed Python `cpython-3.12.3-windows-x86_64-none`
DEBUG Found cpython 3.12.3 at `C:\Users\xx\AppData\Roaming\uv\data\python\cpython-3.12.3-windows-x86_64-none\install\python.exe` (managed installations)
DEBUG Using request timeout of 30s
DEBUG Using request timeout of 30s
DEBUG Acquired lock for `C:\Users\nigel\AppData\Roaming\uv\data\tools`
DEBUG Using existing environment for tool `httpx`: C:\Users\xx\AppData\Roaming\uv\data\tools\httpx
DEBUG Using existing tool `httpx`
DEBUG Running `httpx -v`
```