## Summary
This PR deprecates the `--isolated` flag. The treatment varies across
the APIs:
- For non-preview APIs, we warn but treat it as equivalent to
`--no-config`.
- For preview APIs, we warn and ignore it, with two exceptions...
- For `tool run` and `run` specifically, we don't even warn, because we
can't differentiate the command-specific `--isolated` from the global
`--isolated`.
Every packse version update is currently causing a huge diff (the size
of the `lock_scenarios.rs` diff in this PR). By redacting the version
from the snapshots, we will only have the actual change in the diff and
not the redundant version change noise.
The second commit moves all remaining packse url arg values to
`common/mod.rs`, which acts as a single source of truth for the packse
version.
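For illustration, the redaction can be expressed as a single snapshot filter pair; the constant and function names below are hypothetical:
```rust
// Hypothetical names; `common/mod.rs` holds the single pinned version.
const PACKSE_VERSION: &str = "0.0.0";

/// A (pattern, replacement) filter applied to snapshot output, so bumping
/// the pinned version no longer touches `lock_scenarios.rs` snapshots.
fn packse_version_filter() -> (String, String) {
    (
        regex::escape(PACKSE_VERSION),
        "[PACKSE_VERSION]".to_string(),
    )
}
```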
## Summary
Removes the legacy `benchmark` directory (we'll always have it in Git)
and renames `bench` to `benchmark` for clarity. Fixes a variety of
commands and references.
## Summary
This PR adds support for `uv lock` and `uv sync` in the standardized
benchmarks script.
Part of: https://github.com/astral-sh/uv/issues/5263.
## Test Plan
For example:
```sh
python scripts/bench/__main__.py --uv-project --benchmark resolve-cold ./scripts/requirements/trio.in --verbose
```
## Summary
You can now add `managed = false` under `[tool.uv]` in a
`pyproject.toml` to explicitly opt out of the project and workspace
APIs.
If a project sets `managed = false`, we will (1) _not_ discover it as a
workspace root, and (2) _not_ discover it as a workspace member (similar
to using `exclude` in the workspace parent).
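As a rough sketch, the opt-out might be read like this (the type names are illustrative stand-ins, not uv's actual workspace types):
```rust
use serde::Deserialize;

// Illustrative stand-ins for uv's pyproject.toml types.
#[derive(Deserialize)]
struct PyProjectToml {
    tool: Option<Tool>,
}

#[derive(Deserialize)]
struct Tool {
    uv: Option<ToolUv>,
}

#[derive(Deserialize)]
struct ToolUv {
    managed: Option<bool>,
}

/// A project is managed unless it explicitly sets `managed = false`.
fn is_managed(contents: &str) -> bool {
    toml::from_str::<PyProjectToml>(contents)
        .ok()
        .and_then(|project| project.tool)
        .and_then(|tool| tool.uv)
        .and_then(|uv| uv.managed)
        .unwrap_or(true)
}
```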
Closes https://github.com/astral-sh/uv/issues/4551.
This PR refactors the command creation in the test suite to remove the
duplication.
**1)** We add the same set of test stubbing args to almost any uv
invocation in the tests:
```rust
command
.arg("--cache-dir")
.arg(self.cache_dir.path())
.env("VIRTUAL_ENV", self.venv.as_os_str())
.env("UV_NO_WRAP", "1")
.env("HOME", self.home_dir.as_os_str())
.env("UV_TOOLCHAIN_DIR", "")
.env("UV_TEST_PYTHON_PATH", &self.python_path())
.current_dir(self.temp_dir.path());
if cfg!(all(windows, debug_assertions)) {
// TODO(konstin): Reduce stack usage in debug mode enough that the tests pass with the
// default windows stack of 1MB
command.env("UV_STACK_SIZE", (8 * 1024 * 1024).to_string());
}
```
Centralizing these into a `TestContext::add_shared_args` method removes
them from everywhere.
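A simplified sketch of that helper (the real method applies the full set of args and env vars shown above):
```rust
use std::path::PathBuf;
use std::process::Command;

// Simplified; the real TestContext has more fields.
struct TestContext {
    cache_dir: PathBuf,
    temp_dir: PathBuf,
}

impl TestContext {
    /// Apply the shared stubbing args to a uv invocation.
    fn add_shared_args<'a>(&self, command: &'a mut Command) -> &'a mut Command {
        command
            .arg("--cache-dir")
            .arg(&self.cache_dir)
            .env("UV_NO_WRAP", "1")
            .current_dir(&self.temp_dir)
    }
}
```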
**2)** Prefix all `TestContext` methods of the pip interface with
`pip_`. This is now necessary due to `uv sync` vs. `uv pip sync`.
**3)** Move command creation in the various test files into dedicated
functions or methods to avoid repeating the arguments. Except for error
message tests, there should be at most one `Command::new(get_bin())`
call per test file. `EXCLUDE_NEWER` is exclusively used in
`TestContext`.
---
I'm considering adding a `TestCommand` on top of these changes (in
another PR) that holds a reference to the `TestContext`, has
`add_shared_args` as a method, and uses `Fn(Self) -> Self` instead of
`Fn(&mut Self) -> Self` for methods to improve chaining.
Make use of GitHub's markdown rendering
packse has the ability to specify a project-wide Requires-Python
constraint, but our lock template wasn't forwarding this to the
corresponding pyproject.toml. This update makes that happen.
I tweaked rooster to allow sections to be overridden from the CLI so we
can generate a separate preview changelog
See https://github.com/zanieb/rooster/pull/43 for the rooster changes
needed
I tested `./scripts/release.sh` as well.
The basic idea here is to make it so forking can only ever result in a
resolution that, for a particular marker environment, will only install
at most one version of a package. We can guarantee this by ensuring we
only fork on conflicting dependency specifications when their
corresponding markers are completely disjoint. If they aren't, then
resolution _must_ find a single version of the package in the
intersection of the two dependency specifications.
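A toy illustration of the decision rule, with a stand-in enum in place of real PEP 508 marker trees:
```rust
/// Stand-in for a PEP 508 marker expression; uv's real type is richer.
#[derive(Clone, Copy)]
enum Marker {
    PythonAtLeast310,
    PythonBelow310,
    Any,
}

/// Two markers are disjoint if no environment can satisfy both.
fn disjoint(a: Marker, b: Marker) -> bool {
    matches!(
        (a, b),
        (Marker::PythonAtLeast310, Marker::PythonBelow310)
            | (Marker::PythonBelow310, Marker::PythonAtLeast310)
    )
}

/// Fork only when the conflicting specifications can never co-apply;
/// otherwise, a single version must satisfy their intersection.
fn should_fork(a: Marker, b: Marker) -> bool {
    disjoint(a, b)
}
```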
A test for this case has been added to packse here:
https://github.com/astral-sh/packse/pull/182. Previously, that test
would result in a resolution with two different unconditional versions
of the same package. With this change, resolution fails (as it should).
A commit-by-commit review should be helpful here, since the first commit
is a refactor to make the second commit a bit more digestible.
## Summary
If `Requires-Python` is omitted in `uv lock` or `uv run`, we now warn
and default to `>=` the current minor version.
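Conceptually, the fallback is as simple as the following (illustrative only; uv derives the version from the discovered interpreter):
```rust
/// Derive the default `Requires-Python` from the current interpreter,
/// e.g., `>=3.12` on Python 3.12.
fn default_requires_python(major: u8, minor: u8) -> String {
    format!(">={major}.{minor}")
}
```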
Closes https://github.com/astral-sh/uv/issues/4050.
This commit adds a template and does some light surgery on `generate.py`
to make use of that template. In particular, the universal tests require
using the "workspace"-aware version of `uv`, so we can't use the
existing `uv pip {compile,install}` tests.
This is just the result of running `./scripts/sync_scenarios.sh` from
the root of the `uv` repository.
When I initially ran this, it produced some tests with snapshots that
weren't being updated. It turned out this was because the tests weren't
running, as they were gated behind the `python-patch` feature. In this
commit, we add `python-patch` to our `cargo insta` command, which should
update all relevant snapshots.
There are still some superfluous updates as a result of a spell checker
being run on generated files.
The workspace test directories can be used both in tests and directly
for developing/debugging. In the latter case, a venv and a lockfile may
be present, which we shouldn't copy when running tests. Using the
`ignore` crate instead of manual recursion, we exclude those files.
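A minimal sketch of the exclusion using the `ignore` crate's override globs (the globs and copy logic here are illustrative):
```rust
use std::path::Path;

use ignore::overrides::OverrideBuilder;
use ignore::WalkBuilder;

/// Walk a workspace test directory, skipping the venv and the lockfile.
fn walk_workspace(root: &Path) -> Result<(), Box<dyn std::error::Error>> {
    let mut overrides = OverrideBuilder::new(root);
    // In override globs, a leading `!` means "ignore".
    overrides.add("!.venv")?;
    overrides.add("!uv.lock")?;
    for entry in WalkBuilder::new(root).overrides(overrides.build()?).build() {
        let entry = entry?;
        // Copy `entry.path()` into the temporary test directory here.
        println!("{}", entry.path().display());
    }
    Ok(())
}
```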
## Summary
There are a few behavior changes in here:
- We now enforce `--require-hashes` for editables, like pip. So if you
use `--require-hashes` with an editable requirement, we'll reject it. I
could change this if it seems off.
- We now treat source tree requirements, editable or not (e.g., both `-e
./black` and `./black`) as if `--refresh` is always enabled. This
doesn't mean that we _always_ rebuild them; but if you pass
`--reinstall`, then yes, we always rebuild them. I think this is an
improvement and is close to how editables work today.
Closes #3844.
Closes #2695.
Add workspace support when using `-r <path>/pyproject.toml` or `-e
<path>` in the pip interface. It is limited to all-editable
static-metadata workspaces, and tests only include a single main
workspace, ignoring path dependencies in another workspace. This can be
considered the MVP for workspace support: You can create a workspace,
you can install from it, but some options and conveniences are still
missing. I'll file follow-up tickets (support in lockfiles, support
path deps in other workspaces, #3625).
There is also support in `uv run`, but we need
https://github.com/astral-sh/uv/issues/3700 first to properly support
using different current projects in the bluejay interface; currently,
the resolution and therefore the lockfile depend on the current project.
I'd do this change first (it's big enough already), then #3700, and then
add workspace support properly to bluejay.
Fixes #3404
## Summary
This seems to be one of the most consistent benchmark cases we have in
terms of standard deviation:
```
➜ hyperfine "target/profiling/main pip compile scripts/requirements/airflow.in" --runs 200
Benchmark 1: target/profiling/main pip compile scripts/requirements/airflow.in
Time (mean ± σ): 292.6 ms ± 6.6 ms [User: 414.1 ms, System: 194.2 ms]
Range (min … max): 282.7 ms … 320.1 ms 200 runs
```
For smaller benchmarks, scispacy and dtlssocket seem to be a bit more
consistent than our current jupyter benchmark, but it hasn't given us
any problems so I'll leave it for now.
Add minimal support for workspace discovery, only used for determining
paths in the bluejay commands.
We can now discover the workspace structure, namely that the
`pyproject.toml` of a package belongs to a workspace `pyproject.toml`
with members and exclusions. The globbing logic is inspired by cargo. We
don't resolve `workspace = true` metadata declarations yet.
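A toy version of the member matching (uv's real, cargo-inspired logic is richer):
```rust
/// A path is a member if it matches a `members` glob and no `exclude` glob.
fn is_member(relative_path: &str, members: &[&str], exclude: &[&str]) -> bool {
    let matches_any = |patterns: &[&str]| {
        patterns.iter().any(|pattern| {
            glob::Pattern::new(pattern)
                .map(|glob| glob.matches(relative_path))
                .unwrap_or(false)
        })
    };
    matches_any(members) && !matches_any(exclude)
}
```
For example, `is_member("packages/foo", &["packages/*"], &["packages/bar"])` would return `true`.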
## Summary
This PR adds a test that currently leads to an error, but should
successfully resolve as of https://github.com/astral-sh/uv/pull/3100.
The core idea is that if we have a pinned package, we shouldn't try to
build other versions of that package if we have an unconstrained variant
with an extra.
## Summary
This PR adds system install tests to verify the behavior described in
#2798. It turns out this behavior _also_ affects Fedora and Amazon
Linux; we just didn't have the right conditions enabled (specifically,
you need to create the virtualenv with `python -m venv` to get these
symlinks), so the test suite was expanded to capture that.
The issue itself is also fixed by way of deduplicating the
`site-packages` entries.
Closes: https://github.com/astral-sh/uv/issues/2798
The sync scenarios script is broken, so I did the updates manually:
```
$ ./scripts/sync_scenarios.sh
Setting up a temporary environment...
Using Python 3.12.1 interpreter at: /home/konsti/projects/uv/.venv/bin/python3
Creating virtualenv at: .venv
Activate with: source .venv/bin/activate
× No solution found when resolving dependencies:
╰─▶ Because docutils==0.21.post1 is unusable because the package metadata was inconsistent and you require docutils==0.21.post1, we can conclude that the requirements are unsatisfiable.
hint: Metadata for docutils==0.21.post1 was inconsistent:
Package metadata version `0.21` does not match given version `0.21.post1`
```
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
See https://github.com/astral-sh/uv/issues/2617
Note this also includes:
- #2918
- #2931 (pending)
A first step towards Python toolchain management in Rust.
First, we add a new crate to manage Python download metadata:
- Adds a new `uv-toolchain` crate
- Adds Rust structs for Python version download metadata
- Duplicates the script which downloads Python version metadata
- Adds a script to generate Rust code from the JSON metadata
- Adds a utility to download and extract the Python version
I explored some alternatives like a build script using things like
`serde` and `uneval` to automatically construct the code from our
structs but deemed it too heavy. Unlike Rye, I don't generate the Rust
directly from the web requests; instead, there's an intermediate JSON
layer to speed up iteration on the Rust types.
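For a sense of the shape involved, the generated metadata might look roughly like this (field names are assumptions, not the actual `uv-toolchain` types):
```rust
/// One Python download, as generated from the JSON metadata.
pub struct PythonDownload {
    pub major: u8,
    pub minor: u8,
    pub patch: u8,
    pub os: &'static str,             // e.g., "linux", "darwin", "windows"
    pub arch: &'static str,           // e.g., "x86_64", "aarch64"
    pub url: &'static str,            // python-build-standalone release asset
    pub sha256: Option<&'static str>, // not yet verified; see below
}
```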
Next, we add a `uv-dev` command `fetch-python` to download Python
versions per the bootstrapping script.
- Downloads a requested version or reads from `.python-versions`
- Extracts to `UV_BOOTSTRAP_DIR`
- Links executables for path extension
This command is not really intended to be user facing, but it's a good
PoC for the `uv-toolchain` API. Hash checking (via the sha256) isn't
implemented yet; we can do that in a follow-up.
Finally, we remove the `scripts/bootstrap` directory, update CI to use
the new command, and update the CONTRIBUTING docs.
<img width="1023" alt="Screenshot 2024-04-08 at 17 12 15"
src="https://github.com/astral-sh/uv/assets/2586601/57bd3cf1-7477-4bb8-a8e9-802a00d772cb">
## Summary
Is this, perhaps, not totally necessary? It doesn't show up in any
fixtures beyond those that I added recently.
Closes https://github.com/astral-sh/uv/issues/2846.
## Summary
Demonstrates some suboptimal behavior in how we handle invalid metadata,
which are fixed in https://github.com/astral-sh/uv/pull/2834.
The included wheels were modified by-hand to include invalid structures.
## Summary
In working on `--require-hashes`, I noticed that we're missing some
incompatibility tracking for `--find-links` distributions. Specifically,
we don't respect `--no-build` or `--no-binary`, so if we select a wheel
due to `--find-links`, we then throw a hard error when trying to build
it later (if `--no-binary` is provided), rather than selecting the
source distribution instead.
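A toy sketch of the intended selection (types and flags simplified):
```rust
enum FoundDist {
    Wheel,
    SourceDist,
}

/// Respect `--no-binary`/`--no-build` when choosing among `--find-links`
/// distributions, instead of erroring at build time.
fn select(dists: &[FoundDist], no_binary: bool, no_build: bool) -> Option<&FoundDist> {
    dists.iter().find(|dist| match dist {
        FoundDist::Wheel => !no_binary,
        FoundDist::SourceDist => !no_build,
    })
}
```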
Closes https://github.com/astral-sh/uv/issues/2827.
Previously, we did not consider installed distributions as candidates
while performing resolution. Here, we update the resolver to use
installed distributions that satisfy requirements instead of pulling new
distributions from the registry.
The implementation details are as follows:
- We now provide `SitePackages` to the `CandidateSelector`
- If an installed distribution satisfies the requirement, we prefer it
over remote distributions
- We do not want to allow installed distributions in some cases, i.e.,
upgrade and reinstall
- We address this by introducing an `Exclusions` type which tracks
installed packages to ignore during selection
- There's a new `ResolvedDist` wrapper with `Installed(InstalledDist)`
and `Installable(Dist)` variants (sketched after this list)
- This lets us pass already installed distributions throughout the
resolver
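A simplified sketch of that wrapper, with stand-ins for uv's real distribution types:
```rust
struct InstalledDist; // already present in site-packages
struct Dist;          // available from a registry or URL

/// Lets the resolver carry both kinds of candidate.
enum ResolvedDist {
    Installed(InstalledDist),
    Installable(Dist),
}
```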
The user-facing behavior is thoroughly covered in the tests, but
briefly:
- Installing a package that depends on an already-installed package
prefers the local version over the index
- Installing a package with a name that matches an already-installed URL
package does not reinstall from the index
- Reinstalling (--reinstall) a package by name _will_ pull from the
index even if an already-installed URL package is present
- To reinstall the URL package, you must specify the URL in the request
Closes https://github.com/astral-sh/uv/issues/1661
Addresses:
- https://github.com/astral-sh/uv/issues/1476
- https://github.com/astral-sh/uv/issues/1856
- https://github.com/astral-sh/uv/issues/2093
- https://github.com/astral-sh/uv/issues/2282
- https://github.com/astral-sh/uv/issues/2383
- https://github.com/astral-sh/uv/issues/2560
## Test Plan
- [x] Reproduction at `charlesnicholson/uv-pep420-bug` passes
- [x] Unit test for editable package
([#1476](https://github.com/astral-sh/uv/issues/1476))
- [x] Unit test for previously installed package with empty registry
- [x] Unit test for local non-editable package
- [x] Unit test for new version available locally but not in registry
([#2093](https://github.com/astral-sh/uv/issues/2093))
- ~[ ] Unit test for wheel not available in registry but already
installed locally
([#2282](https://github.com/astral-sh/uv/issues/2282))~ (seems
complicated and not worthwhile)
- [x] Unit test for install from URL dependency then with matching
version ([#2383](https://github.com/astral-sh/uv/issues/2383))
- [x] Unit test for install of new package that depends on installed
package does not change version
([#2560](https://github.com/astral-sh/uv/issues/2560))
- [x] Unit test that `pip compile` does _not_ consider installed
packages
The snapshot filtering situation has gotten way out of hand, with each
test hand-rolling its own filters on top of cruft copied from previous
tests.
I've attempted to address this holistically:
- `TestContext.filters()` has everything you should need (see the
example after this list)
- This was introduced a while ago, but needed a few more filters for it
to be generalized everywhere
- Using `INSTA_FILTERS` is **not recommended** unless you do not want
the context filters
- It is okay to extend these filters for things unrelated to paths
- If you have to write a custom path filter, please highlight it in
review so we can address it in the common module
- `TestContext.site_packages()` gives cross-platform access to the
site-packages directory
- Do not manually construct the path to site-packages from the venv
- Do not turn off tests on Windows because you manually constructed a
Unix path to site-packages
- `TestContext.workspace_root` gives access to uv's repository directory
- Use this for installing from `scripts/packages/`
- If you need coverage for relative paths, copy the test package into
the `temp_dir`; don't change the working directory of the test fixture
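For illustration, applying filters to a snapshot might look like this minimal `insta` example (uv's own macros wrap this differently):
```rust
#[test]
fn example_snapshot() {
    // Requires insta with the `filters` feature; the regex is illustrative.
    insta::with_settings!({
        filters => vec![(r"\d+\.\d+\.\d+", "[VERSION]")],
    }, {
        insta::assert_snapshot!("uv 1.2.3"); // stored as "uv [VERSION]"
    });
}
```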
There is additional work that can be done here, such as:
- Auditing and removing additional uses of `INSTA_FILTERS`
- Updating manual construction of `Command` instances to use a utility
- The `venv` tests are particularly frightening in their lack of a test
context and could use some love
- Improving the developer experience, i.e., applying context filters to
snapshots by default
## Summary
Hatch allows for highly dynamic customization of metadata via hooks. In
such cases, Hatch can't uphold the PEP 517 contract, in that the
metadata Hatch would return from `prepare_metadata_for_build_wheel`
isn't guaranteed to match that of the built wheel.
Hatch disables `prepare_metadata_for_build_wheel` entirely for pip.
We'll instead disable it on our end when metadata is defined as
"dynamic" in the `pyproject.toml`, which should allow us to leverage the
hook in _most_ cases while still avoiding incorrect metadata for the
remaining cases.
Closes: https://github.com/astral-sh/uv/issues/2130.
We put a `.gitignore` with `*` at the top of our cache. When maturin was
building a source distribution inside the cache, it would walk up the
tree to find a gitignore, see it, and ignore all Python files. We now
add an (empty) `.git` directory one directory below, in the root of the
built-wheels cache. This prevents `ignore` from walking further up (it
marks the top level as a git repository).
Deptry (from #2490) is a mid-sized Rust package with additional Python
packages, so instead of using it in the test I've replaced it with a
small (44KB total) reproducer that uses cffi for faster building; the
entire test takes <2s on my machine.
Fixes #2490
## Summary
For example: `cargo run pip install .`
The strategy taken here is to attempt to extract the package name from
the distribution without executing the PEP 517 build steps. We could
choose to do that in the future if this proves lacking, but it adds
complexity.
Part of: https://github.com/astral-sh/uv/issues/313.
Add a single job for fast lint tools: Rustfmt for Rust, Ruff for Python
formatting and linting, and Prettier, which avoids inconsistent
formatter changes between PyCharm and VS Code.
Installing and importing numpy exercises two cases:
* The python architecture and the package architecture don't match
(https://github.com/astral-sh/uv/issues/2326)
* The libc of python and that of the package don't match on linux
(musllinux vs manylinux, picking a compatible manylinux version)
All pylint deps are py3-none-any, so they don't catch those cases.
## Summary
If a package uses Hatch's `root.uri` feature, we currently error:
```toml
dependencies = [
"black @ {root:uri}/../black_editable"
]
```
Even though we're using PEP 517 hooks to get the metadata, which
_should_ support this. The problem is that we load the full
`PyProjectToml`, which means we parse the requirements, which means we
reject what looks like a relative URL in dependencies.
Instead, we should only enforce a limited subset of `pyproject.toml`
(arguably none).
Closes https://github.com/astral-sh/uv/issues/2475.
## Summary
This PR adds limited support for PEP 440-compatible local version
testing. Our behavior is _not_ comprehensively in-line with the spec.
However, it does fix by _far_ the biggest practical limitation, and
resolves all the issues that've been raised on uv related to local
versions without introducing much complexity into the resolver, so it
feels like a good tradeoff for me.
I'll summarize the change here, but for more context, see [Andrew's
write-up](https://github.com/astral-sh/uv/issues/1855#issuecomment-1967024866)
in the linked issue.
Local version identifiers are really tricky because of asymmetry.
`==1.2.3` should allow `1.2.3+foo`, but `==1.2.3+foo` should not allow
`1.2.3`. It's very hard to map them to PubGrub, because PubGrub doesn't
think of things in terms of individual specifiers (unlike the PEP 440
spec) -- it only thinks in terms of ranges.
Right now, resolving PyTorch and friends fails, because...
- The user provides requirements like `torch==2.0.0+cu118` and
`torchvision==0.15.1+cu118`.
- We then match those exact versions.
- We then look at the requirements of `torchvision==0.15.1+cu118`, which
includes `torch==2.0.0`.
- Under PEP 440, this is fine, because `torch @ 2.0.0+cu118` should be
compatible with `torch==2.0.0`.
- In our model, though, it's not, because these are different versions.
If we change our comparison logic in various places to allow this, we
risk breaking some fundamental assumptions of PubGrub around version
continuity.
- Thus, we fail to resolve, because we can't accept both `torch @ 2.0.0`
and `torch @ 2.0.0+cu118`.
As compared to the solutions we explored in
https://github.com/astral-sh/uv/issues/1855#issuecomment-1967024866, at
a high level, this approach differs in that we lie about the
_dependencies_ of packages that rely on our local-version-using package,
rather than lying about the versions that exist, or the version we're
returning, etc.
In short:
- When users specify local versions upfront, we keep track of them. So,
above, we'd take note of `torch` and `torchvision`.
- When we convert the dependencies of a package to PubGrub ranges, we
check if the requirement matches `torch` or `torchvision`. If it's
an `==`, we check if it matches (in the above example)
`torch==2.0.0`. If so, we _change_ the requirement to
`torch==2.0.0+cu118`. (If it's `==` some other version, we return an
incompatibility.)
In other words, we selectively override the declared dependencies by
making them _more specific_ if a compatible local version was specified
upfront.
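A string-based sketch of that override (uv's real implementation operates on its own requirement and version types):
```rust
use std::collections::HashMap;

/// Narrow an `==` requirement to a user-specified local version when the
/// release segments match; simplified from the behavior described above.
fn apply_local_override(
    name: &str,
    pinned: &str,
    locals: &HashMap<&str, &str>, // e.g., {"torch": "2.0.0+cu118"}
) -> String {
    match locals.get(name) {
        // The pin matches the local version's release segment: make the
        // requirement more specific.
        Some(local) if local.split('+').next() == Some(pinned) => {
            format!("{name}=={local}")
        }
        // Otherwise, leave the requirement as-is (a mismatched `==` would
        // instead be reported as an incompatibility).
        _ => format!("{name}=={pinned}"),
    }
}
```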
The net effect here is that the motivating PyTorch resolutions all work.
And, in general, transitive local versions work as expected.
The thing that still _doesn't_ work is: imagine if there were _only_
local versions of `torch` available. Like, `torch @ 2.0.0` didn't exist,
but `torch @ 2.0.0+cpu` did, and `torch @ 2.0.0+gpu` did, and so on.
`pip install torch==2.0.0` would arbitrarily choose one of `2.0.0+cpu`
or `2.0.0+gpu`, and that's correct as per PEP 440 (local version
segments should be completely ignored on `torch==2.0.0`). However, uv
would fail to identify a compatible version. I'd _probably_ prefer to
fix this, although candidly I think our behavior is _ok_ in practice,
and it's never been reported as an issue.
Closes https://github.com/astral-sh/uv/issues/1855.
Closes https://github.com/astral-sh/uv/issues/2080.
Closes https://github.com/astral-sh/uv/issues/2328.
## Summary
This PR attempts to use a similar trick to that we added in
https://github.com/astral-sh/uv/pull/1878, but for post-releases.
In https://github.com/astral-sh/uv/pull/1878, we added a fake "minimum"
version to enable us to treat `< 1.0.0` as _excluding_ pre-releases of
1.0.0.
Today, on `main`, we accept post-releases and local versions in `>
1.0.0`. But per PEP 440, that should _exclude_ post-releases and local
versions, unless the specifier is itself a post-release, in which case
post-releases are allowed (e.g., `> 1.0.0.post0` should allow
`1.0.0.post1`).
To support this, we add a fake "maximum" version that's greater than all
the post and local releases for a given version. This leverages our last
remaining free bit in the compact representation.
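Conceptually, the two sentinels bound each release from both sides. A toy ordering (uv actually packs this into its compact version representation):
```rust
/// Illustrative only: the derived `Ord` places `Min` below everything and
/// `Max` above all post and local releases of a given release.
#[derive(PartialEq, Eq, PartialOrd, Ord)]
enum VersionSlot {
    Min,        // below all pre-releases (the #1878 trick)
    Pre(u32),   // e.g., 1.0.0a1
    Release,    // e.g., 1.0.0
    Local(u32), // e.g., 1.0.0+foo (locals sort above the bare release)
    Post(u32),  // e.g., 1.0.0.post1
    Max,        // above all post and local releases
}
```
With this, `< 1.0.0` translates to "below `Min`", excluding pre-releases, and `> 1.0.0` to "above `Max`", excluding post and local releases.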
This update pulls in https://github.com/pubgrub-rs/pubgrub/pull/177,
optimizing common range operations in pubgrub. Please refer to this PR
for a more extensive description and discussion of the changes.
The changes optimize that last remaining pathological case,
`bio_embeddings[all]` on python 3.12, which has to try 100k versions,
from 12s to 3s in the cached case. It should also enable smarter
prefetching in batches (https://github.com/astral-sh/uv/issues/170),
even though a naive attempt didn't show better network usage.
**before** 12s

**after** 3s

```
$ taskset -c 0 hyperfine --warmup 1 "../uv/target/profiling/main-uv pip compile ../uv/scripts/requirements/bio_embeddings.in" "../uv/target/profiling/branch-uv pip compile ../uv/scripts/requirements/bio_embeddings.in"
Benchmark 1: ../uv/target/profiling/main-uv pip compile ../uv/scripts/requirements/bio_embeddings.in
Time (mean ± σ): 12.321 s ± 0.064 s [User: 12.014 s, System: 0.300 s]
Range (min … max): 12.224 s … 12.406 s 10 runs
Benchmark 2: ../uv/target/profiling/branch-uv pip compile ../uv/scripts/requirements/bio_embeddings.in
Time (mean ± σ): 3.109 s ± 0.004 s [User: 2.782 s, System: 0.321 s]
Range (min … max): 3.103 s … 3.116 s 10 runs
Summary
../uv/target/profiling/branch-uv pip compile ../uv/scripts/requirements/bio_embeddings.in ran
3.96 ± 0.02 times faster than ../uv/target/profiling/main-uv pip compile ../uv/scripts/requirements/bio_embeddings.in
```
It also adds `bio_embeddings[all]` as a requirements test case.
- Now that `packse` is being published to PyPI we can install it from
there.
- Tweaks the tooling around scenario updates to manage a temporary
virtual environment for you.
- Makes use of a new index URL
- Includes local version segment scenarios (supersedes
https://github.com/astral-sh/uv/pull/2022)
## Summary
This is essentially a wrapper around something like `--python $(which
python3)`, but gives users a portable and streamlined way to solve the
common pain point of using `uv` in GitHub Actions or a Docker container.
See: https://github.com/astral-sh/uv/issues/1526.
* Document good first issues
* Document the `scripts` directory, as far as it's useful for contributors
* Remove the compare-with-pip script; we don't need it anymore
I think this closes #817.
---------
Co-authored-by: Jo <10510431+j178@users.noreply.github.com>
## Summary
When you invoke `python -c`, an empty string is prepended to `sys.path`,
which allows loading modules in the current directory
(https://docs.python.org/3/using/cmdline.html#cmdoption-P). However, in
PEP 517 builds, the current directory should _not_ be part of the path.
There's a flag we can use to disable this behavior (`-P`), but it's only
available in Python 3.11 and later, so instead, I'm doing something
similar to pip's `__main__.py`, which avoids this for `python -m pip`
invocations.
Closes https://github.com/astral-sh/uv/issues/1972.
## Summary
Even when pre-releases are "allowed", per PEP 440, `pydantic<2.0.0`
should _not_ include pre-releases. This PR modifies the specifier
translation to treat `pydantic<2.0.0` as `pydantic<2.0.0.min0`, where
`min` is an internal-only version segment that's invisible to users.
Closes https://github.com/astral-sh/uv/issues/1641.
Previously, only glibc builds were tracked in the bootstrap script. A
new field `libc` tracks whether `gnu` or `musl` is used on Linux, while
it is `"none"` everywhere else. I've confirmed that the updated script
works on Ubuntu and Alpine.
Add a `UV_BOOTSTRAP_DIR` option to configure the python bootstrap
directory. This is helpful when working across multiple platforms in a
single IDE session.
Uses `--find-links` to discover vendored scenario build dependencies and
allows us to use `--index-url` instead of `--extra-index-url` to avoid
hitting the real PyPI in scenario tests.
## Summary
This fixes https://github.com/astral-sh/uv/issues/1704 by removing the
version from the produced header.
## Test Plan
Checked with clippy, and tests are updated too.
## Summary
If an editable package declares a direct URL requirement, we currently
error since it's not considered an "allowed" requirement. We need to add
those URLs to the allow-list.
Closes https://github.com/astral-sh/uv/issues/1603.
## Summary
If you're developing on a package like `attrs` locally, and it has a
recursive extra like `attrs[dev]`, it turns out that we then try to find
the `attrs` in `attrs[dev]` from the registry, rather than recognizing
that it's part of the editable.
This PR fixes the issue by making editables slightly more first-class
throughout the resolver. Instead of mocking metadata, we explicitly
check for extras in various places. Part of the problem here is that we
treated editables as URL dependencies, but when we saw an _extra_ like
`attrs[dev]`, we didn't map that back to the URL. So now, we treat them
as registry dependencies, but with the appropriate guardrails
throughout.
Closes https://github.com/astral-sh/uv/issues/1447.
## Test Plan
- Cloned `attrs`.
- Ran `cargo run venv && cargo run pip install -e ".[dev]" -v`.
There was not much benefit to avoiding the new download (and it was
broken in some Windows compatibility work), and this ensures there are
_only_ the versions we specified.
First, replace all usages in files in-place. I used my editor for this.
If someone wants to add a one-liner that'd be fun.
Then, update directory and file names:
```
# Run twice for nested directories
find . -type d -print0 | xargs -0 rename s/puffin/uv/g
find . -type d -print0 | xargs -0 rename s/puffin/uv/g
# Update files
find . -type f -print0 | xargs -0 rename s/puffin/uv/g
```
Then add all the files again
```
# Add all the files again
git add crates
git add python/uv
# This one needs a force-add
git add -f crates/uv-trampoline
```
Instead of dropping versions without a compatible distribution, we track
them as incompatibilities in the solver. This implementation follows
patterns established in https://github.com/astral-sh/puffin/pull/1290.
This required some significant refactoring of how we track incompatible
distributions. Notably:
- `Option<TagPriority>` is now `WheelCompatibility`, which allows us to
track the reason a wheel is incompatible instead of just `None` (see the
sketch after this list)
- `Candidate` now has a `CandidateDist` with `Compatible` and
`Incompatible` variants instead of just `ResolvableDist`; candidates
are not strictly compatible anymore
- `ResolvableDist` was renamed to `CompatibleDist`
- `IncompatibleWheel` was given an ordering implementation so we can
track the "most compatible" (but still incompatible) wheel. This allows
us to collapse the reason a version cannot be used to a single
incompatibility.
- The filtering in the `VersionMap` is retained, we still only store one
incompatible wheel per version. This is sufficient for error reporting.
- A `TagCompatibility` type was added for tracking which part of a wheel
tag is incompatible
- `Candidate::validate_python` moved to
`PythonRequirement::validate_dist`
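A simplified sketch of the reshaped types (variant names beyond those mentioned above are illustrative):
```rust
struct TagPriority(u32);
struct CompatibleDist; // stand-in for the renamed ResolvableDist

/// Why a wheel can't be used; illustrative variants.
enum IncompatibleWheel {
    Tag,
    RequiresPython,
}

/// Replaces `Option<TagPriority>`, keeping the reason on the `None` path.
enum WheelCompatibility {
    Compatible(TagPriority),
    Incompatible(IncompatibleWheel),
}

/// Candidates are no longer strictly compatible.
enum CandidateDist {
    Compatible(CompatibleDist),
    Incompatible(IncompatibleWheel),
}
```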
I am doing more refactoring in #1298 — I think a couple passes will be
necessary to clarify the relationships of these types.
Includes improved error message snapshots for multiple incompatible
Python tag types from #1285 — we should add more scenarios for coverage
of behavior when multiple tags with different levels are present.
For cases that are valid but are never going to be available during
tests, we use:
- An arbitrary ABI hash: `MMMMMM` (six base64 characters)
- An unlikely Jython27 Python tag
See https://github.com/zanieb/packse/pull/109
Run `cargo test` on windows in CI, pulling the switch on tier 1 windows
support.
These changes make the bootstrap script virtually required for running
the tests. This gives us consistency between local development and CI,
but it also locks our tests to python-build-standalone and an artificial
`PATH`.
I've deleted the shell bootstrap script in favor of only the python one,
which also runs on windows. I've left the (sym)link creation of the
bootstrap in place, even though it is not used by the tests anymore.
I've reactivated the three tests that would previously stack overflow by
doubling their stack sizes. The stack overflows only happen in debug
mode, so this is neither a user-facing problem nor an actual problem
with our code, and this workaround seems better than optimizing our code
for a case that the (release) compiler can optimize much better.
The handling of patch versions will be fixed in a follow-up PR.
Closes #1160. Closes #1161.
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
There are no binary installers for the latest patch versions of CPython
for Windows, and building them is hard. As an alternative, we download
python-build-standalone CPythons and put them into `<project
root>/bin`. On unix, we can symlink `pythonx.y.z` into this directory
and point `PUFFIN_PYTHON_PATH` to it. On windows, all pythons are called
`python.exe` and they don't like being linked. Instead, we add the path
to each directory containing a `python.exe` to `PUFFIN_PYTHON_PATH`,
similar to the regular `PATH`. The python discovery on windows was
extended to respect `PUFFIN_PYTHON_PATH` where needed.
These changes mean that we don't need to (sym)link pythons anymore and
could drop that part of the script.
435 tests run: 389 passed (21 slow), 46 failed, 1 skipped
Windows doesn't support symlinks, doesn't use a `bin` directory and all
pythons are called `python.exe`.
Note that this is still broken: `.\bin\python3.10.13` is missing its
`.exe` extension, and renaming it to `.\bin\python3.10.13.exe` makes it
complain about not finding python310.dll.
In the scenario tests, we want to make sure we're actually conforming to
the scenario's expectations, so we now have an extra assertion on
whether resolution failed or succeeded as well as that it includes the
given packages.
Closes #1112. Closes #1030.
We need more flexible filters than those `insta` offers, and `insta_cmd`
makes it impossible to plug in programmatic filters. At the same time we
use barely any of `insta_cmd`'s features. We can replace the subset we
need in about 50 loc.
Mostly a mechanical refactor to use the `puffin_snapshot!` and
`TestContext` infrastructure in the add, remove, venv and pip uninstall
tests, in preparation for adding programmatic windows testing filters.
There is only one remaining usage of `assert_cmd_snapshot!` now, in the
`puffin_snapshot!` macro.
Mostly a mechanical refactor to use the `puffin_snapshot!` and
`TestContext` infrastructure in the pip compile and pip install
scenarios, in preparation for adding programmatic windows testing
filters.
I originally used Python 3.10, since 3.10 and 3.11 are by far the most
common (at least for [Ruff](https://pypistats.org/packages/ruff)). But
3.12 should give Python tools the most favorable benchmarks.
## Summary
Overall, similar to Poetry, with some simplifications (e.g., we don't
need to translate to Poetry's dependency syntax), and the need to adjust
how we manage the cache and virtual environment.
Instrument the main function as an anchor span for checking overhead and
update tracing-durations-export to 0.2.0 for differentiating
blocking/non-blocking tasks.
Add a `jupyter.in` requirement since `pip install jupyter` is a common
operation. I tried `jupyterlab` too but there is no difference in
performance (1.00 ± 0.07).
A 1:1 port of the Bash script to Python for use on Windows.
Pulls in some parts of #1068 but is much more minimal. Avoids an additional
dependency on `requests`. Because we require `zstandard` to unzip the
distributions we unfortunately cannot be dependency free and cannot have
`bootstrap.sh` download the Python version needed to run this script
without it doing a non-trivial amount of work.
Retains the Bash script for now so you can bootstrap without Python
available. I may drop it in the future?
This is apparently necessary to permit Python 3.8.12 to run. Namely, it
needs to link to libcrypt.so.1, and without libxcrypt-compat, that
linking step fails.
In https://github.com/astral-sh/puffin/pull/1040 we broke the pip
compile scenarios designed to test failure when a required Python
version is not available — resolution succeeded because all of the
Python versions were available in CI. Following #1105 we have the
ability to isolate tests from Python versions available in the system.
Here, we limit the scenarios to only the Python version in the current
environment, restoring our ability to test the error messages.
With https://github.com/zanieb/packse/pull/95, we will be able to
specify scenarios with access to additional system Python versions. This
will allow us to include test coverage where resolution can succeed by
using a version available elsewhere on the system. See #1111 for this
follow-up.
Replaces https://github.com/astral-sh/puffin/pull/1068 and #1070 which
were more complicated than I wanted.
- Introduces a `.python-versions` file which defines the Python versions
needed for development
- Adds a Bash script at `scripts/bootstrap/install` which installs the
required Python versions from `python-build-standalone` to `./bin`
- Checks in a `versions.json` file with metadata about available
versions on each platform and a `fetch-version` Python script derived
from `rye` for updating the versions
- Updates CI to use these Python builds instead of the `setup-python`
action
- Updates to the latest packse scenarios which require Python 3.8+
instead of 3.7+ since we cannot use 3.7 anymore and includes new test
coverage of patch Python version requests
- Adds a `PUFFIN_PYTHON_PATH` variable to prevent lookup of system
Python versions for isolation during development
Tested on Linux (via CI) and macOS (locally) — presumably it will be a
bit more complicated to do proper Windows support.
## Summary
First batch of changes for windows support. Notable changes:
* Fixes all compile errors and adds Windows-specific paths.
* Working venv creation on windows, both from a base interpreter and
from a venv. This requires querying `stdlib` from the sysconfig paths to
find the launcher.
* Basic url/path conversion handling for windows.
* `if cfg!(...)` instead of `#[cfg()]`. This should make it easier to
keep everything compiling across platforms.
## Outlook
Test summary: 402 tests run: 299 passed (15 slow), 103 failed, 1 skipped
There are various reasons for the remaining test failures:
* Windows-specific colorama and tzdata dependencies that change the
snapshot slightly. This is by far the biggest batch.
* Some url-path handling issues. I fixed some in the PR, some remain.
* Lack of the latest python patch versions for older pythons on my
machine, since there are no builds for windows and we need to register
them in the registry for them to be picked up for `py --list-paths` (CC
@zanieb RE #1070).
* Lack of entrypoint launchers.
* ... likely more
On Windows, `python3.9` and `python3.11` are not in `PATH`. Instead, we
should pass only the python version to `puffin venv -p` in packse
scenarios (#1039).