## Summary
This will make it easier to use the paths returned by `distutils.py` in some cases. No code or behavior changes; just removing some fields we don't need.
## Summary
Closes #1977
This allows us to send uv's version in the `uv-client` User Agent
header.
Here's what the request headers look like to a server now:
```
...
Accept: application/vnd.pypi.simple.v1+json, application/vnd.pypi.simple.v1+html;q=0.2, text/html;q=0.01
User-Agent: uv/0.1.13
...
```
~~I went for a mix of Option 1 and 2 from #1977.~~ Open to alternative
naming as well; I'm not tied too strongly to the names picked.
~~Another possibility for this new crate is that we can use it to
consolidate metadata that exists across crates to ultimately be able to
create linehaul information described in #1958, but I haven't looked
into what those changes might look like.~~
## Test Plan
Added initial tests in the new crate to exercise its public API and
added a new test to uv-client to validate the headers using a 1-time
disposable server.
## Summary
Ensures that local dependencies function similarly to editables, in that
if they're `uv pip install`ed, we invalidate them.
Closes https://github.com/astral-sh/uv/issues/1651.
## Summary
Internal-only refactor to consolidate multiple codepaths we have for
checking whether a cached or installed entry is up-to-date with a local
requirement.
Error for `uv pip compile scripts/requirements/jupyter.in` without
internet:
**Before**
```
error: error sending request for url (https://pypi.org/simple/jupyter/): error trying to connect: dns error: failed to lookup address information: No such host is known. (os error 11001)
Caused by: error trying to connect: dns error: failed to lookup address information: No such host is known. (os error 11001)
Caused by: dns error: failed to lookup address information: No such host is known. (os error 11001)
Caused by: failed to lookup address information: No such host is known. (os error 11001)
```
**After**
```
error: Could not connect, are you offline?
Caused by: error sending request for url (https://pypi.org/simple/django/): error trying to connect: dns error: failed to lookup address information: Temporary failure in name resolution
Caused by: error trying to connect: dns error: failed to lookup address information: Temporary failure in name resolution
Caused by: dns error: failed to lookup address information: Temporary failure in name resolution
Caused by: failed to lookup address information: Temporary failure in name resolution
```
On Linux, it would be "Temporary failure in name resolution" instead of
"No such host is known. (os error 11001)".
The implementation checks for the "dns error" string, as hyper errors are
opaque. The danger is that this breaks with a hyper update. We still get
the complete error trace, since reqwest eagerly inlines errors
(https://github.com/seanmonstar/reqwest/issues/2147).
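For illustration, a minimal sketch of that string check, assuming we walk the `std::error::Error` source chain (the helper name is hypothetical, not uv's actual code):

```rust
use std::error::Error;

// Walk the error chain and look for the "dns error" marker that hyper embeds
// in its Display output; returns true if any error in the chain matches.
fn looks_like_dns_failure(err: &(dyn Error + 'static)) -> bool {
    let mut current: Option<&(dyn Error + 'static)> = Some(err);
    while let Some(err) = current {
        if err.to_string().contains("dns error") {
            return true;
        }
        current = err.source();
    }
    false
}
```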
No test, since I wouldn't know how to simulate this in `cargo test`.
Fixes #1971
## Summary
This PR expands environment variables in `-r` and `-c` paths _within_
requirements files. We already do this for `@` URL references and
others.
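As an illustration, a rough sketch of `${VAR}`-style expansion (the function name is hypothetical; uv's actual parser handles more cases):

```rust
// Replace each `${VAR}` occurrence with the value from the environment,
// leaving unset variables as empty strings.
fn expand_env_vars(path: &str) -> String {
    let mut out = String::with_capacity(path.len());
    let mut rest = path;
    while let Some(start) = rest.find("${") {
        out.push_str(&rest[..start]);
        let after = &rest[start + 2..];
        if let Some(end) = after.find('}') {
            out.push_str(&std::env::var(&after[..end]).unwrap_or_default());
            rest = &after[end + 1..];
        } else {
            // No closing brace: keep the remainder verbatim.
            out.push_str(&rest[start..]);
            rest = "";
        }
    }
    out.push_str(rest);
    out
}
```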
Closes https://github.com/astral-sh/uv/issues/1473.
## Summary
After https://github.com/astral-sh/uv/pull/2121, the only remaining
issue is that calling `canonicalize` on these Pythons returns an error.
Closes https://github.com/astral-sh/uv/issues/2105.
## Test Plan
Uninstalled all python.org Pythons on my Windows machine, then created a
virtualenv. The output and resulting config file:
```
Using Python 3.11.8 interpreter at: C:\Users\crmar\AppData\Local\Microsoft\WindowsApps\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\python.exe
Creating virtualenv at: .venv
Activate with: .venv\Scripts\activate
PS C:\Users\crmar\workspace\puffin> cat .\.venv\pyvenv.cfg
home = C:\Users\crmar\AppData\Local\Microsoft\WindowsApps\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0
implementation = CPython
version_info = 3.11.8
include-system-site-packages = false
uv = 0.1.13
prompt = puffin
```
Prior to this PR, it would fail with a canonicalization error.
Prior to #2121, it would leave a "bad" Python in the config file:
```
Using Python 3.11.8 interpreter at: C:\Users\crmar\AppData\Local\Microsoft\WindowsApps\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\python.exe
Creating virtualenv at: .venv
Activate with: .venv\Scripts\activate
PS C:\Users\crmar\workspace\puffin> cat .\.venv\pyvenv.cfg
home = C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.11_3.11.2288.0_x64__qbz5n2kfra8p0
implementation = CPython
version_info = 3.11.8
include-system-site-packages = false
uv = 0.1.13
prompt = puffin
```
Which, once activated, would fail with:
```
(venv) PS C:\Users\crmar\workspace\puffin> python
No Python at '"C:\Users\crmar\AppData\Local\Programs\Python\Python312\python.exe'
```
## Summary
When I install via the Windows Store, `interpreter.base_prefix` contains
a bunch of resolved information that leads to a broken environment.
Instead, we now use `sys._base_executable` on Windows by default,
falling back to `sys.base_prefix` if it doesn't exist. (There are some
issues with `sys._base_executable` that lead to complexity in
`virtualenv`, but they only affect POSIX.) Admittedly, I don't know when
`sys._base_executable` wouldn't exist. It exists in all the environments
I've tested.
Additionally, we use the system interpreter directly if we're outside of
a virtualenv.
## Summary
Closes #2012
This changes `RequirementsTxtParserError::Parser` to take a line and column
instead of a cursor location, to improve the reporting of parser errors. For
simplicity, a new function computes the line and column from the content and
cursor location when a parser error occurs.
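A minimal sketch of such a helper, assuming byte offsets and 1-based positions (the actual implementation in the crate may differ):

```rust
// Convert a byte offset into a 1-based (line, column) pair.
// Assumes the offset lands on a char boundary.
fn line_and_column(content: &str, offset: usize) -> (usize, usize) {
    let offset = offset.min(content.len());
    let before = &content[..offset];
    let line = before.matches('\n').count() + 1;
    let column = match before.rfind('\n') {
        Some(newline) => offset - newline,
        None => offset + 1,
    };
    (line, column)
}
```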
Given `uv pip compile .\requirements.txt` on the file below:
```
numpy>=1,<2
--borken
tqdm
```
Before:
```
error: Unexpected '-', expected '-c', '-e', '-r' or the start of a requirement in `.\requirements.txt` at position 14
```
After:
```
error: Unexpected '-', expected '-c', '-e', '-r' or the start of a requirement in `.\requirements.txt` at position 2:3
```
Open Question: Do we want to support `line:column` for other types of
errors? I didn't dig into other potential error types where this might
be desired.
## Test Plan
A new test was added to the `requirements-txt` crate with this example.
## Summary
It looks like these have been included since the very first gourgeist
commit, but `virtualenv` and `venv` don't include them, and they only
add complexity AFAICT.
## Summary
I think Camino is nice but it makes it much harder to work in
`uv-virtualenv`, since it's the _only_ crate that uses it. If we want to
use Camino, we should use it everywhere IMO.
## Summary
Right now, we have virtualenv construction encoded in a few different
places. Namely, it happens in both `gourgeist` and
`virtualenv_layout.rs` -- _and_ `interpreter.rs` also encodes some
knowledge about how they work, by way of reconstructing the
`SysconfigPaths`.
Instead, `gourgeist` now returns the complete layout, enumerating all
the directories it created. So, rather than returning a root directory,
and re-creating all those paths in `uv-interpreter`, we pass the data
directly back to it.
## Summary
Adds support for `--system-site-packages`. Unlike `pip`, we won't take
the system site packages into account in subsequent commands. I think
this is ok.
Closes https://github.com/astral-sh/uv/issues/1483.
Update #1918 to handle https://github.com/pypa/pip/pull/12536, where pip
removed its Python minor-version entrypoint. The pip test is semi-functional,
since it builds pip from source instead of using a wheel with the wrong
entrypoint; we'll have to update it when this pip version has a release.
Closes #1593.
## Summary
This is based on Pradyun's installer branch
(d01624e5f2/src/installer/scripts.py (L54)),
which is itself based on pip
(0ad4c94be7/src/pip/_vendor/distlib/scripts.py (L136)).
The gist of it is: on POSIX platforms, if a path contains a space (or is
too long), we wrap the shebang in a `/bin/sh` invocation.
Closes https://github.com/astral-sh/uv/issues/2076.
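A rough sketch of the logic described above, following pip/installer rather than uv's exact code (the constant and function names are illustrative):

```rust
fn shebang(executable: &str) -> String {
    // Many kernels limit the shebang line length, so very long paths also need wrapping.
    const MAX_SHEBANG_LENGTH: usize = 127;
    let simple = format!("#!{executable}");
    if !executable.contains(' ') && simple.len() <= MAX_SHEBANG_LENGTH {
        return simple;
    }
    // Wrap in /bin/sh: the quoted interpreter path may contain spaces, and the
    // `'''` trick keeps the file a valid Python script at the same time.
    format!("#!/bin/sh\n'''exec' '{executable}' \"$0\" \"$@\"\n' '''")
}
```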
## Test Plan
```
❯ cargo run venv "foo"
Finished dev [unoptimized + debuginfo] target(s) in 0.14s
Running `target/debug/uv venv foo`
Using Python 3.12.0 interpreter at: /Users/crmarsh/.local/share/rtx/installs/python/3.12.0/bin/python3
Creating virtualenv at: foo
Activate with: source foo/bin/activate
❯ source "foo bar/bin/activate"
❯ which black
black not found
❯ cargo run pip install black
Resolved 6 packages in 177ms
Installed 6 packages in 17ms
+ black==24.2.0
+ click==8.1.7
+ mypy-extensions==1.0.0
+ packaging==23.2
+ pathspec==0.12.1
+ platformdirs==4.2.0
❯ which black
/Users/crmarsh/workspace/uv/foo bar/bin/black
❯ black
Usage: black [OPTIONS] SRC ...
One of 'SRC' or 'code' is required.
❯ cat "foo bar/bin/black"
#!/bin/sh
'''exec' '/Users/crmarsh/workspace/uv/foo bar/bin/python' "$0" "$@"
' '''
# -*- coding: utf-8 -*-
import re
import sys
from black import patched_main
if __name__ == "__main__":
sys.argv[0] = re.sub(r"(-script\.pyw|\.exe)?$", "", sys.argv[0])
sys.exit(patched_main())
```
I'm not at all sure whether this is a correct fix or not, but it does
seem to make `pypy` work in at least some cases with `uv`. Previously,
I couldn't get it to work at all. Namely, the virtualenv was created
with a `lib/python3.10/site-packages` directory, but whenever I did a `uv
pip install` in that virtualenv, it was looking for a non-existent
`lib/pypy3.10/site-packages` directory.
With this PR, the workflow reported as not working in #1488 now works
for me:
```
$ pypy3 --version
Python 3.10.13 (fc59e61cfbff, Jan 17 2024, 05:35:45)
[PyPy 7.3.15 with GCC 13.2.1 20230801]
$ uv venv --python $(which pypy3) --seed
Using Python 3.10.13 interpreter at: /usr/bin/pypy3
Creating virtualenv at: .venv
+ pip==24.0
+ setuptools==69.1.1
+ wheel==0.42.0
Activate with: source .venv/bin/activate
$ uv pip install 'alembic==1.0.11'
Resolved 9 packages in 8ms
Installed 9 packages in 14ms
+ alembic==1.0.11
+ greenlet==3.0.3
+ mako==1.3.2
+ markupsafe==2.1.5
+ python-dateutil==2.8.2
+ python-editor==1.0.4
+ six==1.16.0
+ sqlalchemy==2.0.27
+ typing-extensions==4.10.0
```
Whereas previously (on current `main`), I was hitting this error:
```
$ uv venv --python $(which pypy3) --seed
Using Python 3.10.13 interpreter at: /usr/bin/pypy3
Creating virtualenv at: .venv
+ pip==24.0
+ setuptools==69.1.1
+ wheel==0.42.0
Activate with: source .venv/bin/activate
$ uv pip install 'alembic==1.0.11'
error: Failed to list installed packages
Caused by: failed to read directory `/home/andrew/astral/issues/uv/i1488/.venv/lib/pypy3.10/site-packages`
Caused by: No such file or directory (os error 2)
```
Notice though that neither outcome above matches the error reported in #1488,
so this is likely not a complete fix. There are perhaps other lurking
issues.
Ref #1488
## Summary
This was a missed find-and-replace. We shouldn't assume `layout.platlib`
here, since `RECORD` will be written to `site_packages` (which could be
`layout.purelib`).
This is hard to reproduce. You need a _fresh_ environment where
`purelib` and `platlib` differ (which isn't the case for virtualenvs, at
least typically), and you need to be installing a new package that is a
purelib. I tested it by manually changing `platlib` to point to a
different path.
Closes https://github.com/astral-sh/uv/issues/2064.
Previously, `uv` would always prioritize the index given by
`--index-url`. It would then try any indexes after that given by zero
or more `--extra-index-url` flags. This differed from `pip` in that any
priority was given at all, where `pip` doesn't guarantee any priority
ordering of indexes.
We could go in the direction of mimicking `pip`'s behavior here, but it
at present has issues with dependency confusion attacks where packages
may get installed from indexes you don't control. More specifically,
there is an issue of different trust levels. See discussion in #171 and
[PEP-0708] for more on the security impact.
In contrast, `uv` will only select versions for a package from a single
index. That is, even if `foo` is in indexes `a` and `b`, it will
only consider the versions from the index that it checks first. This
probably helps with respect to dependency confusion attacks, but also
means that `uv` doesn't quite cover all of the same use cases as `pip`.
In this PR, we retain the notion of prioritizing indexes, but
tweak it so that PyPI is preferred last as opposed to first. Or
more precisely, the `--index-url` flag specifies a fallback index,
not the primary index, and is deprioritized beneath every index
specified by `--extra-index-url`. The ordering among indexes given by
`--extra-index-url` remains the same: earlier indexes are prioritized
over later indexes.
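For illustration, a hedged sketch of the resulting search order (names are hypothetical, not uv's internals):

```rust
// Extra indexes keep their given order; the `--index-url` index is tried last.
fn index_order<'a>(index_url: &'a str, extra_index_urls: &'a [String]) -> Vec<&'a str> {
    extra_index_urls
        .iter()
        .map(String::as_str)
        .chain(std::iter::once(index_url))
        .collect()
}
```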
While this tweak likely won't hit all use cases, I believe it will
resolve some of the most common pain points without exacerbating
dependency confusion problems.
Ref #171, Fixes #1377, Fixes #1451, Fixes #1600
[PEP-0708]: https://peps.python.org/pep-0708/
## Summary
Expose the uv_normalize types from pep508, so that these can be used
with a crates.io version.
---------
Co-authored-by: konsti <konstin@mailbox.org>
`uv --system` is failing in GitHub Actions, because `py --list-paths`
returns all the pre-cached Pythons:
```
-V:3.12 * C:\hostedtoolcache\windows\Python\3.12.2\x64\python.exe
-V:3.12-32 C:\hostedtoolcache\windows\Python\3.12.2\x86\python.exe
-V:3.11 C:\hostedtoolcache\windows\Python\3.11.8\x64\python.exe
-V:3.11-32 C:\hostedtoolcache\windows\Python\3.11.8\x86\python.exe
-V:3.10 C:\hostedtoolcache\windows\Python\3.10.11\x64\python.exe
-V:3.10-32 C:\hostedtoolcache\windows\Python\3.10.11\x86\python.exe
-V:3.9 C:\hostedtoolcache\windows\Python\3.9.13\x64\python.exe
-V:3.9-32 C:\hostedtoolcache\windows\Python\3.9.13\x86\python.exe
-V:3.8 C:\hostedtoolcache\windows\Python\3.8.10\x64\python.exe
-V:3.8-32 C:\hostedtoolcache\windows\Python\3.8.10\x86\python.exe
-V:3.7 C:\hostedtoolcache\windows\Python\3.7.9\x64\python.exe
-V:3.7-32 C:\hostedtoolcache\windows\Python\3.7.9\x86\python.exe
```
So, our default selector returns the first entry here. But none of these
are actually in `PATH` except the one that the user installed via
`actions/setup-python@v5` -- that's the point of the action, that it
puts the correct versions in `PATH`.
It seems to me like we should prioritize `PATH` over `py --list-paths`.
Is there a good reason not to do this?
Closes: https://github.com/astral-sh/uv/issues/2056
## Summary
With this PR I've added the option to pass environment variables to the wheel
building process, through the `BuildDispatch`. When integrating uv with
our project pixi (https://github.com/prefix-dev/pixi/pull/863), we ran
into this missing capability. I've made a rough version here; it could
maybe use some refinement.
### Why do we need this?
Pixi allows the user to use a conda-activated prefix for wheel
building, which comes with a number of environment variables, like `PATH`
but also `CONDA_PREFIX`, amongst others. This allows the user to use
system dependencies from conda-forge during an sdist build.
Because we use `uv` as a library, we need to pass in the options
programmatically. Additionally, in general there is nothing holding a
Python sdist back from actually depending on an environment variable;
see e.g. the test package: https://pypi.org/project/env-test-package/
### What about `ConfigSettings`
I think `ConfigSettings` does not suffice because, e.g., CMake could
function differently when `CONDA_PREFIX` is set. Also, we do not
know whether the user-supplied backend actually supports these settings.
### Path handling
Because the user can now also supply a PATH in the environment map, the
logic I had was the following, I format the path so that it has the
following precedence
1. venv scripts dir.
2. user supplied path.
3. system path.
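A rough sketch of that precedence with hypothetical names (the real code lives in the build dispatch and differs in details):

```rust
use std::env;
use std::ffi::OsString;
use std::path::PathBuf;

// Build a PATH with the venv scripts dir first, then any user-supplied entries,
// then the ambient system PATH.
fn build_path(venv_scripts: PathBuf, user_path: Option<&str>) -> OsString {
    let mut entries = vec![venv_scripts];
    if let Some(user_path) = user_path {
        entries.extend(env::split_paths(user_path));
    }
    if let Some(system_path) = env::var_os("PATH") {
        entries.extend(env::split_paths(&system_path));
    }
    env::join_paths(entries).expect("PATH entries should not contain separators")
}
```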
### Improvements
There is some path modification and copying happening every time we use
the `run_python_script` function. I think we could improve this, but I
would like some pointers on where to best put a split-out and cached
version; we might also want to use some types to split these things up.
### Finally
I did not add any of these options to the uv executables; I first would
like to know whether this is a direction we want to go in. I'm happy to
do this or make any changes that you feel would benefit this project.
Also tagging @wolfv to keep track of this as well.
## Test Plan
---------
Co-authored-by: konsti <konstin@mailbox.org>
## Summary
We shouldn't be resolving symlinks on the provided interpreter;
otherwise we break `pyenv`, since running `cargo run pip install mypy
--python .venv/bin/python` will immediately resolve to (e.g.)
`/Users/crmarsh/.pyenv/versions/3.10.2/bin/python3.10`, and pyenv relies
on the path to do its lookups.
Instead, the canonicalizing happens when we query the interpreter
metadata.
Closes https://github.com/astral-sh/uv/issues/2068.
## Test Plan
Ran `cargo run pip install mypy --python .venv/bin/python -v -n` with a
virtualenv created using a pyenv Python; verified that Mypy was
installed into the virtual environment, rather than into the global
environment.
## Summary
If a pre-release marker is present on a requirement in a constraint
file, we should allow pre-releases for that package.
Closes https://github.com/astral-sh/uv/issues/2063.
## Summary
Temporarily disabling `install_git_private_https_pat_and_username` since
running this test can break your local Git authentication for other
projects. I experienced this today and keep finding myself needing to
ignore it locally.
See: https://github.com/astral-sh/uv/issues/1980.
## Summary
Internal refactor to `PrioritizedDistribution` that I think should
reduce its size? Although the motivation here is simplicity, not perf.
Instead of storing:
```rust
/// The highest-priority, installable wheel for the package version.
compatible_wheel: Option<(DistMetadata, TagPriority)>,
/// The most-relevant, incompatible wheel for the package version.
incompatible_wheel: Option<(DistMetadata, IncompatibleWheel)>,
```
We now store:
```rust
wheel: Option<(DistMetadata, WheelCompatibility)>,
```
Where `WheelCompatibility` is an enum of `TagPriority` or
`IncompatibleWheel`.
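Sketched out with placeholder types (the real definitions live in uv and may differ):

```rust
// Placeholders standing in for uv's real `TagPriority` and `IncompatibleWheel`.
struct TagPriority(u32);
struct IncompatibleWheel;

// The consolidated field stores either the tag priority of a compatible wheel
// or the reason the most-relevant wheel is incompatible.
enum WheelCompatibility {
    Compatible(TagPriority),
    Incompatible(IncompatibleWheel),
}
```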
## Summary
This is essentially a wrapper around something like `--python $(which
python3)`, but gives users a portable and streamlined way to solve the
common pain point of using `uv` in GitHub Actions or a Docker container.
See: https://github.com/astral-sh/uv/issues/1526.
## Summary
This also preserves the environment variables in the output file, e.g.:
```
Resolved 1 package in 216ms
# This file was autogenerated by uv via the following command:
# uv pip compile requirements.in --emit-index-url
--index-url https://test.pypi.org/${SUFFIX}
requests==2.5.4.1
```
I'm torn on whether that's correct or undesirable here.
Closes #2035.
## Summary
`PythonPlatform` only exists to format paths to directories within
virtual environments based on a root and an OS, so it's now
`VirtualenvLayout`.
`Virtualenv` is now used for non-virtual environment Pythons, so it's
now `PythonEnvironment`.
## Summary
Now that we have the ability to introspect the installed packages for
arbitrary Pythons, we can allow `pip freeze` and `pip list` to fall back
to the "default" Python, if no virtualenv is present.
Closes https://github.com/astral-sh/uv/issues/2005.
## Summary
This PR aligns the `uv pip install --python` flag with the `uv venv
--python` flag, such that the former now accepts binary names and Python
versions by way of using the same `find_requested_python` method under
the hood.
## Summary
This PR adds a `--python` flag that allows users to provide a specific
Python interpreter into which `uv` should install packages. This would
replace the `VIRTUAL_ENV=` workaround that folks have been using to
install into arbitrary, system environments, while _also_ actually being
correct for installing into non-virtual environments, where the bin and
site-packages paths can differ.
The approach taken here is to use `sysconfig.get_paths()` to get the
correct paths from the interpreter, and then use those for determining
the `bin` and `site-packages` directories, rather than constructing them
based on hard-coded expectations for each platform.
Closes https://github.com/astral-sh/uv/issues/1396.
Closes https://github.com/astral-sh/uv/issues/1779.
Closes https://github.com/astral-sh/uv/issues/1988.
## Test Plan
- Verified that, on my Windows machine, I was able to install `requests`
into a global environment with: `cargo run pip install requests --python
'C:\\Users\\crmarsh\\AppData\\Local\\Programs\\Python\\Python3.12\\python.exe'`,
then `python` and `import requests`.
- Verified that, on macOS, I was able to install `requests` into a
global environment installed via Homebrew with: `cargo run pip install
requests --python $(which python3.8)`.
In this context, we already know (as the comment says) that `self` does
not have a local segment, so we don't need to strip it.
This change isn't motivated by anything other than bringing the code and
comment into sync. For example, when I first looked at it, I wondered
whether the extra stripping was somehow necessary. But it isn't.
## Summary
When a `pyproject.toml` is provided directly to `uv pip compile`, we
were failing to resolve recursive extras. The solution I settled on here
is to flatten them recursively when determining the requirements
upfront.
Closes https://github.com/astral-sh/uv/issues/1987.
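As an illustration, a hedged sketch of the recursive flattening (data shapes and names here are hypothetical, not uv's internals):

```rust
use std::collections::{BTreeMap, BTreeSet};

// Given a map from extra name to the extras it directly includes, collect the
// transitive closure of a requested extra, guarding against cycles.
fn flatten_extra(
    extras: &BTreeMap<String, Vec<String>>,
    requested: &str,
    seen: &mut BTreeSet<String>,
) {
    if !seen.insert(requested.to_string()) {
        return;
    }
    if let Some(children) = extras.get(requested) {
        for child in children {
            flatten_extra(extras, child, seen);
        }
    }
}
```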
## Test Plan
`cargo test`
## Summary
For a venv created by `virtualenv`, the `pyvenv.cfg` file specifies the
full version string in the `version_info` field:
```
home = /Users/x/.rye/py/cpython@3.12.1/install/bin
implementation = CPython
version_info = 3.12.1.final.0
virtualenv = 20.25.0
include-system-site-packages = false
base-prefix = /Users/x/.rye/py/cpython@3.12.1/install
base-exec-prefix = /Users/x/.rye/py/cpython@3.12.1/install
base-executable = /Users/x/.rye/py/cpython@3.12.1/install/bin/python3
```
The relevant code can be found here:
4ca8a20c17/src/virtualenv/create/creator.py (L167)
This PR changes the `pyvenv.cfg` created by uv for better compatibility with
`virtualenv`.
## Test Plan
```sh
uv venv
cat .venv/pyvenv.cfg
```
Thank you for writing `uv`! We're already using it internally on some
container image builds and finding that it's noticeably faster 💯
## Summary
I was attempting to use `uv` alongside [modal](https://modal.com/)'s
internal PyPI mirror and ran into some issues. The first issue was the
following error:
```
error: Failed to download: nltk==3.8.1
Caused by: content-length header is missing from response
```
This error was coming from within
`RegistryClient::wheel_metadata_no_pep658`. By logging requests on the
client (uv) and server (internal mirror) sides I've concluded that it's
occurring because `uv` is sending a header suggesting that it can accept
a gzip'd response, but decompressing the gzip'd response strips the
`content-length` header:
https://github.com/seanmonstar/reqwest/issues/294.
**Logged request, client-side:**
```
0.981664s 0ms INFO uv_client::registry_client JONO, REQ: Request { method: HEAD, url: Url { scheme: "http", cannot_be_a_base: false, username: "", password: None, host: Some(Ipv4(172.21.0.1)), port: Some(5555), path: "/simple/joblib/joblib-1.3.2-py3-none-any.whl", query: None, fragment: None }, headers: {} }
```
No headers set explicitly by `uv`.
**Logged request, server-side:**
```
2024-02-26T03:45:08.598272Z DEBUG pypi_mirror: origin request = Request { method: HEAD, uri: /simple/joblib/joblib-1.3.2-py3-none-any.whl, version: HTTP/1.1, headers: {"accept": "*/*", "user-agent": "uv", "accept-encoding": "gzip, br", "host": "172.21.0.1:5555"}, body: Body(Empty) }
```
Server receives `"accept-encoding": "gzip, br",`.
My change adding the header to the request fixed this issue. But our
internal mirror is just passing through PyPI responses and PyPI
responses do contain PEP 658 data, and so `wheel_metadata_no_pep658`
shouldn't execute.
The issue there is that the PyPI response field has _dashes_, not
_underscores_ (https://peps.python.org/pep-0691/).
<img width="1261" alt="image"
src="https://github.com/astral-sh/uv/assets/12058921/35230f27-441a-457a-827b-870a1a16c16a">
After changing the `alias` the PEP 658 codepath now runs correctly :)
## Test Plan
I tested by installing against both our mirror and against PyPI:
```
RUST_LOG="uv=trace" UV_NO_CACHE=true UV_INDEX_URL="http://172.21.0.1:5555/simple" target/release/uv pip install -v nltk
RUST_LOG="uv=trace" UV_NO_CACHE=true UV_INDEX_URL="http://localhost:5555/simple" target/release/uv pip uninstall -v nltk
```
```
target/release/uv pip install -v nltk
target/release/uv pip uninstall -v nltk
```
## Summary
When you invoke `python -c`, an empty string is prepended to `sys.path`,
which allows loading modules in the current directory
(https://docs.python.org/3/using/cmdline.html#cmdoption-P). However, in
PEP 517 builds, the current directory should _not_ be part of the path.
There's a flag we can use to disable this behavior (`-P`), but it's only
available in Python 3.11 and later, so instead, I'm doing something
similar to pip's `__main__.py`, which avoids this for `python -m pip`
invocations.
Closes https://github.com/astral-sh/uv/issues/1972.
## Summary
Closes #1922
When a timeout occurs, it hints to the user to configure the
`UV_HTTP_TIMEOUT` env var.
Before
```
error: Failed to download distributions
Caused by: Failed to fetch wheel: torch==2.2.0
Caused by: Failed to extract source distribution
Caused by: request or response body error: operation timed out
Caused by: operation timed out
```
After
```
error: Failed to download distributions
Caused by: Failed to fetch wheel: torch==2.2.0
Caused by: Failed to extract source distribution
Caused by: Failed to download distribution due to network timeout. Try increasing UV_HTTP_TIMEOUT.
```
## Test Plan
Wasn't sure if we'd want a test. If we do, is there an existing mechanism
or preferred approach to force a timeout to occur in tests? Maybe set
the timeout to 1 and add torch as an install check (although it's
possible that could become flaky)?
Hi, love your work on `uv` 👋!
Opening a draft PR early to check whether there are any existing Rust
table-formatting libs that I am unaware of (either already in `uv`/`ruff`, or
in the Rust ecosystem) before spending much time on reinventing the wheel
myself and cleaning it up. Any other pointers are also welcome (e.g. on
the editable filtering).
Editable project locations in `uv pip list` include the file scheme
(`file://`), whereas they are omitted in `pip list`. Is this desired, or
should it replicate pip?
## Summary
Implementation for #1401.
The `--editable` flag is implemented.
`--outdated` and `--uptodate` are out of scope for this PR (they require
latest-version information, and the wheel/sdist type).
## Test Plan
Not yet implemented, as I couldn't locate the tests for `uv pip freeze`.
We could compare to `pip` in
`scripts/compare_with_pip/compare_with_pip.py`?
Address a few pedantic lints.
The lints are separated into separate commits so they can be reviewed
individually.
I've not added enforcement for any of these lints, but that could be
added if desirable.
## Summary
When the user provides an output file, avoid writing the `pip compile`
output to `stdout` when `-q` is specified. (We still write to `stdout`
if no output file is provided, since otherwise, the resolution won't be
printed _anywhere_.)
## Summary
Instead of looking at _either_ `pyproject.toml` or `setup.py`, we should
just be conservative and take the most-recent timestamp out of
`pyproject.toml`, `setup.py`, and `setup.cfg`. That will help prevent
staleness issues like those described in
https://github.com/astral-sh/uv/issues/1913#issuecomment-1961544084.
## Summary
Even when pre-releases are "allowed", per PEP 440, `pydantic<2.0.0`
should _not_ include pre-releases. This PR modifies the specifier
translation to treat `pydantic<2.0.0` as `pydantic<2.0.0.min0`, where
`min` is an internal-only version segment that's invisible to users.
Closes https://github.com/astral-sh/uv/issues/1641.
## Summary
We were applying every constraint to every dependency. This is
"harmless" in practice, since this is just an optimization, but we thus
had false negatives nearly every time, which could lead to wasted work.
## Summary
If a `pyproject.toml` or similar is changed within an editable, we
should avoid passing our audit check (and thus re-install the package).
Closes https://github.com/astral-sh/uv/issues/1913.
## Summary
For context, we have three extraction paths:
- untar (async) - used for any `.tar.gz`, local or remote.
- unzip (async) - used to unzip remote wheels, or local or remote source
distributions.
- unzip (sync) - used to untar locally-available wheels into the cache.
We use three different crates for these:
- [`tokio-tar`](https://github.com/vorot93/tokio-tar)
- [`async-zip`](https://github.com/Majored/rs-async-zip)
- [`zip-rs`](https://github.com/zip-rs/zip)
These all seem to have different support for symlinks:
- `tokio-tar` tries to create a symlink (which works fine on Unix but
errors on Windows, since we typically don't have elevated permissions).
- `async-zip` _seems_ to write the target contents directly to the file
(which is what we want).
- `zip-rs` _apparently_ writes the _name_ of the target to the file
(which isn't what we want).
Thankfully, symlinks are not allowed in wheels
(https://github.com/pypa/pip/issues/5919,
https://discuss.python.org/t/symbolic-links-in-wheels/1945), so we can
ignore `zip-rs`.
For `tokio-tar`, we now _skip_ (and warn) if we see a symlink on
Windows. We could do what pip does, and recursively copy, but it's
difficult because we don't have `Seek` on the file. (Alternatively, we
could use hard links and junctions, though those also might need to
exist already.) Let's see how far this gets us.
(We also no longer attempt to set permissions on symlinks on Unix, which
caused another failure.)
Closes https://github.com/astral-sh/uv/issues/1858.
## Summary
We're printing the `Display` representation of `InstalledDist`, which
isn't guaranteed to be (and in fact isn't) a valid PEP 508 requirement,
making it impossible to use the `freeze` output as an input to an
install.
Closes https://github.com/astral-sh/uv/issues/1931.
## Summary
Like `platform.python_implementation`, we should support the
`python_implementation` "alias" for `platform_python_implementation`.
Closes https://github.com/astral-sh/uv/issues/1906.
## Test Plan
```shell
❯ cargo run pip install "pynacl==1.4.0"
Finished dev [unoptimized + debuginfo] target(s) in 7.02s
Running `target/debug/uv pip install pynacl==1.4.0`
Resolved 4 packages in 9ms
Built pynacl==1.4.0 Downloaded 1 package in 31.51s
Installed 4 packages in 3ms
+ cffi==1.16.0
+ pycparser==2.21
+ pynacl==1.4.0
+ six==1.16.0
```
Similar to https://github.com/astral-sh/ruff/pull/8034.
Adds more version information, so it's clear what revision the user is on:
```
❯ cargo run -q -- --version
uv 0.1.10 (daa8565a7 2024-02-23)
❯ cargo run -q -- -V
uv 0.1.10
❯ cargo run -q -- version
uv 0.1.10 (daa8565a7 2024-02-23)
❯ cargo run -q -- version --output-format json
{
  "version": "0.1.10",
  "commit_info": {
    "short_commit_hash": "daa8565a7",
    "commit_hash": "daa8565a75249305821fdc34ace085060c082ba3",
    "commit_date": "2024-02-23",
    "last_tag": null,
    "commits_since_last_tag": 0
  }
}
```
## Summary
When a virtual environment contains multiple packages with the same
name, we no longer throw a hard error. Instead:
- In `uv pip freeze`, we list all versions.
- In `uv pip uninstall`, we uninstall all versions.
- In `uv pip install`, we uninstall all versions prior to installing a
new version.
Closes https://github.com/astral-sh/uv/issues/1848.
## Summary
In uv, we're going to use `--no-emit-package` for this, to convey that
the package will be included in the resolution but not in the output
file. It also mirrors flags like `--emit-index-url`.
We're also including an `--unsafe-package` alias.
Closes https://github.com/astral-sh/uv/issues/1415.
Users expect pip to have `pip`, `pip3` and `pip3.x` entrypoints. But pip
is a universal wheel, so it contains the `pip3.x` entrypoint for the Python
version it was built on. To fix this, pip special-cases itself when installing
(3898741e29/src/pip/_internal/operations/install/wheel.py (L283)),
replacing the wheel entrypoint with one for the current version. We now
do the same.
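A minimal sketch of the renaming, with hypothetical names (the real logic also rewrites the script contents):

```rust
// If the wheel ships a versioned `pip3.x` entrypoint, rewrite it to match the
// interpreter we're installing into; other entrypoints pass through unchanged.
fn fix_pip_entrypoint(name: &str, python_minor: u8) -> String {
    if name.starts_with("pip3.") {
        format!("pip3.{python_minor}")
    } else {
        name.to_string()
    }
}
```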
Fixes #1593
Read the key for uv from `pyvenv.cfg` as `uv` instead of `gourgeist`. I missed
that we're also reading `pyvenv.cfg` when reviewing #1852.
We could keep checking for `gourgeist` for backwards compatibility, but I think
it's fine this way.
## Summary
On Windows `10.0.19045` the `py` command prints to `stderr` even when
working correctly. This means that uv should not treat this as a
failure.
Fixes https://github.com/astral-sh/uv/issues/1904
## Test Plan
I ran the modified code and it worked. I expect the pull request to run
automated tests.
## Summary
Fixes https://github.com/astral-sh/uv/issues/1693
`uv` currently fails when a user has Python 2 or older installed on
their system without a `python3` or `python3.exe` on their path, because
the `get_interpreter_info.py` script fails to execute (it uses some
Python 3+ APIs).
This PR fixes this by:
* Returning an explicit error code in `get_interpreter_info` if the
Python version isn't supported.
* Skipping over this error in `python_query` if the user requested any
Python version or a version >= 3.
* Erroring if the user requested a Python 2 version.
## Test Plan
Error if the user requests a legacy Python version:
```
uv venv -p 2
× Python 2 or older is not supported. Please use Python 3 or newer.
```
Ignore any Python 2 installation when querying newer Python
installations (using v4 here because I have python3 on the path and that
takes precedence over querying python):
```
uv_interpreter::python_query::find_python selector=Major(4)
0.005541s 0ms DEBUG uv_interpreter::interpreter Detecting markers for: /home/micha/.pyenv/shims/python
0.059730s 54ms DEBUG uv_interpreter::python_query Found a Python 2 installation that isn't supported by uv, skipping.
0.059983s 54ms DEBUG uv_interpreter::interpreter Using cached markers for: /usr/bin/python
× No Python 4 In `PATH`. Is Python 4 installed?
```
## Summary
This matches `pip-compile` and is, I think, intuitive. If you want to
suppress output, you can always pipe it away.
Closes https://github.com/astral-sh/uv/issues/1895.
## Summary
Hey guys! The motivation is described in #1834.
## Test Plan
Changed snapshots of the existing tests. `--index-url` and
`--extra-index-url` occur pretty often, so no extra testing is required,
IMO.
I'm confused that we have this separate specification of `reqwest`? I'm
not sure this has any effect, but it seems like it should be done for
correctness.
Follows #1512
Closes https://github.com/astral-sh/uv/issues/1709
Closes https://github.com/astral-sh/uv/issues/1371
Tested with the reproduction provided in #1709 which gets past the HTTP
401.
Reuses the same copying logic we introduced in
https://github.com/astral-sh/uv/pull/1874 to ensure authentication is
attached to file URLs with a realm that matches that of the index. I had
to move the authentication logic into a new crate so it could be used in
`distribution-types`.
We will want to do something more robust in the future, like tracking all
realms with authentication in a central store and performing lookups there.
That's what `pip` does, and it allows consolidation of logic like netrc
lookups. That refactor feels significant though, and I'd like to get
this fixed ASAP, so this is a minimal fix.
A couple moons ago, I introduced an optimization for version comparisons
by devising a format where *most* versions would be represented by a
single `u64`. This in turn meant most comparisons (of which many are
done during resolution) would be extremely cheap.
Unfortunately, when I did that, I screwed up the preservation of
ordering as defined by the [Version Specifiers spec]. I think I messed
it up because I had originally devised the representation so that we
could pack things like `1.2.3.dev1.post5`, but later realized it would
be better to limit ourselves to a single suffix. However, I never
updated the binary encoding to better match "up to 4 release versions
and up to precisely 1 suffix." Because of that, there were cases where
versions weren't ordered correctly. For example, this fixes a bug where
`1.0a2 < 1.0dev2`, even though all dev releases should order before
pre-releases.
We also update a test so that it catches these kinds of bugs in the
future. (By testing all pairs of versions in a sequence instead of just
the adjacent versions.)
[Version Specifiers spec]:
https://packaging.python.org/en/latest/specifications/version-specifiers/#summary-of-permitted-suffixes-and-relative-ordering
## Summary
This modifies `gourgeist` to allow passing additional key-value pairs to add
to the `pyvenv.cfg` file, as proposed in #1697.
I made it allow an arbitrary set of pairs (to decouple from `uv`, since
this is mainly a change to `gourgeist`), but I can slim it down to just
allow a name and version string if that's desired.
The `pyvenv.cfg` will also have a `uv = <uv-crate-version>` when a venv
is created via `uv venv` ~~and `uv-build = <uv-build-crate-version>`
when it's created via `SourceBuild::setup`~~.
Example below via `uv venv`:
```ini
home = ...
implementation = CPython
version_info = 3.12
include-system-site-packages = false
base-prefix = ...
base-exec-prefix = ...
base-executable = ...
uv = 0.1.6
prompt = uv
```
Open to any suggestions, thanks!
Closes #1697
## Test Plan
Added new test in `tests/venv.rs` called `verify_pyvenv_cfg` to verify
that it contains the right uv version string. I didn't see tests
configured in `gourgeist` itself, so I didn't add any there.
We don't have test coverage for this, but a term can reference an
incompatibility with root and then we'll display the internal 'root'
package to the user.
Raised in https://github.com/astral-sh/uv/issues/1855
Closes https://github.com/astral-sh/uv/issues/1860
In https://github.com/astral-sh/uv/pull/1816, we started using the URL
attached to a response instead of the request URL for subsequent
requests — this fixes various bugs but has the side-effect of dropping
credentials from the URL. Here, we transfer credentials from the request
URL to the response URL. We perform RFC compliant checks for safety.
## Summary
To integrate `uv` into `pixi` I need to specify a custom
`ResolverProvider` to be able to specify that some packages are already
installed by conda and should not be touched. However, some of the types
required to implement your own `ResolverProvider` were not accessible
through the public API. This PR basically adds them.
## Test Plan
I didn't add an explicit test for this.
## Summary
Fixes #1444.
In situations where the installer fails to perform a reflink, a regular
file copy is also attempted, as a fallback. This circumvents issues with
linking files across filesystems or volumes.
## Test Plan
N/A
## Summary
This revives a PR from long ago
(https://github.com/astral-sh/uv/pull/383 and
https://github.com/zanieb/pubgrub/pull/24) that modifies how we deal
with dependencies that are declared multiple times within a single
package.
To quote from the originating PR:
> Uses an experimental pubgrub branch (#370) that allows us to handle
multiple version ranges for a single dependency to the solver which
results in better error messages because the derivation tree contains
all of the relevant versions. Previously, the version ranges were merged
(by us) in the resolver before handing them to pubgrub since only one
range could be provided per package. Since we don't merge the versions
anymore, we no longer give the solver an empty range for conflicting
requirements; instead the solver comes to that conclusion from the
provided versions. You can see the improved error message for direct
dependencies in [this
snapshot](https://github.com/astral-sh/puffin/pull/383/files#diff-a0437f2c20cde5e2f15199a3bf81a102b92580063268417847ec9c793a115bd0).
The main issue with that PR was around its handling of URL dependencies,
so this PR _also_ refactors how we handle those. Previously, we stored
URL dependencies on `PubGrubPackage`, but they were omitted from the
hash and equality implementations of `PubGrubPackage`. This led to some
really careful codepaths wherein we had to ensure that we always visited
URLs before non-URL packages, so that the URL-inclusive versions were
included in any hashmaps, etc. I considered preserving this approach,
but it would require us to rely on lots of internal details of PubGrub
(since we'd now be relying on PubGrub to merge those packages in the
"right" order).
So, instead, we now _always_ set the URL on a given package, whenever
that package was _given_ a URL upfront. I think this is easier to reason
about: if the user provided a URL for `flask`, then we should just
always add the URL for `flask`. If we see some other URL for `flask`, we
error, like before. If we see some unknown URL for `flask`, we error,
like before.
Closes https://github.com/astral-sh/uv/issues/1522.
Closes https://github.com/astral-sh/uv/issues/1821.
Closes https://github.com/astral-sh/uv/issues/1615.
## Summary
We need to take care to keep wheel tags in "priority order" (e.g., we
should prefer ARM wheels over universal wheels). However... it looks
like we've had a `.sort()` in here all along, that risks throwing off
the ordering?
Closes https://github.com/astral-sh/uv/issues/1840.
## Test Plan
Ensure that `rlax` uses the ARM wheel rather than the universal wheel:
- `cargo run venv`
- `cargo run pip install rlax`
- `import rlax`
Closes https://github.com/astral-sh/uv/issues/1775
Closes https://github.com/astral-sh/uv/issues/1452
Closes https://github.com/astral-sh/uv/issues/1514
Follows https://github.com/astral-sh/uv/pull/1717
libgit2 does not support host names with extra identifiers during SSH
lookup (e.g. [`github.com-some_identifier`](
https://docs.github.com/en/authentication/connecting-to-github-with-ssh/managing-deploy-keys#using-multiple-repositories-on-one-server))
so we use the `git` command instead for fetching. This is required for
`pip` parity.
See the [Cargo
documentation](https://doc.rust-lang.org/nightly/cargo/reference/config.html#netgit-fetch-with-cli)
for more details on using the `git` CLI instead of libgit2. We may want
to try to use libgit2 first in the future, as it is more performant
(#1786).
We now support authentication with:
```
git+ssh://git@<hostname>/...
git+ssh://git@<hostname>-<identifier>/...
```
Tested with a deploy key e.g.
```
cargo run -- \
pip install uv-private-pypackage@git+ssh://git@github.com-test-uv-private-pypackage/astral-test/uv-private-pypackage.git \
--reinstall --no-cache -v
```
and
```
cargo run -- \
pip install uv-private-pypackage@git+ssh://git@github.com/astral-test/uv-private-pypackage.git \
--reinstall --no-cache -v
```
with a ssh config like
```
Host github.com
Hostname github.com
IdentityFile=/Users/mz/.ssh/id_ed25519
Host github.com-test-uv-private-pypackage
Hostname github.com
IdentityFile=/Users/mz/.ssh/id_ed25519
```
It seems quite hard to add test coverage for this to the test suite, as
we'd need to add the SSH key, and I don't know how to isolate that from
affecting other developers' machines.
Previously, we were only checking /bin/sh. While that works in most
cases, it seems like there are still scenarios where /bin/sh isn't an
executable itself, and is instead just a shell script that calls
/bin/dash. (See #1810 for example.)
In this PR, we make the `ld` detection a bit more robust by trying
multiple paths. As with previous changes, we emit copious logs to help
debug this in the future.
It's not totally clear how to test this. I'm not sure how to reproduce
the environment mentioned in #1810 specifically, since it seems like an
internal variant of WSL Ubuntu.
Fixes #1810
## Summary
We still need to wait for the distribution metadata (for direct
dependencies), even when resolving with `--no-deps`, since we rely on it
to report diagnostics to the user.
Closes https://github.com/astral-sh/uv/issues/1801.
## Summary
We currently maintain separate untar methods for sync and async, but we
only use the sync version when the user provides a local source
distribution. (Otherwise, we untar as we download the distribution.) In
my testing, this is actually slower anyway:
```
❯ python -m scripts.bench \
--uv-path ./target/release/main \
--uv-path ./target/release/uv \
./requirements.in --benchmark resolve-cold --min-runs 50
Benchmark 1: ./target/release/main (resolve-cold)
Time (mean ± σ): 835.2 ms ± 107.4 ms [User: 346.0 ms, System: 151.3 ms]
Range (min … max): 639.2 ms … 1051.0 ms 50 runs
Benchmark 2: ./target/release/uv (resolve-cold)
Time (mean ± σ): 750.7 ms ± 91.9 ms [User: 345.7 ms, System: 149.4 ms]
Range (min … max): 637.9 ms … 905.7 ms 50 runs
Summary
'./target/release/uv (resolve-cold)' ran
1.11 ± 0.20 times faster than './target/release/main (resolve-cold)'
```
## Summary
Allows the corresponding `pypi_types` struct to use any URL, since other
installers can put those into the environment, and Poetry seems to write
invalid URLs.
If we see a distribution with an invalid URL, we just treat it as a
registry distribution, which isn't ideal, but is better than (1)
erroring, and (2) changing `Url` to `String` everywhere internally. (I'm
torn on this second option.)
Closes https://github.com/astral-sh/uv/issues/1744.
## Test Plan
- Added `flask = { git = "git@github.com:pallets/flask.git", rev =
"b90a4f1f4a370e92054b9cc9db0efcb864f87ebe" }` to
`scripts/editable-installs/poetry_editable/pyproject.toml`.
- Ran `poetry install`.
- Ran `cargo pip freeze`. Verified that it errored on `main`, but passed
here.
- Ran `cargo run pip install "flask==3.0.0"`. Verified that it
uninstalled the existing Flask, and installed a new version from the
registry.
Add a `UV_BOOTSTRAP_DIR` option to configure the python bootstrap
directory. This is helpful when working across multiple platforms in a
single IDE session.
## Summary
If a registry doesn't support range requests, then today, we download
the entire wheel to disk and then read the metadata from the downloaded
archive. This PR instead modifies the registry client to stream the
zipfile and stop as soon as it's seen the metadata, which should be more
efficient.
Closes https://github.com/astral-sh/uv/issues/1596.
## Test Plan
Made this the _only_ path for downloading metadata; verified that the
test suite passed.
## Summary
Hello there! The motivation for this feature is described in #1678.
## Test Plan
I've added unit tests and also tested this manually on my work project
by comparing it to the original `pip-compile` output - it looks much
like the `pip-compile` generated lock file.
Fixes handling of GitHub PATs in HTTPS URLs, which were otherwise
dropped. We now support the following authentication schemes:
```
git+https://<user>:<token>@<hostname>/...
git+https://<token>@<hostname>/...
```
On Windows, the username is required. We can consider adding a
special-case for this in the future, but this just matches libgit2's
behavior.
I tested with fine-grained tokens, OAuth tokens, and "classic" tokens.
There's test coverage for fine-grained tokens in CI where we use a real
private repository and PAT. Yes, the PAT is committed to make this test
usable by anyone. It has read-only permissions to the single repository,
expires Feb 1 2025, and is in an isolated organization and GitHub
account.
Does not yet address SSH authentication.
Related:
- https://github.com/astral-sh/uv/issues/1514
- https://github.com/astral-sh/uv/issues/1452
## Summary
The generated `pyvenv.cfg` file hardcodes `implementation = CPython`
even for PyPy venvs, created with `uv venv venv --python pypy3.10`, for
example.
```ini
home = /path/to/.pyenv/versions/pypy3.10-7.3.15/bin
implementation = CPython
version_info = 3.10
gourgeist = 0.0.4
include-system-site-packages = false
base-prefix = /path/to/.pyenv/versions/pypy3.10-7.3.15
base-exec-prefix = /path/to/.pyenv/versions/pypy3.10-7.3.15
base-executable = /path/to/.pyenv/versions/pypy3.10-7.3.15/bin/pypy3.10
```
## Test Plan
Manually verified that `pyvenv.cfg` now contains `implementation =
PyPy`. I can try refactoring `create_bare_venv` to make it more easily
testable, though.
## Summary
Some packages encode logic to embed the current commit SHA in the
version tag, when built within a Git repo. This typically results in an
invalid (non-compliant) version. Here's an example from `pylzma`:
ccb0e7cff3/version.py (L45).
This PR adds a phony, empty `.git` to the cache root, to ensure that any
`git` commands fail.
Closes https://github.com/astral-sh/uv/issues/1768.
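A minimal sketch of the trick (the helper name is hypothetical):

```rust
use std::path::Path;

// An empty `.git` *file* is not a valid repository, so `git describe`,
// `git rev-parse`, etc. invoked from within the cache fail fast.
fn create_phony_git(cache_root: &Path) -> std::io::Result<()> {
    std::fs::write(cache_root.join(".git"), "")
}
```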
## Test Plan
- Create a tag on the current commit, like `v0.5.0`.
- Build `pylzma`, using a cache _within_ the repo:
```
rm -rf foo
cargo run venv
cargo run pip install "pylzma @ 10ef072c3c3b9ea77ebe9546499975/pylzma-0.5.0.tar.gz" --verbose --cache-dir bar
```
(This PR message is mostly copied from the comment in the code.)
For local builds of Python, at time of writing, the version numbers end with
a `+`. This makes the version non-PEP-440 compatible, since a `+` indicates
the start of a local segment, which must be non-empty. Thus, `uv` chokes on it
and [spits out an error][1] when trying to create a venv using a "local" build
of Python. Arguably, the right fix for this is for [CPython to use a PEP-440
compatible version number][2].
However, as a work-around for now, [as suggested by pradyunsg][3] as one
possible direction forward, we strip the `+`.
This fix does unfortunately mean that one [cannot specify a Python version
constraint that specifically selects a local version][4]. But at the time of
writing, it seems reasonable to block such functionality on this being fixed
upstream (in some way).
Another alternative would be to treat such invalid versions as strings (which
is what PEP-508 suggests), but this leads to undesirable behavior in this
case. For example, let's say you have a Python constraint of `>=3.9.1` and
a local build of Python with a version `3.11.1+`. Using string comparisons
would mean the constraint wouldn't be satisfied:

    >>> "3.9.1" < "3.11.1+"
    False

So in the end, we just strip the trailing `+`, as was done in the days of old
for [legacy version numbers][5].
I tested this fix by manually confirming that

    uv venv --python local/python

failed before it and succeeded after it.
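A minimal sketch of the work-around (the helper name and example version are illustrative):

```rust
// Strip a single trailing `+` (e.g. `3.13.0a0+` -> `3.13.0a0`) so the version
// can be parsed as PEP 440.
fn strip_trailing_plus(version: &str) -> &str {
    version.strip_suffix('+').unwrap_or(version)
}
```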
Fixes #1357
[1]: https://github.com/astral-sh/uv/issues/1357
[2]: https://github.com/python/cpython/issues/99968
[3]:
https://github.com/pypa/packaging/issues/678#issuecomment-1436033646
[4]: https://github.com/astral-sh/uv/issues/1357#issuecomment-1947645243
[5]:
085ff41692/packaging/version.py (L168-L193)
## Summary
Add `installer` method to `InstalledDist` to distinguish between
different installers. Might be nice to add an enum for all possible
installers, but this might be too hard to keep up to date :).
The `INSTALLER` file is a file that can be added to the `.dist-info`
folder with the installer name.
Closes: #1759
## Test Plan
Not sure if there is a place I can automatically test it; if you have a
pointer, I would be happy to add a test.
PEP 508 requires a space between a URL and the semicolon separating it
from the markers, to disambiguate it from a URL ending with a semicolon.
This is easy to get wrong because the space is not required after a
plain name or PEP 440 specifier: `foo==1.0;python_version < "3.10"` is fine,
but a URL requirement needs a space before the `;`. The new error message
explicitly points out the missing space.
Fixes #1637
## Summary
The `DefaultResolverProvider` struct was not public. This PR exposes it
so we can build our own and use this as a fallback.
## Test Plan
I did not explicitly test this trivial change.
A WARN log was being emitted for a "broken cache entry" in the case
where the cache entry simply doesn't exist. But this is totally fine and
expected. So we detect the kind of error that occurred and emit a TRACE
if the file simply didn't exist.
A file in a zip can set arbitrary unix permissions, but we, like pip,
want to preserve only the executable bit and otherwise use the OS
defaults.
This should be faster for wheels with many files since we now avoid the
blocking fs call to set the permissions in most cases.
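A hedged sketch of the approach (names and the 0o755 default are illustrative; uv's actual code differs):

```rust
#[cfg(unix)]
fn apply_zip_mode(path: &std::path::Path, zip_mode: u32) -> std::io::Result<()> {
    use std::os::unix::fs::PermissionsExt;

    // Only act when the archive marks the file executable; otherwise keep the
    // OS-default permissions and skip the extra fs call entirely.
    if zip_mode & 0o111 != 0 {
        std::fs::set_permissions(path, std::fs::Permissions::from_mode(0o755))?;
    }
    Ok(())
}
```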
Fixes #1740.
## Summary
Don't preserve mtime to work around alexcrichton/tar-rs#349. Same as
#634 except for the streaming unzip.
Fixes #1748.
## Test Plan
Added the tomli source dist as a test case.
## Summary
I am looking to instantiate a `RegistryClient`. However, when using the
`RegistryClientBuilder` a new reqwest client is always constructed. I
would like to pass in a custom `reqwest::Client` to be able to share the
http resources with other parts of my application.
## Test Plan
The uv codebase does not use my addition to the builder and all tests
still succeed. And in my code I can pass a custom Client.
## Summary
Add the environment variable `UV_REQUEST_TIMEOUT` to allow control over
pip timeouts.
Closes #1549
## Test Plan
I built uv with the repository's top-level Dockerfile, set the timeout to 3
seconds, and ran `uv pip install torch`.
I measured the execution time with the `time` command and confirmed that
the process finished at a value close to the timeout we set.
```bash
root@037c69228cdc:~# time UV_REQUEST_TIMEOUT=3 /uv pip install torch
Resolved 22 packages in 25ms
error: Failed to download distributions
Caused by: Failed to fetch wheel: nvidia-cusolver-cu12==11.4.5.107
Caused by: Failed to extract source distribution
Caused by: request or response body error: operation timed out
Caused by: operation timed out
real 0m3.064s
user 0m0.225s
sys 0m0.240s
```
## Summary
This opens up space to add other cache-related commands. (`uv clean`
continues to work for backwards compatibility but is hidden from the
CLI.)
## Summary
We don't control these, so it seems preferable _not_ to fail on them,
but rather, to just ignore them entirely. (I considered adding a long
allow-list, but then questioned the point of it? We'd end up having to
extend it if more invalid extras were published in the future.)
Closes https://github.com/astral-sh/uv/issues/1633.
## Summary
The main change is that we need to have an explicit list of protocols we
_do_ support (like `https`), so that when we see a Windows absolute path
(`C:\...`), we don't treat the `C` as a protocol itself.
Closes https://github.com/astral-sh/uv/issues/1539.
## Summary
When we read `--index-url` from a `requirements.txt`, we attempt to
respect the `--index-url` provided by the CLI if it exists.
Unfortunately, `--index-url` from the CLI has a default value... so we
_never_ respect the `--index-url` in the requirements file.
This PR modifies the CLI to use `None`, and moves the default into logic
in the `IndexLocations` struct.
Closes https://github.com/astral-sh/uv/issues/1692.
Uses `--find-links` to discover vendored scenario build dependencies and
allows us to use `--index-url` instead of `--extra-index-url` to avoid
hitting the real PyPI in scenario tests.
## Summary
This fixes https://github.com/astral-sh/uv/issues/1704 by removing the
version from the produced header.
## Test Plan
Checked with clippy, and tests are updated too.
## Summary
This PR adds the `--prompt` option to `venv` subcommand.
The default behavior for `uv venv` is to create a virtual environment
named `.venv` in the current directory. This is different from `venv` /
`virtualenv` where a user always needs to provide the virtual
environment path. This allows us to define our own behavior in the
default scenario (`uv venv`). We've decided to use the current
directory's name in that case.
Workflows:
| Command | Virtual Environment Name | Prompt |
|--------|--------|--------|
| `uv venv` | `.venv` (default) | Current directory name |
| `uv venv project` | `project` | `project` |
| `uv venv --prompt .` | `.venv` | Current directory name |
| `uv venv --prompt foobar` | `.venv` | `foobar` |
| `uv venv project --prompt foobar` | `project` | `foobar` |
Fixes#1445
## Test Plan
This is my first Rust code and I don't know how to write tests yet.
I just checked the behavior manually:
```
$ cargo build
$ mkdir t
$ cd t
$ ../target/debug/uv venv -p 3.11
$ rg -w t .venv/bin/acti*
.venv/bin/activate.csh
13:setenv VIRTUAL_ENV '/Users/inada-n/work/uv/t/.venv'
20:if ('t' != "") then
21: setenv VIRTUAL_ENV_PROMPT 't'
23: setenv VIRTUAL_ENV_PROMPT "$VIRTUAL_ENV:t:q"
38: # in which case, $prompt is undefined and we wouldn't
.venv/bin/activate
48:VIRTUAL_ENV='/Users/inada-n/work/uv/t/.venv'
59: VIRTUAL_ENV_PROMPT="t"
.venv/bin/activate.fish
61:set -gx VIRTUAL_ENV '/Users/inada-n/work/uv/t/.venv'
73:if test -n 't'
74: set -gx VIRTUAL_ENV_PROMPT 't'
.venv/bin/activate.ps1
40:if ("t" -ne "") {
41: $env:VIRTUAL_ENV_PROMPT = "t"
.venv/bin/activate.nu
6:# but then simply `deactivate` won't work because it is just an alias to hide
35: let virtual_env = '/Users/inada-n/work/uv/t/.venv'
50: let virtual_env_prompt = (if ('t' | is-empty) {
53: 't'
```
---------
Co-authored-by: Dhruv Manilawala <dhruvmanila@gmail.com>
This PR introduces more robust cache healing when `uv` fails to
deserialize an existing cache entry.
("Cache healing" in this context means that if `uv` fails to
deserialize a cache entry, then it will automatically invalidate that
entry and re-generate the data. Typically by sending an HTTP request.)
Prior to some optimizations I made around deserialization, we were
already doing this. After those optimizations, deserializing a cache
policy and the payload were split into two steps. While deserializing
a cache policy retained its cache healing behavior, deserializing the
payload did not. This became an issue when #1556 landed, which changed
one of our `rkyv` data types. This in turn made our internal types
incompatible with existing cache entries. One could work-around this
by clearing `uv`'s cache with `uv clean`, but we should just do it
automatically on a cache entry by entry basis.
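A minimal sketch of the healing shape, with hypothetical signatures rather than uv's actual cache types:
```rust
use std::fmt::Display;

/// Per-entry cache healing: if the cached payload no longer deserializes
/// (e.g. after a schema change), invalidate just that entry and refetch,
/// rather than failing or requiring a full `uv clean`.
fn read_or_refresh<T, E: Display>(
    cached_bytes: Option<Vec<u8>>,
    deserialize: impl Fn(&[u8]) -> Result<T, E>,
    refetch: impl FnOnce() -> T,
) -> T {
    if let Some(bytes) = cached_bytes {
        match deserialize(&bytes) {
            Ok(value) => return value,
            Err(err) => {
                // Stale or corrupt entry: drop it and fall through to
                // re-sending the (cloned) HTTP request.
                eprintln!("invalidating cache entry: {err}");
            }
        }
    }
    refetch()
}

fn main() {
    let value = read_or_refresh(
        Some(vec![0xff]), // not valid UTF-8, so deserialization fails
        |bytes| std::str::from_utf8(bytes).map(str::to_owned),
        || "fresh data from the network".to_string(),
    );
    assert_eq!(value, "fresh data from the network");
}
```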
This does technically introduce a new cost by pessimistically cloning
the HTTP request so that we can re-send it if necessary (see the commit
messages for the knot pushing me toward this approach). So I re-ran my
favorite ad-hoc benchmark:
```
$ hyperfine -w10 --runs 50 "uv-main pip compile --cache-dir ~/astral/tmp/cache-main ~/astral/tmp/reqs/home-assistant-reduced.in -o /dev/null" "uv-test pip compile --cache-dir ~/astral/tmp/cache-test ~/astral/tmp/reqs/home-assistant-reduced.in -o /dev/null" ; A bart
Benchmark 1: uv-main pip compile --cache-dir ~/astral/tmp/cache-main ~/astral/tmp/reqs/home-assistant-reduced.in -o /dev/null
Time (mean ± σ): 114.4 ms ± 3.2 ms [User: 149.4 ms, System: 221.5 ms]
Range (min … max): 106.7 ms … 122.0 ms 50 runs
Benchmark 2: uv-test pip compile --cache-dir ~/astral/tmp/cache-test ~/astral/tmp/reqs/home-assistant-reduced.in -o /dev/null
Time (mean ± σ): 114.0 ms ± 3.0 ms [User: 146.0 ms, System: 223.3 ms]
Range (min … max): 105.3 ms … 121.4 ms 50 runs
Summary
uv-test pip compile --cache-dir ~/astral/tmp/cache-test ~/astral/tmp/reqs/home-assistant-reduced.in -o /dev/null ran
1.00 ± 0.04 times faster than uv-main pip compile --cache-dir ~/astral/tmp/cache-main ~/astral/tmp/reqs/home-assistant-reduced.in -o /dev/null
```
Which is about what I expected.
We should endeavor to have a better testing strategy for these kinds of
bugs, but I think it might be a little tricky to do. I created
https://github.com/astral-sh/uv/issues/1699 to track that.
Fixes#1571
## Summary
Adds cli command / flag (`generate-shell-completion <SHELL>` /
`--generate-shell-completion <SHELL>`) to generate the completion script
for the given shell. Implemented in exactly the same way as it is done
in ruff
(https://github.com/astral-sh/ruff/blob/main/crates/ruff/src/lib.rs#L197)
Closes https://github.com/astral-sh/uv/issues/1654
## Test Plan
I've manually tested the generated script, but only for the bash shell
on Ubuntu 22.04.3:
```bash
$ uv --generate-shell-completion bash > /usr/share/bash-completion/completions/uv
$ uv # <TAB>
-q -h --verbose --no-cache --version clean
-v -V --no-color --cache-dir pip generate-shell-completion
-n --quiet --color --help venv help
$ uv pip # <TAB>
-q -n -V --verbose --color --cache-dir --version sync uninstall help
-v -h --quiet --no-color --no-cache --help compile install freeze
```
Resolves#1292.
## Summary
Move the yanked warnings for `uv pip sync` and `uv pip install` to the
end of the commands, as per #1292.
## Test Plan
I ran the unit tests: `cargo nextest run`
## Summary
Just as we mark virtualenvs as `gitignore`d by default, we should also
mark them with a `CACHEDIR.TAG` file, to ensure that they aren't included
in backups, etc.
Closes https://github.com/astral-sh/uv/issues/1648.
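For reference, the tag is just a small text file with a fixed signature line; a minimal sketch (not the actual implementation) of writing it into a fresh environment:
```rust
use std::fs;
use std::io;
use std::path::Path;

/// Write a `CACHEDIR.TAG` into the new environment. Backup and sync tools
/// that honor the Cache Directory Tagging Specification skip any directory
/// containing a file with this signature.
fn write_cachedir_tag(venv: &Path) -> io::Result<()> {
    fs::write(
        venv.join("CACHEDIR.TAG"),
        "Signature: 8a477f597d28d172789f06886806bc55\n\
         # This file is a cache directory tag automatically created by uv.\n",
    )
}

fn main() -> io::Result<()> {
    // Assumes a `.venv` directory already exists in the working directory.
    write_cachedir_tag(Path::new(".venv"))
}
```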
## Test Plan
Ran `cargo run venv` and:
```
❯ ls .venv
CACHEDIR.TAG bin lib pyvenv.cfg
```
## Summary
Added `uv` to the list of preserved packages when building the installer
plan. With this change, `uv` is no longer removed when, for example,
running `python -m uv pip sync requirements.txt` in a venv where `uv` is
installed but `requirements.txt` does not contain it.
Closes#1631
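A minimal sketch of the preserve rule, using plain strings in place of the real installed-distribution types:
```rust
/// Compute removals, but never remove anything on the preserve list, even if
/// it's absent from the requirements.
fn to_remove<'a>(installed: &'a [&'a str], required: &[&str]) -> Vec<&'a str> {
    let preserved = ["uv"];
    installed
        .iter()
        .copied()
        .filter(|name| {
            !required.iter().any(|r| r == name) && !preserved.iter().any(|p| p == name)
        })
        .collect()
}

fn main() {
    let installed = ["uv", "ruff"];
    // `uv` is not in the requirements, but it is never scheduled for removal.
    assert_eq!(to_remove(&installed, &["ruff"]), Vec::<&str>::new());
}
```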
## Test Plan
Went through the example attached to
https://github.com/astral-sh/uv/issues/1631 and did not see the uv
deletion in the output:
```
$ python -m uv pip sync requirements.txt
Installed 1 package in 20ms
+ ruff==0.2.2
```
## Summary
This PR adds the `activation.bat`, `deactivation.bat`, and `pyenv.bat`
files to support using uv from CMD.
This PR further fixes an issue with our trampoline implementation where
calling an executable like `black` failed:
```
(venv) C:\Users\Micha\astral\test>where black
C:\Users\Micha\astral\test\.venv\Scripts\black.exe
(venv) C:\Users\Micha\astral\test>black
C:\Users\Micha\AppData\Local\Programs\Python\Python312\python.exe: can't open file 'C:\\Users\\Micha\\astral\\test\\black': [Errno 2] No such file or directory
```
The issue was that CMD doesn't extend `black` to its full path before
passing it to the trampoline and our trampoline generated the command
`<python> black` instead of `<python> .venv/Scripts/black`, and Python
can't find `black` in the project directory.
This PR fixes this by using the full executable name (that we already
parsed out to discover the Python version). This adds one complication:
we need to preserve the arguments without repeating the executable name,
which is the first argument.
One option is to use
[`CommandLineToArgvW`](https://learn.microsoft.com/de-de/windows/win32/api/shellapi/nf-shellapi-commandlinetoargvw)
and then serialize the arguments 1.. to a string again. I decided
against that. Win32 API calls are easy to get wrong. That's why I
implemented the parsing rules specified in
[`CommandLineToArgvW`](https://learn.microsoft.com/de-de/windows/win32/api/shellapi/nf-shellapi-commandlinetoargvw)
to skip the first argument.
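A minimal sketch of that first-argument skipping (illustrative only, using `&str` for simplicity rather than the raw Windows command line):
```rust
/// Skip the program name at the start of a raw Windows command line,
/// following the rules documented for `CommandLineToArgvW`: a quoted name
/// ends at the next quote, an unquoted one ends at the first space or tab.
fn skip_program_name(command_line: &str) -> &str {
    let is_sep = |c: char| c == ' ' || c == '\t';
    let rest = if let Some(stripped) = command_line.strip_prefix('"') {
        // Quoted program name: everything up to (and including) the closing quote.
        match stripped.find('"') {
            Some(end) => &stripped[end + 1..],
            None => "",
        }
    } else {
        // Unquoted program name: everything up to the first space or tab.
        match command_line.find(is_sep) {
            Some(end) => &command_line[end..],
            None => "",
        }
    };
    // Drop the whitespace separating the program name from its arguments.
    rest.trim_start_matches(is_sep)
}

fn main() {
    assert_eq!(
        skip_program_name(r#""C:\venv\Scripts\black.exe" --check ."#),
        "--check ."
    );
    assert_eq!(skip_program_name("black --check ."), "--check .");
}
```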
Fixes https://github.com/astral-sh/uv/issues/1471
## Test Plan
https://github.com/astral-sh/uv/assets/1203881/bdb537b6-97c8-4f7e-bb4a-3a614eb5e0f6
Powershell continues to work
https://github.com/astral-sh/uv/assets/1203881/6c806477-a7c6-4047-9ffc-5ed91c6f1c84
I haven't been able to test the aarch binaries.
## Summary
If an editable package declares a direct URL requirement, we currently
error since it's not considered an "allowed" requirement. We need to add
those URLs to the allow-list.
Closes https://github.com/astral-sh/uv/issues/1603.
## Summary
It's incorrect to pass the resolution and dependency mode down to the
`BuildDispatch`, since it means that we'll use `--no-deps` when building
source distributions. If you set resolution to `lowest`, it also means
we end up using (e.g.) the lowest version of `wheel`, which also doesn't
make sense.
It's fine to pass `--exclude-newer`.
Closes https://github.com/astral-sh/uv/issues/1355.
Closes https://github.com/astral-sh/uv/issues/1563.
This PR fixes the bug where the `BIN_NAME` replacement field wasn't
being used in the activator scripts.
fixes: #1518
## Test Plan
As I don't have a Windows machine, I switched the `bin_name` value here
to point to `Scripts` on the `unix` platform:
2a76c59084/crates/gourgeist/src/bare.rs (L99-L105)
<details><summary>Code diff</summary>
<p>
```diff
diff --git a/crates/gourgeist/src/bare.rs b/crates/gourgeist/src/bare.rs
index 4c7808d3..0e0b41cf 100644
--- a/crates/gourgeist/src/bare.rs
+++ b/crates/gourgeist/src/bare.rs
@@ -97,9 +97,9 @@ pub fn create_bare_venv(location: &Utf8Path, interpreter: &Interpreter) -> io::R
// TODO(konstin): I bet on windows we'll have to strip the prefix again
let location = location.canonicalize_utf8()?;
let bin_name = if cfg!(unix) {
- "bin"
- } else if cfg!(windows) {
"Scripts"
+ } else if cfg!(windows) {
+ "bin"
} else {
unimplemented!("Only Windows and Unix are supported")
};
```
</p>
</details>
I then created the virtual environment as usual and tested out that the path modifications were correct:
```console
$ cargo run --bin uv -- venv
Finished dev [unoptimized + debuginfo] target(s) in 0.13s
Running `target/debug/uv venv`
Using Python 3.12.1 interpreter at
/Users/dhruv/.pyenv/versions/3.12.1/bin/python3.12
Creating virtualenv at: .venv
$ source .venv/Scripts/activate
$ echo $PATH
/Users/dhruv/work/astral/uv/.venv/Scripts:[...]
$ which python
/Users/dhruv/work/astral/uv/.venv/Scripts/python
```
I'm not sure how else to test this without having access to a Windows machine.
## Summary
#1562
It turns out that `hexdump` uses an invalid source distribution format
whereby the contents aren't nested in a top-level directory -- instead,
they're all just flattened at the top-level. In looking at pip's source
(51de88ca64/src/pip/_internal/utils/unpacking.py (L62)),
it only strips the top-level directory if all entries have the same
directory prefix (i.e., if it's the only thing in the directory). This
PR accommodates these "invalid" distributions.
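A minimal sketch of that check (illustrative, not the actual extraction code):
```rust
use std::ffi::OsStr;
use std::path::{Component, Path, PathBuf};

/// Mirror pip's rule: strip a leading directory only when every entry in the
/// archive shares the same single top-level component; otherwise the archive
/// is "flat" and nothing is stripped.
fn common_top_level(paths: &[PathBuf]) -> Option<&Path> {
    let mut prefix: Option<&OsStr> = None;
    for path in paths {
        let Some(Component::Normal(first)) = path.components().next() else {
            return None;
        };
        match prefix {
            None => prefix = Some(first),
            Some(existing) if existing == first => {}
            // Entries disagree on the leading component: flat layout.
            Some(_) => return None,
        }
    }
    prefix.map(|p| Path::new(p))
}

fn main() {
    let nested = [
        PathBuf::from("pkg-1.0/setup.py"),
        PathBuf::from("pkg-1.0/pkg/__init__.py"),
    ];
    assert_eq!(common_top_level(&nested), Some(Path::new("pkg-1.0")));

    // hexdump-style flat archive: no shared directory, so nothing is stripped.
    let flat = [PathBuf::from("setup.py"), PathBuf::from("hexdump.py")];
    assert_eq!(common_top_level(&flat), None);
}
```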
I can't find any history on this method in `pip`. It looks like it dates
back over 15 years ago, to before `pip` was even called `pip`.
Closes https://github.com/astral-sh/uv/issues/1376.
## Summary
This was just a missing line -- we have `dependencies.remove(&package);`
in the ~identical branch above, but it must've been an oversight to omit
it here.
Closes https://github.com/astral-sh/uv/issues/1467.
## Test Plan
`cargo test`
## Summary
It turns out that it's not uncommon to end up with repeated packages in
requirements files when running `pip-sync`, e.g., you might have
`anyio==4.0.0` specified multiple times. This PR relaxes our assertions
in the install plan to allow such repeated packages, as long as the
requirement markers are exactly the same (i.e., they are truly
duplicates).
Closes https://github.com/astral-sh/uv/issues/1552.
## Summary
If you're developing on a package like `attrs` locally, and it has a
recursive extra like `attrs[dev]`, it turns out that we then try to find
the `attrs` in `attrs[dev]` from the registry, rather than recognizing
that it's part of the editable.
This PR fixes the issue by making editables slightly more first-class
throughout the resolver. Instead of mocking metadata, we explicitly
check for extras in various places. Part of the problem here is that we
treated editables as URL dependencies, but when we saw an _extra_ like
`attrs[dev]`, we didn't map that back to the URL. So now, we treat them
as registry dependencies, but with the appropriate guardrails
throughout.
Closes https://github.com/astral-sh/uv/issues/1447.
## Test Plan
- Cloned `attrs`.
- Ran `cargo run venv && cargo run pip install -e ".[dev]" -v`.
## Summary
This _could_ fix https://github.com/astral-sh/uv/issues/1454, but I'm
not sure. I was able to replicate by forcing a bunch of error states.
But, in short, if we fail to hardlink on the initial copy due to a file
existing, and then fail _again_, we fall back to copying. But if we copy,
then the tempfile doesn't exist, and so the `fs_err::rename(&tempfile,
&out_path)?;` will fail with "File not found".
This PR just ensures that the cases are explicitly mutually exclusive:
we only attempt to rename if the hardlink succeeded.
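A minimal sketch of the mutually exclusive shape (the real code also handles the "already exists" retry; this only shows the invariant that the rename happens only after a successful link):
```rust
use std::io;
use std::path::Path;

/// The temp file is only renamed into place when the hard link actually
/// succeeded; on failure we copy straight to the destination and never touch
/// the (nonexistent) temp file.
fn link_or_copy(src: &Path, tempfile: &Path, out_path: &Path) -> io::Result<()> {
    match std::fs::hard_link(src, tempfile) {
        Ok(()) => {
            // Hard link succeeded: move the temp file into place.
            std::fs::rename(tempfile, out_path)?;
        }
        Err(_) => {
            // Hard link failed (e.g. cross-device): fall back to copying
            // directly to the destination.
            std::fs::copy(src, out_path)?;
        }
    }
    Ok(())
}

fn main() -> io::Result<()> {
    std::fs::write("example.txt", "hello")?;
    link_or_copy(
        Path::new("example.txt"),
        Path::new("example.tmp"),
        Path::new("example.out"),
    )
}
```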
This PR fixes the OS detection for Alpine Linux such that the version
of musl available is correctly determined. The issue boiled down to
a regex that required 2 digits for each version component. But a
valid musl version is 1.2.4, which only has a single digit for each
component.
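To make the distinction concrete, a small sketch using the `regex` crate (the patterns are illustrative, not the exact ones uv uses):
```rust
use regex::Regex;

fn main() {
    // Requiring exactly two digits per component misses a valid musl version
    // like `1.2.4`; one-or-more digits per component matches it.
    let two_digits = Regex::new(r"(\d{2})\.(\d{2})\.(\d{2})").unwrap();
    let any_digits = Regex::new(r"(\d+)\.(\d+)\.(\d+)").unwrap();

    let ld_output = "musl libc (x86_64)\nVersion 1.2.4\nDynamic Program Loader";
    assert!(!two_digits.is_match(ld_output));
    assert!(any_digits.is_match(ld_output));
}
```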
It's unclear how this was working for musl before this change. My
theory is that our other methods of OS detection were somehow working.
The first commit in this PR cleans up our Linux detection logic and adds
lots of tracing calls to make debugging issues like this easier in the
future. To do so, one can run:
```
$ RUST_LOG=trace uv pip install -v whatever
```
The second commit has the actual fix.
Fixes#1427
## Summary
By using the display representation of `Version` to form a `PackageId`,
we run the risk (as seen in the linked issue) of thinking that versions
like `2021.1` and `2021.1.0` are not equivalent.
Closes https://github.com/astral-sh/uv/issues/1536
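A minimal sketch of the equivalence at issue (illustrative only, not the actual `Version` handling):
```rust
/// Compare versions by their release components with trailing zeros trimmed,
/// so `2021.1` and `2021.1.0` produce the same key.
fn release_key(version: &str) -> Vec<u64> {
    let mut parts: Vec<u64> = version
        .split('.')
        .filter_map(|part| part.parse().ok())
        .collect();
    while parts.last() == Some(&0) {
        parts.pop();
    }
    parts
}

fn main() {
    assert_eq!(release_key("2021.1"), release_key("2021.1.0"));
    assert_ne!(release_key("2021.1"), release_key("2021.1.1"));
}
```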
This fixes a bug where `uv pip install` failed to install `polars`:
```
$ uv pip install polars==0.14.0
error: Failed to download: polars==0.14.0
Caused by: Couldn't parse metadata of polars-0.14.0-cp37-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.whl from 749022b096cb7c1c2cc32b7f433c4f/polars-0.14.0-cp37-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
Caused by: Operator >= cannot be used with a wildcard version specifier
pyarrow>=4.0.*; extra == 'pyarrow'
^^^^^^^
```
Since `pyarrow>=4.0.*; extra == 'pyarrow'` is invalid *and* it comes
from the metadata of a dependency (that isn't under the control of the
end user), we actually attempt to "fix" it. Namely, wildcard
dependency specifications are only allowed with `==` and `!=`, as per
the [Version Specifiers spec]. (They aren't explicitly forbidden in
these cases, but instead only have specified behavior for the `==` and
`!=` operators.)
This is all fine, but it turns out that when we fix the `>=4.0.*`
component, we also strip the quotes around `pyarrow`. (Because some
dependency specifications include stray quotes.) We fix this by making
our quote stripping a bit more selective. (We require that it appear
adjacent to a digit or a `*`.)
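A minimal sketch of the "adjacent to a digit or `*`" rule (purely illustrative, not the actual parser):
```rust
/// A quote hugs a version component if its neighbor is a digit or `*`.
fn hugs_version(c: char) -> bool {
    c.is_ascii_digit() || c == '*'
}

/// Drop quotes only when they sit next to a digit or `*`, so `>="4.0.*"` loses
/// its stray quotes while the quotes in `extra == 'pyarrow'` are preserved.
fn strip_stray_quotes(spec: &str) -> String {
    let chars: Vec<char> = spec.chars().collect();
    let mut out = String::with_capacity(spec.len());
    for (i, &c) in chars.iter().enumerate() {
        if c == '"' || c == '\'' {
            let prev = i.checked_sub(1).and_then(|j| chars.get(j)).copied();
            let next = chars.get(i + 1).copied();
            if prev.map_or(false, hugs_version) || next.map_or(false, hugs_version) {
                // The quote hugs a version component: treat it as stray and drop it.
                continue;
            }
        }
        out.push(c);
    }
    out
}

fn main() {
    assert_eq!(
        strip_stray_quotes(r#"pyarrow>="4.0.*"; extra == 'pyarrow'"#),
        r#"pyarrow>=4.0.*; extra == 'pyarrow'"#
    );
}
```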
Note that #1477 also reports this error:
```
$ uv pip install 'requests>=2.30.*'
error: Failed to parse `requests>=2.30.*`
Caused by: Operator >= cannot be used with a wildcard version specifier
requests>=2.30.*
```
However, we specifically keep that error message since it's something
under the end user's control. And similarly for a dependency
specification in a `requirements.txt` file.
Fixes#1477
[Version Specifiers spec]:
https://packaging.python.org/en/latest/specifications/version-specifiers/
It turns out that `/bin/ls` can sometimes be a plain text file. For
example, in Rocky Linux 9:
```
$ cat /bin/ls
#!/usr/bin/coreutils --coreutils-prog-shebang=ls
```
However, `/bin/sh` is an ELF binary:
```
$ file /bin/sh
/bin/sh: ELF 64-bit LSB pie executable, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/ld-linux-x86-64.so.2, BuildID[sha1]=7acbb41bf6f1b7d977f1b44675bf3ed213776835, for GNU/Linux 3.2.0, stripped
```
In a related issue (#1433), @zanieb fixed #1395 where, on NixOS,
`/bin/ls` doesn't exist but `/bin/sh` does. However, the fix attempts
`/bin/ls` first and only tries `/bin/sh` if `/bin/ls` doesn't exist. If
`/bin/ls` exists but isn't a valid ELF file, then the entire enterprise
gives up and `uv` fails to detect the version of `libc` that is
installed.
Instead of tweaking the logic to keep trying `/bin/ls` and then
`/bin/sh` after even if parsing `/bin/ls` fails, we just switch over to
reading `/bin/sh` only. It seems like a more fundamental thing to sniff
and likely less error prone.
We can adjust this heuristic as needed if it proves to be problematic.
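A minimal sketch of the first step of that sniffing, checking the ELF magic bytes (illustrative, not the actual detection code):
```rust
use std::fs::File;
use std::io::{self, Read};

/// An ELF file starts with the magic bytes 0x7F 'E' 'L' 'F'; anything else
/// (like a coreutils shell wrapper) is not worth parsing further.
fn is_elf(path: &str) -> io::Result<bool> {
    let mut magic = [0u8; 4];
    File::open(path)?.read_exact(&mut magic)?;
    Ok(magic == [0x7f, b'E', b'L', b'F'])
}

fn main() -> io::Result<()> {
    // On Rocky Linux 9, `/bin/ls` is a plain-text wrapper, but `/bin/sh` is a
    // regular ELF executable, so sniffing `/bin/sh` is more reliable.
    println!("/bin/sh is ELF: {}", is_elf("/bin/sh")?);
    Ok(())
}
```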
I tested this fix manually on Rocky Linux 9 via Docker:
```
$ cross b -r -p uv --target x86_64-unknown-linux-musl
$ cp target/x86_64-unknown-linux-musl/release/uv ~/astral/issues/uv/i1486/uv
$ docker run --rm -it --mount type=bind,src=/home/andrew/astral/issues/uv/i1486,dst=/host rockylinux:9 bash
[root@df2baa65d2f8 /]# /host/uv venv
Using Python 3.9.18 interpreter at /usr/bin/python3.9
Creating virtualenv at: .venv
[root@df2baa65d2f8 /]#
```
Fixes#1486, Ref #1433
I'm not sure if we should just switch to _always_ reading from sh
instead? I don't love that all these errors are strings, and that if
`/bin/ls` exists but can't be parsed we still won't try `/bin/sh`. We
may want to address these things in the future.
Closes https://github.com/astral-sh/uv/issues/1395
## Summary
It looks like `devpi` might add an empty fragment (`#`) at the end of
the URL. We expect it to contain the hash; this just makes
empty-fragment map to "no hash".
Closes https://github.com/astral-sh/uv/issues/1441.
## Summary
If a distribution contains a `+`, it'll be HTML-escaped; so when we try
to identify the `#`, we'll split in the wrong location.
Closes https://github.com/astral-sh/uv/issues/1338.
Closes https://github.com/astral-sh/uv/issues/1388
Fixes incorrect handling of relative paths returned by indexes without
an explicit `<base>`.
`Url::join` will drop the last segment of a URL if there is no trailing
slash, e.g. joining `baz` onto `http://foo/bar` yields `http://foo/baz`
when what we want is `http://foo/bar/baz`. We don't add the trailing `/`
in `base_url_join_relative` because flat indexes are
`http://foo/bar.html`, and there we _want_ `bar.html` to be replaced.
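To make the `Url::join` behavior concrete (using the `url` crate directly):
```rust
use url::Url;

fn main() -> Result<(), url::ParseError> {
    // Without a trailing slash, the last path segment is replaced...
    let base = Url::parse("http://foo/bar")?;
    assert_eq!(base.join("baz")?.as_str(), "http://foo/baz");

    // ...while a trailing slash keeps it, which is what a simple-index base needs.
    let base = Url::parse("http://foo/bar/")?;
    assert_eq!(base.join("baz")?.as_str(), "http://foo/bar/baz");

    // For flat indexes the base is a file like `bar.html`, and replacing it is
    // exactly what we want.
    let base = Url::parse("http://foo/bar.html")?;
    assert_eq!(base.join("baz")?.as_str(), "http://foo/baz");
    Ok(())
}
```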
## Summary
In a `requirements.txt` file, it turns out that the `-c` and `-r`
entries should be interpreted as relative to the file in which they're
declared, while the `-e` entries should be interpreted as relative to
the current working directory, no matter where they're defined.
Previously, we always used the current working directory; now, we use
the declaring file's directory for `-c` and `-r`.
Closes https://github.com/astral-sh/uv/issues/1367.
Closes https://github.com/astral-sh/uv/issues/1416.
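A minimal sketch of that resolution rule for `-r`/`-c` entries (hypothetical helper; `-e` entries skip this and stay relative to the working directory):
```rust
use std::path::{Path, PathBuf};

/// Resolve a `-r`/`-c` entry against the directory of the requirements file
/// that declares it.
fn resolve_included(declaring_file: &Path, entry: &str) -> PathBuf {
    declaring_file
        .parent()
        .unwrap_or_else(|| Path::new("."))
        .join(entry)
}

fn main() {
    let path = resolve_included(Path::new("subdir/requirements.in"), "constraints.txt");
    assert_eq!(path.as_path(), Path::new("subdir/constraints.txt"));
}
```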
## Summary
Closes https://github.com/astral-sh/uv/issues/1402.
## Test Plan
Ran `cargo run pip install junos-eznc==2.6.5`, which still fails for me,
but fails identically to `pip` (and not on the `requires-python`):
```
/private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmp7mxT9L/built-wheels-v0/pypi/ncclient/0.6.13/4vvPwmDC_CL2OUXd68Zqb/ncclient-0.6.13.tar.gz/versioneer.py:421: SyntaxWarning: invalid escape sequence '\s'
LONG_VERSION_PY['git'] = '''
Traceback (most recent call last):
File "<string>", line 10, in <module>
File "/private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmplD5mMO/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 366, in prepare_metadata_for_build_wheel
self.run_setup()
File "/private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmplD5mMO/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 480, in run_setup
super().run_setup(setup_script=setup_script)
File "/private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmplD5mMO/.venv/lib/python3.12/site-packages/setuptools/build_meta.py", line 311, in run_setup
exec(code, locals())
File "<string>", line 45, in <module>
File "/private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmp7mxT9L/built-wheels-v0/pypi/ncclient/0.6.13/4vvPwmDC_CL2OUXd68Zqb/ncclient-0.6.13.tar.gz/versioneer.py", line 1480, in get_version
return get_versions()["version"]
^^^^^^^^^^^^^^
File "/private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmp7mxT9L/built-wheels-v0/pypi/ncclient/0.6.13/4vvPwmDC_CL2OUXd68Zqb/ncclient-0.6.13.tar.gz/versioneer.py", line 1412, in get_versions
cfg = get_config_from_root(root)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/private/var/folders/nt/6gf2v7_s3k13zq_t3944rwz40000gn/T/.tmp7mxT9L/built-wheels-v0/pypi/ncclient/0.6.13/4vvPwmDC_CL2OUXd68Zqb/ncclient-0.6.13.tar.gz/versioneer.py", line 342, in get_config_from_root
parser = configparser.SafeConfigParser()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'configparser' has no attribute 'SafeConfigParser'. Did you mean: 'RawConfigParser'?
```
This PR improves the error message for the problem described in
https://github.com/astral-sh/uv/issues/1376. The original output
duplicates the actual error message and includes lots of noise
(`DirEntry { inner: DirEntry(...) }`).
```
$ uv pip install hexdump==3.3
error: Failed to download and build: hexdump==3.3
Caused by: Failed to extract source distribution: The top level of the archive must only contain a list directory, but it contains: [DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/__main__.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/hexdump.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/data") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/PKG-INFO") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/setup.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/README.txt") }]
Caused by: The top level of the archive must only contain a list directory, but it contains: [DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/__main__.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/hexdump.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/data") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/PKG-INFO") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/setup.py") }, DirEntry { inner: DirEntry("/home/robin/.cache/uv/.tmpgSvTCk/README.txt") }]
```
This PR removes the duplication and `DirEntry` internals so that the
error message is easier to grasp:
```
$ uv pip install hexdump==3.3
error: Failed to download and build: hexdump==3.3
Caused by: Failed to extract source distribution
Caused by: The top level of the archive must only contain a list directory, but it contains: ["__main__.py", "hexdump.py", "data", "PKG-INFO", "setup.py", "README.txt"]
```
It's a little picky about the value, but that seems okay.
```
❯ ./target/debug/uv pip install trio
Audited 1 package in 4ms
❯ UV_NO_CACHE=true ./target/debug/uv pip install trio
Audited 1 package in 50ms
```
Closes#1382
First, replace all usages in files in-place. I used my editor for this.
If someone wants to add a one-liner that'd be fun.
Then, update directory and file names:
```
# Run twice for nested directories
find . -type d -print0 | xargs -0 rename s/puffin/uv/g
find . -type d -print0 | xargs -0 rename s/puffin/uv/g
# Update files
find . -type f -print0 | xargs -0 rename s/puffin/uv/g
```
Then add all the files again
```
# Add all the files again
git add crates
git add python/uv
# This one needs a force-add
git add -f crates/uv-trampoline
```
Instead of dropping versions without a compatible distribution, we track
them as incompatibilities in the solver. This implementation follows
patterns established in https://github.com/astral-sh/puffin/pull/1290.
This required some significant refactoring of how we track incompatible
distributions. Notably:
- `Option<TagPriority>` is now `WheelCompatibility` which allows us to
track the reason a wheel is incompatible instead of just `None`.
- `Candidate` now has a `CandidateDist` with `Compatible` and
`Incompatible` variants instead of just `ResolvableDist`; candidates
are not strictly compatible anymore
- `ResolvableDist` was renamed to `CompatibleDist`
- `IncompatibleWheel` was given an ordering implementation so we can
track the "most compatible" (but still incompatible) wheel. This allows
us to collapse the reason a version cannot be used to a single
incompatibility.
- The filtering in the `VersionMap` is retained; we still only store one
incompatible wheel per version. This is sufficient for error reporting.
- A `TagCompatibility` type was added for tracking which part of a wheel
tag is incompatible
- `Candidate::validate_python` moved to
`PythonRequirement::validate_dist`
I am doing more refactoring in #1298 — I think a couple passes will be
necessary to clarify the relationships of these types.
Includes improved error message snapshots for multiple incompatible
Python tag types from #1285 — we should add more scenarios for coverage
of behavior when multiple tags with different levels are present.
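A minimal sketch of the ordering idea (the variants and ordering here are illustrative, not the real `IncompatibleWheel` definition):
```rust
/// Illustrative incompatibility reasons, ordered from "least compatible" to
/// "closest to compatible".
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum IncompatibleWheel {
    NoBinary,
    RequiresPython,
    Tag,
}

/// Keep only the "most compatible" incompatible wheel per version: the
/// maximum under the derived ordering is the one worth reporting.
fn most_compatible(reasons: &[IncompatibleWheel]) -> Option<IncompatibleWheel> {
    reasons.iter().copied().max()
}

fn main() {
    let reasons = [IncompatibleWheel::NoBinary, IncompatibleWheel::Tag];
    assert_eq!(most_compatible(&reasons), Some(IncompatibleWheel::Tag));
}
```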