The uv build backend has gone through several feedback cycles, we expect no
more major configuration changes, and we're ready to take the next step:
stabilizing the uv build backend.
This PR stabilizes:
* Using `uv_build` as build backend
* The documentation of the uv build backend
* The direct build fast path, where uv doesn't use PEP 517 if you're
using `uv_build` in a compatible version.
* `uv build --list`, which is limited to `uv_build`.
It does not:
* Make `uv_build` the default on `uv init`
* Make `--package` the default on `uv init`
## Summary
If we fail to acquire a lock on an environment, uv shouldn't fail; we
should just warn. In some cases, users run uv with read-only permissions
for their projects, etc.
For now, I kept any locks acquired _in the cache_ as hard failures,
since we always need write-access to the cache.
Closes https://github.com/astral-sh/uv/issues/14411.
## Summary
The idea here is that if a user runs `uv pip compile --universal`, we
should ignore the patch version on the current interpreter. I think this
makes sense... `--universal` tries to resolve for all future versions,
so it seems a bit odd that we'd start at the _current_ patch version.
Closes https://github.com/astral-sh/uv/issues/14397.
Clap does not perform global validation, so flags that are declared as
overriding can be set at the same time:
https://github.com/clap-rs/clap/issues/6049. This would previously cause
a panic. We work around this by always choosing the yes-value and
emitting a warning.
An alternative would be erroring when both are set, but it's unclear to
me if this may break things we want to support. (`UV_OFFLINE=1 cargo run
-q pip --no-offline install tqdm --no-cache` is already banned).
Fixes https://github.com/astral-sh/uv/pull/14299
**Test Plan**
```
$ cargo run -q pip --offline install --no-offline tqdm --no-cache
warning: Boolean flags on different levels are not correctly supported (https://github.com/clap-rs/clap/issues/6049)
× No solution found when resolving dependencies:
╰─▶ Because tqdm was not found in the cache and you require tqdm, we can conclude that your requirements are unsatisfiable.
hint: Packages were unavailable because the network was disabled. When the network is disabled, registry packages may only be read from the cache.
```
## Summary
The basic idea here is that we can (should) reuse a build environment
across resolution (`prepare_metadata_for_build_wheel`) and installation.
This also happens to solve the build-PyTorch-from-source problem, since
we use a consistent build environment between the invocations.
Since `SourceDistributionBuilder` is stateless, we instead store the
builds on `BuildContext`, and we key them by various properties: the
underlying interpreter, the configuration settings, etc. This just
ensures that if we build the same package twice within a process, we
don't accidentally reuse an incompatible build (virtual) environment.
(Note that we still drop build environments at the end of the command, and
don't attempt to reuse them across processes.)
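Roughly, the reuse can be pictured as a per-process map keyed by everything that makes a build environment compatible; the names below (`BuildKey`, `BuildArena`) are illustrative, not uv's actual types:

```rust
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Mutex;

/// Two builds may only share an environment if all of these match.
#[derive(Clone, PartialEq, Eq, Hash)]
struct BuildKey {
    interpreter: PathBuf,
    config_settings: Vec<(String, String)>,
    source: PathBuf,
}

/// Per-process store of build (virtual) environments; everything is dropped
/// when the command finishes, so nothing is reused across processes.
#[derive(Default)]
struct BuildArena {
    environments: Mutex<HashMap<BuildKey, PathBuf>>,
}

impl BuildArena {
    /// Return the existing environment for this key, or create and cache one.
    fn get_or_create(&self, key: BuildKey, create: impl FnOnce() -> PathBuf) -> PathBuf {
        let mut environments = self.environments.lock().unwrap();
        environments.entry(key).or_insert_with(create).clone()
    }
}
```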
Closes #14269.
If/when we see https://github.com/astral-sh/uv/issues/14171 again, this
should clarify whether our retry logic was skipped (i.e. a transient
error wasn't correctly identified as transient), or whether we exhausted
our retries. Previously, if you ran a local example fileserver as in
https://github.com/astral-sh/uv/issues/14171#issuecomment-3014580701 and
then you tried to install Python from it, you'd get:
```
$ export UV_TEST_NO_CLI_PROGRESS=1
$ uv python install 3.8.20 --mirror http://localhost:8000 2>&1 | cat
error: Failed to install cpython-3.8.20-linux-x86_64-gnu
Caused by: Failed to extract archive: cpython-3.8.20-20241002-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
Caused by: failed to unpack `/home/jacko/.local/share/uv/python/.temp/.tmpS4sHHZ/python/lib/libpython3.8.so.1.0`
Caused by: failed to unpack `python/lib/libpython3.8.so.1.0` into `/home/jacko/.local/share/uv/python/.temp/.tmpS4sHHZ/python/lib/libpython3.8.so.1.0`
Caused by: error decoding response body
Caused by: request or response body error
Caused by: error reading a body from connection
Caused by: Connection reset by peer (os error 104)
```
With this change you get:
```
error: Failed to install cpython-3.8.20-linux-x86_64-gnu
Caused by: Request failed after 3 retries
Caused by: Failed to extract archive: cpython-3.8.20-20241002-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
Caused by: failed to unpack `/home/jacko/.local/share/uv/python/.temp/.tmp4Ia24w/python/lib/libpython3.8.so.1.0`
Caused by: failed to unpack `python/lib/libpython3.8.so.1.0` into `/home/jacko/.local/share/uv/python/.temp/.tmp4Ia24w/python/lib/libpython3.8.so.1.0`
Caused by: error decoding response body
Caused by: request or response body error
Caused by: error reading a body from connection
Caused by: Connection reset by peer (os error 104)
```
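The extra line in the chain can be produced by wrapping the underlying failure in a small error type whose `Display` carries the retry count; this is an illustrative sketch, not uv's actual error type:

```rust
use std::error::Error;
use std::fmt;

/// Wraps the final failure so the number of retries shows up as its own
/// "Caused by" entry in the error report.
#[derive(Debug)]
struct RetriedError<E> {
    retries: u32,
    source: E,
}

impl<E: Error> fmt::Display for RetriedError<E> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "Request failed after {} retries", self.retries)
    }
}

impl<E: Error + 'static> Error for RetriedError<E> {
    fn source(&self) -> Option<&(dyn Error + 'static)> {
        Some(&self.source)
    }
}
```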
At the same time, I'm updating the way we handle the retry count to
avoid nested retry loops exceeding the intended number of attempts, as I
mentioned at
https://github.com/astral-sh/uv/issues/14069#issuecomment-3020634281.
It's not clear to me whether we actually want this part of the change,
and I need feedback here.
## Summary
This PR ensures that we avoid cleaning up build directories until the
end of a resolve-and-install cycle. It's not bulletproof (since we could
still run into issues with `uv lock` followed by `uv sync` whereby a
build directory gets cleaned up that's still referenced in the `build`
artifacts), but it at least gets PyTorch building without error with `uv
pip install .`, which is a case that's been reported several times.
Closes https://github.com/astral-sh/uv/issues/14269.
The marker display code assumes that all versions are normalized, in
that all trailing zeroes are stripped. This is not the case for
tilde-equals and equals-star versions, where the trailing zeroes (before
the `.*`) are semantically relevant. This would cause path-dependent
behavior where we would get a different marker string
depending on whether a version with or without a trailing zero was added
to the cache first.
To handle both equals-star and tilde-equals when converting
`python_version` to `python_full_version` markers, we have to merge the
version normalization (i.e. trimming the trailing zeroes) and the
conversion both to `python_full_version` and to `Ranges`, while special
casing equals-star and tilde-equals.
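The special casing can be sketched roughly like this (an illustrative operator enum, not uv's `pep440` types):

```rust
/// Operators where trailing zeroes in the release are semantically relevant.
#[derive(Clone, Copy, PartialEq, Eq)]
enum Op {
    TildeEqual,
    EqualStar,
    Other,
}

/// Trim trailing zeroes from a release, e.g. `3.10.0` -> `3.10`, but keep them
/// when the operator gives them meaning (`~=3.10.0`, `==3.10.*`).
fn normalize_release(release: &[u64], op: Op) -> Vec<u64> {
    match op {
        Op::TildeEqual | Op::EqualStar => release.to_vec(),
        Op::Other => {
            let mut trimmed = release.to_vec();
            while trimmed.len() > 1 && trimmed.last() == Some(&0) {
                trimmed.pop();
            }
            trimmed
        }
    }
}
```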
To avoid churn in lockfiles, we only trim in the conversion to `Ranges`
for markers, but keep using untrimmed versions for requires-python.
(Note that this behavior is technically also path dependent, as versions
with and without trailing zeroes have the same Hash and Eq. E.g.,
`requires-python == ">= 3.10.0"` and `requires-python == ">= 3.10"` in
the same workspace could lead to either value in `uv.lock`, and which
one it is could change if we make unrelated (performance) changes.
Always trimming, however, definitely changes lockfiles, churn I wouldn't
introduce outside another breaking or lockfile-changing change.) Nevertheless,
there is a change for users who have `requires-python = "~= 3.12.0"` in
their `pyproject.toml`, as this now hits the correct normalization path.
Fixes #14231. Fixes #14270.
## Summary
There's a good example of the downside of using verbatim URLs here:
https://github.com/astral-sh/uv/pull/14197#discussion_r2163599625 (we
show two relative paths that point to the same directory, but it's not
clear from the error message).
The diff:
```
2 2 │ ----- stdout -----
3 3 │
4 4 │ ----- stderr -----
5 5 │ error: Requirements contain conflicting URLs for package `library` in all marker environments:
6 │-- ../../library
7 │-- ./library
6 │+- file://[TEMP_DIR]/library
7 │+- file://[TEMP_DIR]/library (editable)
```
## Summary
As explained in the [`codspeed-rust` v3 release
notes](https://github.com/CodSpeedHQ/codspeed-rust/releases/tag/v3.0.0),
v3 of the compatibility layer is now required to work with the
latest version (`v3`) of `cargo-codspeed`.
Do not prefer native aarch64 Windows Pythons by default, and prefer
emulated x64 Windows Pythons in their stead.
This is preparatory work for shipping support for uv downloading and
installing aarch64 (arm64) windows Pythons. We've [had builds for this
platform ready for a
while](https://github.com/astral-sh/python-build-standalone/pull/387),
but have held back on shipping them due to a fundamental problem:
**The Python packaging ecosystem does not have strong support for
aarch64 Windows**, e.g., not many projects build aarch64 wheels yet. The
net effect of this is that, if we handed you an aarch64 Python
interpreter on Windows, you would have to build a lot more sdists, and
there's a high chance you would simply fail to build that sdist and be
sad.
Yes, unfortunately, in this case a non-native Python interpreter simply
*works better* than the native one... in terms of working at all, today.
Of course, if the native interpreter works for your project, it should
presumably have better performance and platform compatibility.
We do not want to stand in the way of progress, as ideally this
situation is a temporary state of affairs as the ecosystem grows to
support aarch64 windows. To enable progress, on aarch64 Windows builds
of uv:
* We will still use a native Python interpreter, e.g., if it's at the
front of your `PATH` or the only installed version.
* If we are choosing between equally good interpreters that differ in
architecture, x64 will be preferred.
* If the aarch64 version is newer, we will prefer the aarch64 one.
* We will emit a diagnostic on installation, and show the Python request
to pass to uv to force aarch64 Windows to be used.
* We will be shipping [aarch64 Windows Python
downloads](https://github.com/astral-sh/python-build-standalone/pull/387).
* We will probably add some kind of global override setting/env-var to
disable this behaviour.
* We will be shipping this behaviour in
[astral-sh/setup-uv](https://github.com/astral-sh/setup-uv).
We're coordinating with Microsoft, GitHub (for the `setup-python`
action), and the CPython team (for the `python.org` installers), to
ensure we're aligned on this default and the timing of toggling to
prefer native distributions in the future.
See discussion in
- https://github.com/astral-sh/uv/issues/12906
---
This is an alternative to
* #13719
which uses sorting rather than filtering, as discussed in
* #13721
This fixes an obscure cache collision in Python interpreter queries,
which we believe to be the root cause of CI flakes we've been seeing
where a project environment is invalidated and recreated.
This work follows from the logs in [this CI
run](https://github.com/astral-sh/uv/actions/runs/15934322410/job/44950599993?pr=14326)
which captured one of the flakes with tracing enabled. There, we can see
that the project environment is invalidated because the Python
interpreter in the environment has a different version than expected:
```
DEBUG Checking for Python environment at `.venv`
TRACE Cached interpreter info for Python 3.12.9, skipping probing: .venv/bin/python3
DEBUG The interpreter in the project environment has different version (3.12.9) than it was created with (3.9.21)
```
(this message is updated to reflect #14329)
The flow is roughly:
- We create an environment with 3.12.9
- We query the environment, and cache the interpreter version for
`.venv/bin/python`
- We create an environment for 3.9.21, replacing the existing one
- We query the environment, and read the cached information
The Python cache entries are keyed by the absolute path to the
interpreter, and rely on the modification time (ctime, nsec resolution)
of the canonicalized path to determine if the cache entry should be
invalidated. The key is a hex representation of a u64 sea hasher output
— which is very unlikely to collide.
After an audit of the Python query caching logic, we determined that the
most likely cause of a collision in cache entries is that the
modification times of underlying interpreters are identical. This seems
pretty feasible, especially if the file system does not support
nanosecond precision — though it appears that the GitHub runners do
support it.
The fix here is to include the canonicalized path in the cache key,
which ensures we're looking at the modification time of the _same_
underlying interpreter.
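A minimal sketch of the keying idea, assuming a seahash-based key like the one described above (the real layout of uv's cache keys may differ); the modification time is still checked separately for invalidation:

```rust
use std::hash::{Hash, Hasher};
use std::path::Path;

/// Hash both the queried path and its canonicalized target, so two different
/// interpreters that happen to share a modification time can no longer map to
/// the same cache entry.
fn interpreter_cache_key(executable: &Path) -> std::io::Result<String> {
    let canonical = executable.canonicalize()?;
    let mut hasher = seahash::SeaHasher::new();
    executable.hash(&mut hasher);
    canonical.hash(&mut hasher); // the new ingredient in the fix
    Ok(format!("{:x}", hasher.finish()))
}
```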
This will "invalidate" all existing interpreter cache entries but that's
not a big deal.
This should also have the effect of reducing cache churn for
interpreters in virtual environments. Now, when you change Python
versions, we won't invalidate the previous cache entry so if you change
_back_ to the old version we can re-use our cached information.
It's a bit speculative, since we don't have a deterministic reproduction
in CI, but this is the strongest candidate given the logs and should
increase correctness regardless.
Closes https://github.com/astral-sh/uv/issues/14160
Closes https://github.com/astral-sh/uv/issues/13744
Closes https://github.com/astral-sh/uv/issues/13745
Once it's confirmed the flakes are resolved, we should revert
- https://github.com/astral-sh/uv/pull/14275
- #13817
## Summary
In #14245, we started normalizing index URLs by dropping the trailing
slash in the lockfile. We added tests to ensure that this didn't cause
existing lockfiles to be invalidated, but we missed one of the
constructors (specifically, the path that's used with
`tool.uv.sources`).
Motivated by some code duplication highlighted in
https://github.com/astral-sh/uv/pull/14201, I noticed we weren't taking
advantage of the existing implementation for casting to a str here.
Unfortunately, we do need a special case for CPython still.
Closes #7426
## Summary
Picking up on #8284, I noticed that the `requires_python` object already
has its specifiers canonicalized in the `intersection` method, meaning
`~=3.12` is converted to `>=3.12, <4`. To fix this, we check and warn in
`intersection`.
## Test Plan
Used the same tests from #8284.
## Summary
When the user provides a requirement like `==2.4.*`, we desugar that to
`>=2.4.dev0,<2.5.dev0`. These bounds then appear in error messages, and
worse, they also trick the error message reporter into thinking that the
user asked for a pre-release.
This PR adds logic to convert to the more-concise `==2.4.*`
representation when possible. We could probably do a similar thing for
the compatible release operator (`~=`).
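Roughly, the reverse mapping only needs to detect the desugared shape; this sketch works on plain release-number slices and assumes the caller already verified that both bounds carry the `.dev0` suffix (not uv's actual `Version` handling):

```rust
/// A desugared pair `>=X.Y.dev0, <X.(Y+1).dev0` can be re-displayed as `==X.Y.*`.
fn as_star_equal(lower: &[u64], upper: &[u64]) -> Option<String> {
    // Both bounds must agree on every component except the last, which must
    // have been bumped by exactly one.
    if lower.len() != upper.len() || lower.is_empty() {
        return None;
    }
    let (last, rest) = lower.split_last()?;
    let (upper_last, upper_rest) = upper.split_last()?;
    if rest != upper_rest || *upper_last != last + 1 {
        return None;
    }
    let release = lower
        .iter()
        .map(|part| part.to_string())
        .collect::<Vec<_>>()
        .join(".");
    Some(format!("=={release}.*"))
}
```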
Closes https://github.com/astral-sh/uv/issues/14177.
Co-authored-by: Zanie Blue <contact@zanie.dev>
Python `bin` installations installed with `uv python install --default
--preview` (no version specified) were not being installed as
upgradeable. Instead each link was pointed at the highest patch version
for a minor version. This change ensures that these preview default
installations are also treated as upgradeable.
The PR includes some updates to the related tests. First, it checks the
default install without a specified version. Second, since it's
adding more read link checks, it creates a new `read_link` helper method
to consolidate repeated logic and replace instances of
`#[cfg(unix/windows)]` with `if cfg!(unix/windows)`.
Fixes #14247
This PR updates `IndexUrl` parsing to normalize non-file URLs by
removing trailing slashes. It also normalizes registry source URLs when
using them to validate the lockfile.
Prior to this change, when writing an index URL to the lockfile, uv
would use a trailing slash if present in the provided URL and no
trailing slash otherwise. This can cause surprising behavior. For
example, `uv lock --locked` will fail when a package is added with an
`--index` value without a trailing slash and then `uv lock --locked` is
run with a `pyproject.toml` version of the index URL that contains a
trailing slash. This PR fixes this and adds a test for the scenario.
It might be safe to normalize file URLs in the same way, but since
slashes have a well-defined meaning in the context of files and
directories, I chose not to normalize them here.
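A sketch of the normalization as described, using the `url` crate (uv's actual `IndexUrl` parsing has more going on):

```rust
use url::Url;

/// Strip a trailing slash from non-file index URLs so that
/// `https://example.org/simple` and `https://example.org/simple/` compare equal.
fn normalize_index_url(mut url: Url) -> Url {
    if url.scheme() != "file" {
        let trimmed = url.path().trim_end_matches('/').to_string();
        if trimmed.is_empty() {
            url.set_path("/");
        } else {
            url.set_path(&trimmed);
        }
    }
    url
}
```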
Closes #13707.
uv currently ignores URL-encoded credentials in a redirect location.
This PR adds a check for these credentials to the redirect handling
logic. If found, they are moved to the Authorization header in the
redirect request.
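A rough sketch of the idea with the `url` crate (uv's middleware integration is more involved): pull the credentials out of the `Location` URL and hand them back so they can be attached as an `Authorization` header instead:

```rust
use url::Url;

/// Split embedded credentials off a redirect location, returning the cleaned
/// URL plus the credentials to send as an Authorization header, if any.
fn split_redirect_credentials(mut location: Url) -> (Url, Option<(String, String)>) {
    let username = location.username().to_string();
    let password = location.password().map(str::to_string);
    if username.is_empty() && password.is_none() {
        return (location, None);
    }
    // Strip the credentials from the URL itself before following the redirect.
    let _ = location.set_username("");
    let _ = location.set_password(None);
    (location, Some((username, password.unwrap_or_default())))
}
```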
Closes #11097
Previously we were using the XDG data dir to avoid symlinks, but there's no
particular guarantee that that's not going to be a symlink too. Using the
canonicalized temp dir by default is also slightly nicer for a couple reasons:
It's sometimes faster (an in-memory tempfs on e.g. Arch), and it makes
overriding `$TMPDIR` or `%TMP%` sufficient to control where tests put temp
files, without needing to override `UV_INTERNAL__TEST_DIR` too.
There was a regression introduced in #13954 on Windows where creating a
venv behaved as if there was a minor version link even if none existed.
This PR adds a check to fix this.
Closes #14249.
We do not currently support passing index names to `--index` for
installing packages. However, we do accept relative paths that can look
like index names. This PR adds the requirement that `--index` values
must be disambiguated with a prefix (`./` or `../` on Unix and Windows
or `.\\` or `..\\` on Windows). For now, if an ambiguous value is
provided, uv will warn that this will not be supported in the future.
Currently, if you provide an index name like `--index test` when there
is no `test` directory, uv will error with a `Directory not found...`
error. That's not very informative if you thought index names were
supported. The new warning makes the context clearer.
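The disambiguation itself is just a prefix check along these lines (illustrative, not the exact uv code):

```rust
/// Treat an `--index` value as an unambiguous local path only when prefixed;
/// bare names like `test` instead trigger the new warning.
fn is_disambiguated_path(value: &str) -> bool {
    value.starts_with("./")
        || value.starts_with("../")
        || (cfg!(windows) && (value.starts_with(".\\") || value.starts_with("..\\")))
}
```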
Closes #13921
In workspaces with multiple packages, you usually don't want to include
shared files such as the license repeatedly. Instead, we read from
symlinked files. This would have worked if we had used std's `is_file`
and read methods, but walkdir's `is_file` does not consider symlinked
files as files.
See https://github.com/astral-sh/uv/issues/3957#issuecomment-2994675003
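The difference can be seen in a small sketch (hypothetical helper, assuming walkdir's default of not following links):

```rust
use std::path::{Path, PathBuf};
use walkdir::WalkDir;

/// Collect files while treating symlinks to regular files as files, which
/// walkdir's `DirEntry::file_type().is_file()` would not do by default.
fn collect_files(root: &Path) -> std::io::Result<Vec<PathBuf>> {
    let mut files = Vec::new();
    for entry in WalkDir::new(root) {
        let entry = entry.map_err(std::io::Error::other)?;
        // `entry.file_type().is_file()` is false for a symlinked LICENSE file;
        // `Path::metadata` follows the link, like std's `is_file` does.
        if entry.path().metadata()?.is_file() {
            files.push(entry.path().to_path_buf());
        }
    }
    Ok(files)
}
```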
## Summary
Update [schemars
0.9.0](https://github.com/GREsau/schemars/releases/tag/v0.9.0)
There are differences in the generated JSON Schema and I will [contact
the author](https://github.com/GREsau/schemars/issues/407).
## Test Plan
---------
Co-authored-by: konstin <konstin@mailbox.org>
## Summary
This flakes often and we don't really need it to be monitored
continuously. We can always revive it from Git later.
Closes https://github.com/astral-sh/uv/issues/13952.
Don't log that we resolved a reference through the GitHub fast path if
we didn't use GitHub at all but used the cached revision. This avoids
stating that the fast path works when it's blocked due to unrelated
reasons (e.g. rate limits).
@oconnor663 discovered that executing Python `3.10.8` on Arch Linux ran
into an error loading `libcrypt.so.1`. This caused uv to install the latest
patch version on `uv venv` operations during upgrade tests, which
undermined their purpose (since they are checking that if you first
install `3.10.8` and then upgrade, virtual environments are
transparently upgraded). This PR updates the test to use `3.10.17`
instead to avoid this issue.
#13954 introduced an unnecessary slow-down to Python uninstall by
calling `installations.find_all()` to discover remaining installations
after an uninstall. Instead, we can filter all initial installations
against those in `uninstalled`.
As part of this change, I've updated `uninstalled` from a `Vec` to an
`IndexSet` in order to do efficient lookups in the filter. This required
a change I call out below to how we were retrieving them for messaging.
We were checking whether a path was an executable in a virtual
environment or the base directory of a virtual environment in multiple
places in the codebase. This PR consolidates this logic into one place.
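The consolidated check boils down to something like this (a sketch; uv's helper handles more layouts):

```rust
use std::path::Path;

/// A directory is a virtual environment base if it contains `pyvenv.cfg`.
fn is_virtualenv_base(path: &Path) -> bool {
    path.join("pyvenv.cfg").is_file()
}

/// An executable inside a virtual environment lives in `<venv>/bin` on Unix
/// or `<venv>\Scripts` on Windows, i.e. two levels below the base.
fn is_virtualenv_executable(path: &Path) -> bool {
    path.parent()
        .and_then(Path::parent)
        .is_some_and(is_virtualenv_base)
}
```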
Closes #13947.
This PR contains the following updates:
| Package | Type | Update | Change |
|---|---|---|---|
| [mimalloc](https://redirect.github.com/purpleprotocol/mimalloc_rust) |
dependencies | patch | `0.1.46` -> `0.1.47` |
---
### Release Notes
<details>
<summary>purpleprotocol/mimalloc_rust (mimalloc)</summary>
###
[`v0.1.47`](https://redirect.github.com/purpleprotocol/mimalloc_rust/releases/tag/v0.1.47):
Version 0.1.47
[Compare
Source](https://redirect.github.com/purpleprotocol/mimalloc_rust/compare/v0.1.46...v0.1.47)
##### Changes
- Mimalloc `v2.2.4`
</details>
This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
## Summary
Allows `--torch-backend=auto` to detect AMD GPUs. The approach is fairly
well-documented inline, but I opted for `rocm_agent_enumerator` over
(e.g.) `rocminfo` since it seems to be the recommended approach for
scripting:
https://rocm.docs.amd.com/projects/rocminfo/en/latest/how-to/use-rocm-agent-enumerator.html.
Closes https://github.com/astral-sh/uv/issues/14086.
## Test Plan
```
root@rocm-jupyter-gpu-mi300x1-192gb-devcloud-atl1:~# ./uv-linux-libc-11fb582c5c046bae09766ceddd276dcc5bb41218/uv pip install torch --torch-backend=auto
Resolved 11 packages in 251ms
Prepared 2 packages in 6ms
Installed 11 packages in 257ms
+ filelock==3.18.0
+ fsspec==2025.5.1
+ jinja2==3.1.6
+ markupsafe==3.0.2
+ mpmath==1.3.0
+ networkx==3.5
+ pytorch-triton-rocm==3.3.1
+ setuptools==80.9.0
+ sympy==1.14.0
+ torch==2.7.1+rocm6.3
+ typing-extensions==4.14.0
```
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
I was looking into `uv tool` not supporting version files, and noticed
this implementation was confusing and skipped some handling, such as a
tracing log when `--no-config` excludes selection of a file. I've
refactored it in preparation for the next change.
> NOTE: The PRs that were merged into this feature branch have all been
independently reviewed. But it's also useful to see all of the changes
in their final form. I've added comments to significant changes
throughout the PR to aid discussion.
This PR introduces transparent Python version upgrades to uv, allowing
for a smoother experience when upgrading to new patch versions.
Previously, upgrading Python patch versions required manual updates to
each virtual environment. Now, virtual environments can transparently
upgrade to newer patch versions.
Due to significant changes in how uv installs and executes managed
Python executables, this functionality is initially available behind a
`--preview` flag. Once an installation has been made upgradeable through
`--preview`, subsequent operations (like `uv venv -p 3.10` or patch
upgrades) will work without requiring the flag again. This is
accomplished by checking for the existence of a minor version symlink
directory (or junction on Windows).
### Features
* New `uv python upgrade` command to upgrade installed Python versions
to the latest available patch release:
```
# Upgrade specific minor version
uv python upgrade 3.12 --preview
# Upgrade all installed minor versions
uv python upgrade --preview
```
* Transparent upgrades also occur when installing newer patch versions:
```
uv python install 3.10.8 --preview
# Automatically upgrades existing 3.10 environments
uv python install 3.10.18
```
* Support for transparently upgradeable Python `bin` installations via
`--preview` flag
```
uv python install 3.13 --preview
# Automatically upgrades the `bin` installation if there is a newer patch version available
uv python upgrade 3.13 --preview
```
* Virtual environments can still be tied to a patch version if desired
(ignoring patch upgrades):
```
uv venv -p 3.10.8
```
### Implementation
Transparent upgrades are implemented using:
* Minor version symlink directories (Unix) or junctions (Windows)
* On Windows, trampolines simulate paths with junctions
* Symlink directory naming follows Python build standalone format: e.g.,
`cpython-3.10-macos-aarch64-none`
* Upgrades are scoped to the minor version key (as represented in the
naming format: implementation-minor version+variant-os-arch-libc)
* If the context does not provide a patch version request and the
interpreter is from a managed CPython installation, the `Interpreter`
used by `uv python run` will use the full symlink directory executable
path when available, enabling transparently upgradeable environments
created with the `venv` module (`uv run python -m venv`)
New types:
* `PythonMinorVersionLink`: in a sense, the core type for this PR, this
is a representation of a minor version symlink directory (or junction on
Windows) that points to the highest installed managed CPython patch
version for a minor version key.
* `PythonInstallationMinorVersionKey`: provides a view into a
`PythonInstallationKey` that excludes the patch and prerelease. This is
used for grouping installations by minor version key (e.g., to find the
highest available patch installation for that minor version key) and for
minor version directory naming.
### Compatibility
* Supports virtual environments created with:
* `uv venv`
* `uv run python -m venv` (using managed Python that was installed or
upgraded with `--preview`)
* Virtual environments created within these environments
* Existing virtual environments from before these changes continue to
work but aren't transparently upgradeable without being recreated
* Supports both standard Python (`python3.10`) and freethreaded Python
(`python3.10t`)
* Support for transparent upgrades is currently only available for
managed CPython installations
Closes #7287. Closes #7325. Closes #7892. Closes #9031. Closes #12977.
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
This PR is a combination of #12920 and #13754. Prior to these changes,
following a redirect when searching indexes would bypass our
authentication middleware. This PR updates uv to support propagating
credentials through our middleware on same-origin redirects and to
support netrc credentials for both same- and cross-origin redirects. It
does not handle the case described in #11097 where the redirect location
itself includes credentials (e.g.,
`https://user:pass@redirect-location.com`). That will be addressed in
follow-up work.
This includes unit tests for the new redirect logic and integration
tests for credential propagation. The automated external registries test
is also passing for AWS CodeArtifact, Azure Artifacts, GCP Artifact
Registry, JFrog Artifactory, GitLab, Cloudsmith, and Gemfury.
Closes #13922
## Summary
Add a warning if the directory given by the `--index` argument is empty.
## Test Plan
Added test case `add_index_empty_directory` in `edit.rs`
When working on support for reading global Python pins in tool
operations, I noticed that we weren't using the canonicalized Python
request in receipts — we were using the raw string provided by the user.
Since we'll need to compare these values, we should be using the
canonicalized string.
The `Tool` and `ToolReceipt` types have been updated to hold a
`PythonRequest` instead of a `String`, and `Serialize` was implemented
for `PythonRequest` so canonicalization can happen at the edge instead
of being the caller's responsibility.
Fix `uv run -p 3.7` by not using a walrus operator. Python 3.7 isn't
really supported anymore, but there's no reason to break interpreter
discovery for it.
When using `uv lock --upgrade-package=python` after changing
`requires-python`, it was possible to get into a state where the fork
markers produced corresponded to the empty set. This in turn resulted in
an empty lock file.
There was already some infrastructure in place that I think was perhaps
intended to handle this. In particular, `Lock::check_marker_coverage`
checks whether the fork markers have some overlap with the supported
environments (including the `requires-python`). But there were two
problems with this.
First is that in lock validation, this marker coverage check came
_after_ a path that returned `Preferable` (meaning that the fork markers
should be kept) when `--upgrade-package` was used. Second is that the
marker coverage check used the `requires-python` in the lock file and
_not_ the `requires-python` in the now updated `pyproject.toml`.
We attempt to solve this conundrum by slightly re-arranging lock file
validation and by explicitly checking whether the *new*
`requires-python` is disjoint from the fork markers in the lock file. If
it is, then we return `Versions` from lock file validation (indicating
that the fork markers should be dropped).
Fixes #13951
We always ignore the `clippy::struct_excessive_bools` rule and formerly
annotated this at the function level. This PR specifies the allow in
`workspace.lints.clippy` in `Cargo.toml`.
Using a companion change in the middleware
(https://github.com/TrueLayer/reqwest-middleware/pull/235, forked & tagged
pending review), we can check and show retries for HTTP status code
errors, to consistently report retries again.
We fix two cases:
* Show retries for status code errors for cache client requests
* Show retries for status code errors for Python download requests
Not handled:
* Show previous retries when a distribution download fails mid-streaming
* Perform retries when a distribution download fails mid-streaming
* Show previous retries when a Python download fails mid-streaming
* Perform retries when a Python download fails mid-streaming
This cleaves out a dedicated SourcedDependencyGroups type based on
RequiresDist but with only the DependencyGroup handling implemented.
This allows `uv pip` to read `dependency-groups` from pyproject.tomls
that only have that table defined (without a `[project]` table or a
legacy `tool.uv.workspace`), per PEP 735, and as implemented by `pip`.
However we want our implementation to respect various uv features when
they're available:
* `tool.uv.sources`
* `tool.uv.index`
* `tool.uv.dependency-groups.mygroup.requires-python` (#13735)
As such we want to opportunistically detect "as much as possible" while
doing as little as possible when things are missing. The issue with the
old RequiresDist path was that it fundamentally wanted to build the
package, and if `[project]` was missing it would try to desperately run
setuptools on the pyproject.toml to try to find metadata and make a hash
of things.
At the same time, the old code also put in a lot of effort to try to
pretend that `uv pip` dependency-groups worked like `uv`
dependency-groups with defaults and non-only semantics, only to separate
them back out again. By explicitly separating them out, we confidently
get the expected behaviour.
Note that dependency-group support is still included in RequiresDist, as
some `uv` paths still use it. It's unclear to me if those paths want
this same treatment -- for now I conclude no.
Fixes #13138
This allows you to specify requires-python on individual dependency-groups,
with the intended use case being "oh my dev-dependencies have a higher
requires-python than my actual project".
This includes a large drive-by move of the RequiresPython type to
uv-distribution-types to allow us to generate the appropriate markers at
this point in the code. It also migrates RequiresPython from
pubgrub::Range to version_ranges::Ranges, and makes several pub(crate)
items pub, as it's no longer defined in uv_resolver.
Fixes #11606
In the case where we have platform information in a Python request, we
should filter managed Python distributions by it prior to querying them.
Closes https://github.com/astral-sh/uv/issues/13935
---------
Co-authored-by: Aria Desires <aria.desires@gmail.com>
Often, HTTP requests don't fail due to server errors, but from spurious
network errors such as connection resets. reqwest surfaces these as
`io::Error`, and we have to handle their retrying separately.
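Roughly, the separate handling means walking the error chain and classifying the I/O error kind; a sketch (the retryable set here is an assumption, not uv's exact list):

```rust
use std::error::Error;
use std::io;

/// Walk the error chain of a request error and treat spurious network-level
/// `io::Error`s (e.g. connection resets) as retryable, independent of
/// HTTP-status-based retry policies.
fn is_spurious_io_error(err: &(dyn Error + 'static)) -> bool {
    let mut source = Some(err);
    while let Some(current) = source {
        if let Some(io_err) = current.downcast_ref::<io::Error>() {
            return matches!(
                io_err.kind(),
                io::ErrorKind::ConnectionReset
                    | io::ErrorKind::ConnectionAborted
                    | io::ErrorKind::BrokenPipe
                    | io::ErrorKind::TimedOut
            );
        }
        source = current.source();
    }
    false
}
```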
Companion PR: https://github.com/LukeMathWalker/wiremock-rs/pull/159
Unlike regular packages, specifying all `__init__.py` directories for a
namespace package would be very verbose. There is e.g.
https://github.com/python-poetry/poetry/tree/main/src/poetry, which has
18 modules, or https://github.com/googleapis/api-common-protos which is
inconsistently nested. For the Google Cloud SDK, there are both
packages with a single module and those with complex structures, with
many having multiple modules due to versioning through `<module>_v1`
suffixes. The Azure SDK seems to use one module per package (it's not
explicitly documented but seems to follow from the process in
https://azure.github.io/azure-sdk/python_design.html#azure-sdk-distribution-packages
and
ccb0e03a3d/doc/dev/packaging.md).
For simplicity with complex projects, we add a `namespace = true` switch
which disables checking for an `__init__.py`. We only check that there's
no `<module_root>/<module_name>/__init__.py` and otherwise add the whole
`<module_root>/<module_name>` folder. This comes at the cost of
`namespace = true` effectively creating an opt-out from our usual checks
that allows creating an almost entirely arbitrary package.
For simple projects with only a single module, the module name can be
dotted to point to the target module, so the build still gets checked:
```toml
[tool.uv.build-backend]
module-name = "poetry.core"
```
## Alternatives
### Declare all packages
We could make `module-name` a list and allow or require declaring all
packages:
```toml
[tool.uv.build-backend]
module-name = ["cloud_sdk.service.storage", "cloud_sdk.service.storage_v1", "cloud_sdk.billing.storage"]
```
Or for Poetry:
```toml
[tool.uv.build-backend]
module-name = [
"poetry.config",
"poetry.console",
"poetry.inspection",
"poetry.installation",
"poetry.json",
"poetry.layouts",
"poetry.masonry",
"poetry.mixology",
"poetry.packages",
"poetry.plugins",
"poetry.publishing",
"poetry.puzzle",
"poetry.pyproject",
"poetry.repositories",
"poetry.toml",
"poetry.utils",
"poetry.vcs",
"poetry.version"
]
```
### Support multiple namespaces
We could also allow namespace packages with multiple root-level modules:
```toml
[tool.uv.build-backend]
module-name = ["cloud_sdk.my_ext", "local_sdk.my_ext"]
```
For lack of use cases, we delegate this to creating a workspace with one
package per module.
## Implementation
Due to the more complex options for the module name, I'm moving
verification on deserialization later, dropping the source span we'd get
from serde. We also don't show similarly named directories anymore.
---------
Co-authored-by: Andrew Gallant <andrew@astral.sh>
Investigating #13744
I tried reproducing here by running the test in a loop, but could not. I
presume it's an interaction with other tests.
This drops the snapshot, but I think it's worth it to try to examine the
flake?
Use TTY detection to determine when we should forward SIGINT instead of
counting signals, which can lead to various problems where multiple
SIGINTs are sent to a child after the first signal. Counting does not
make sense in interactive situations that do not exit on interrupt,
e.g., the Python REPL.
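The gist of the detection, as a minimal sketch (uv's real logic considers more than just stdin):

```rust
use std::io::IsTerminal;

/// Only forward SIGINT ourselves when stdin is not attached to a terminal. In
/// an interactive session the terminal already delivers Ctrl-C to the whole
/// foreground process group, so forwarding again would double-signal the child.
fn should_forward_sigint() -> bool {
    !std::io::stdin().is_terminal()
}
```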
Closes https://github.com/astral-sh/uv/issues/13919
Closes https://github.com/astral-sh/uv/issues/12108
As another follow-up in the vein of
https://github.com/astral-sh/uv/pull/13944, I noticed `uv python pin`
doesn't download Python versions, which is a bit weird because we'll
warn it's not found.
See https://github.com/astral-sh/uv/issues/13935#issuecomment-2957300516
where we fail to write a pin file because we encounter an unusable
interpreter. This is actually a special case where `MissingPython` is
not raised because we want to show why we failed to find a usable
interpreter, which is useful in commands where you _need_ an interpreter
to use, but here we don't actually need it. Here, we just log the
failure and move on.
Related https://github.com/astral-sh/uv/pull/13936
Add basic tests for error messages on retryable network errors.
This test mod is intended to grow to ensure that we handle retryable
errors correctly and that we show the appropriate error message if we
failed after retrying.
The starter tests show some common cases we've seen download errors in:
simple and find-links indexes, file downloads, and Python installs.
For `io::Error` fault injection to test the reqwest `Err` path besides
the HTTP status code `Ok` path, see
https://github.com/LukeMathWalker/wiremock-rs/issues/149.
As per #13874, passing a relative URL like `test` to `--index` for `uv
add` causes unexpected behavior if the directory does not exist. The
non-existent index is effectively ignored and uv falls back to PyPI. If
a package is found there, the spurious index is then written to
`pyproject.toml`. This doesn't happen for `--default-index` since
resolution will fail without fallback to PyPI.
This PR adds a validation step for indexes provided on the command line.
If a directory does not exist, uv will fail with an error.
Closes #13874
For the case where there was no matching wheel on sync, we previously
added a note about which wheels are available vs. on which platform you
are on. We extend this error message to link directly towards
`tool.uv.required-environments`, which otherwise has a discovery
problem.
On Linux (Setting `tool.uv.required-environments` doesn't help here
either, but it's a clear example):
```
[project]
name = "debug"
version = "0.1.0"
requires-python = "==3.10.*"
dependencies = ["tensorflow-macos>=2.13.1"]
```
```
Resolved 41 packages in 24ms
error: Distribution `tensorflow-macos==2.16.2 @ registry+https://pypi.org/simple` can't be installed because it doesn't have a source distribution or wheel for the current platform
hint: You're on Linux (`manylinux_2_39_x86_64`), but there are no wheels for the current platform, consider configuring `tool.uv.required-environments`.
hint: `tensorflow-macos` (v2.16.2) only has wheels for the following platform: `macosx_12_0_arm64`.
```
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
Users are not (yet) properly familiar with the concept of universal
resolution and its implication that we need to resolve for all possible
platforms and Python versions. Some projects only target a specific
platform or Python version, and users experience resolution errors due
to failures for other platforms. Indicated by the number of questions we
get about it, `tool.uv.environments` for restricting environments is not
well discoverable.
We add a special hint when resolution failed on a fork disjoint with the
current environment, hinting the user to constrain `requires-python` and
`tool.uv.environments` respectively.
The hint has false positives for cases where the resolution failed on a
different platform, but equally fails on the current platform, in cases
where the non-current fork was tried earlier. Given that conflicts can
be based on `requires-python`, afaik we can't parse whether the current
platform would also be affected from the derivation tree.
Two cases not covered by this are build errors as well as install errors
that need `tool.uv.required-environments`.
- Define all list elements using `-`: it used to be a mix of `*` and
`-`. `-` is what Prettier linter formats it to by default.
- Removed unnecessary blank line between 2 list elements. Other elements
were stitched together without blank lines in between.
- Only the first list element started in sentence case (capital letter
first) - I made all start like so.
Surprisingly, we weren't locking during `uv sync` so far, so running `uv
sync` in parallel could cause errors due to filesystem races.
I've also added locks to `uv add` and `uv remove` which concurrently
modify `pyproject.toml`. These locks only apply after we determined the
interpreter, so they don't actually prevent computing the same thing
twice when running `uv add` in parallel.
All other subcommands that I checked were already locking (with no claim
to exhaustiveness).
Fixes #12751
# Test Plan
I don't have CI-sized reproducer for this.
```toml
[project]
name = "debug"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
"boto3>=1.38.30",
"fastapi>=0.115.12",
"numba>=0.61.2",
"polars>=1.30.0",
"protobuf>=6.31.1",
"pyarrow>=20.0.0",
"pydantic>=2.11.5",
"requests>=2.32.3",
"urllib3>=2.4.0",
"scikit-learn>=1.6.1",
"jupyter>=1.1.1",
]
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
```
rm -rf .venv && parallel -n0 "uv sync -q" ::: {1..10}
```
## Summary
I think `GlobDirFilter::match_directory` should return `true` if it
failed to construct a DFA, to force a full directory traversal.
Returning `false` means that the build backend excludes all files, which
is unlikely to be what we want in that situation.
## Summary
When trying out `uv export --no-editable --format pylock.toml` the
exported contents would still retain `editable = true` regardless.
## Test Plan
Added additional test. Tested locally on a few projects where I was
previously using `uv export --no-editable --format requirements.txt` to
ensure the output aligns.
Extends https://github.com/astral-sh/uv/pull/13841 — I'll drop that
commit later after that pull request merges but it's small.
I find the split into a "Configuration" section awkward and don't think
it's helping us. Everything moved into the "Concepts" section, except
the "Environment variables" page which definitely belongs in the
reference and the "Installer" page which is fairly niche and seems
better in the reference.
Before / After
<img
src="https://github.com/user-attachments/assets/80d8304b-17da-4900-a5f4-c3ccac96fcc5"
width="400">
## Summary
Switch the `sync_dry_run` test to use Python 3.9 instead of Python 3.8.
I didn't previously catch it since it was marked as `python_managed`.
## Test Plan
`cargo test --no-default-features --features git,pypi,python` without
Python 3.8 installed.
## Summary
Modify `requirements_txt_ssh_git_username` test to pass `-o
UserKnownHostsFile=/dev/null` to OpenSSH, in order to prevent it from
trying to write into the known-hosts file in the user's home directory.
To add insult to injury, OpenSSH ignores `HOME` and writes into the
actual home directory when it is explicitly overridden for the purpose
of testing.
## Test Plan
`cargo test --no-default-features --features=git,pypi,python` in an
environment where the user can't write to `pw_home` (but can to
`${HOME}`). Previously the test would fail:
```
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ Snapshot Summary ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Snapshot: requirements_txt_ssh_git_username
Source: crates/uv/tests/it/export.rs:1252
────────────────────────────────────────────────────────────────────────────────
Expression: snapshot
────────────────────────────────────────────────────────────────────────────────
-old snapshot
+new results
────────────┬───────────────────────────────────────────────────────────────────
7 7 │ ├─▶ failed to clone into: [PATH]
8 8 │ ├─▶ failed to fetch branch, tag, or commit `d780faf0ac91257d4d5a4f0c5a0e4509608c0071`
9 9 │ ╰─▶ process didn't exit successfully: [GIT_COMMAND_ERROR]
10 10 │ --- stderr
11 │+ Could not create directory '/var/lib/portage/home/.ssh' (Permission denied).
12 │+ Failed to add the host to the list of known hosts (/var/lib/portage/home/.ssh/known_hosts).
11 13 │ Load key "[TEMP_DIR]/fake_deploy_key": [ERROR]
12 14 │ git@github.com: Permission denied (publickey).
13 15 │ fatal: Could not read from remote repository.
14 16 │
────────────┴───────────────────────────────────────────────────────────────────
```
---------
Co-authored-by: konstin <konstin@mailbox.org>
## Summary
When missing an operator for version parsing, it would give an error
that was hard to know how to fix if you were not familiar with version
specifiers / PEP-440:
```
Unexpected end of version specifier, expected operator
```
Now, it will attempt to provide a more useful hint if it can parse the
version from the remaining scanner:
```
Unexpected end of version specifier, expected operator (did you mean "==3.12"?)
```
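A rough sketch of the hint construction (uv's parser works on its own scanner type; the digit/dot check here is a simplification):

```rust
/// When the operator is missing, try to interpret the remaining input as a
/// bare version and suggest the `==` form in the error message.
fn missing_operator_hint(remaining: &str) -> Option<String> {
    // A bare version: digits and dots only, e.g. "3.12".
    let looks_like_version = !remaining.is_empty()
        && remaining.chars().all(|c| c.is_ascii_digit() || c == '.');
    looks_like_version.then(|| format!(r#"(did you mean "=={remaining}"?)"#))
}
```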
## Test Plan
Unit tests in `version_specifier.rs` were added/updated for the
following cases:
- `test_parse_specifier_missing_operator_error`
- `test_parse_specifier_missing_operator_invalid_version_error`
- `test_invalid_word`
- `test_invalid_specifier`
- `error_message_version_specifiers_parse_error`
A test in `edit.rs` for failing to parse the `pyproject.toml` when using
`add` was also included to match the request in the original issue:
- `add_invalid_requires_python`
I didn't add cases where no version specifier is provided because it
seemed like it doesn't get to the point of parsing in that case, so it
should not happen.
## Reference
Fixes #13126
---------
Co-authored-by: Jacob Woliver <jacob@jmw.sh>
Co-authored-by: konstin <konstin@mailbox.org>
This includes some initial work on adding Pyodide support (issue
#12729). It is enough to get
```
uv pip compile -p /path/to/pyodide --extra-index-url file:/path/to/simple-index
```
to work which should already be quite useful.
## Test Plan
* added a unit test for `pyodide_platform`
* integration tested manually with:
```
cargo run pip install \
-p /home/rchatham/Documents/programming/tmp/pyodide-venv-test/.pyodide-xbuildenv-0.29.3/0.27.4/xbuildenv/pyodide-root/dist/python \
--extra-index-url file:/home/rchatham/Documents/programming/tmp/pyodide-venv-test/.pyodide-xbuildenv-0.29.3/0.27.4/xbuildenv/pyodide-root/package_index \
--index-strategy unsafe-best-match --target blah --no-build \
numpy pydantic
```
---------
Co-authored-by: konsti <konstin@mailbox.org>
Co-authored-by: Zanie Blue <contact@zanie.dev>
This updates the `Debug` implementation of `DisplaySafeUrl` to display
the git username in a case like `git+ssh://git@github.com/org/repo`. It
also factors out the git username check into a shared function and adds
related unit tests to `DisplaySafeUrl`.
`trace!`-log the locations of data directories used by a wheel. These
paths are queried from sysconfig, i.e. they are mostly Python
interpreter defined instead of being computed by uv.
Fixes #13795 specifically by not redacting the username convention `git`
for SSH URLs, though we should never write stars in `uv export`
generally (#13791).
We've been using a number of different winapi crates. This PR removes
winsafe in favor of the official windows-* crates, so all of uv's own
winapi calls go through the official windows-* crates.
---------
Co-authored-by: Aria Desires <aria.desires@gmail.com>
## Summary
Implemented as suggested in #13761
E.g.,
```
$ uv tool install 'harlequin[postgres]'
$ uv tool list --show-extras
harlequin v2.1.2 [extras: postgres]
- harlequin
```
## Test Plan
Added a new test with the argument along with the others from the `uv
tool list` cli.
## Summary
Update `lock::lock_requires_python` not to require Python 3.8.
Fixes #13676
## Test Plan
`cargo test --no-default-features --features git,pypi,python` after
removing Python 3.8.
## Summary
Use the existing `pypi` feature to conditionalize a number of tests that
attempt to access https://pypi.org and/or
https://files.pythonhosted.org. See
https://github.com/astral-sh/uv/issues/8970#issuecomment-2466794088.
There is no reason to believe that these are *all* of the tests that
need to be conditionalized on the `pypi` feature, but this should be a
solid step in the right direction.
## Test Plan
This allows me to build and run the integration tests in [Fedora’s `uv`
package](https://src.fedoraproject.org/rpms/uv) without having to
manually skip tests that try to access PyPI. I confirmed that this
appears to accomplish that goal.
Otherwise, this should be tested by building and running the tests as
usual. As mentioned in
https://github.com/astral-sh/uv/issues/8970#issuecomment-2516181501, a
more complete solution would include CI tests that confirm these
features are working as intended. I’m not in a position to offer that.
This PR contains the following updates:
| Package | Type | Update | Change |
|---|---|---|---|
| [tempfile](https://stebalien.com/projects/tempfile-rs/)
([source](https://redirect.github.com/Stebalien/tempfile)) |
workspace.dependencies | minor | `3.19.1` -> `3.20.0` |
---
### Release Notes
<details>
<summary>Stebalien/tempfile (tempfile)</summary>
###
[`v3.20.0`](https://redirect.github.com/Stebalien/tempfile/blob/HEAD/CHANGELOG.md#3200)
[Compare
Source](https://redirect.github.com/Stebalien/tempfile/compare/v3.19.1...v3.20.0)
This release mostly unifies the behavior/capabilities around "keeping"
temporary files:
- Rename `Builder::keep(bool)` (via deprecation) to
`Builder::disable_cleanup(bool)` to make it clear that behaves
differently from `NamedTempFile::keep()`. The former disables automatic
cleanup while the latter *consumes* the `NamedTempFile` object entirely
and unsets the "temporary file" attribute (on Windows).
- Rename `TempDir::into_path` (via deprecation) to `TempDir::keep` to
mirror `NamedTempFile::keep`.
- Add `TempDir::disable_cleanup`, `NamedTempFile::disable_cleanup`, and
`TempPath::disable_cleanup` making it possible to disable automatic
cleanup in-place *after* creating a temporary file/directory (equivalent
to calling `Builder::disable_cleanup` before creating the
file/directory).
Additionally, it adds a few spooled temporary file features:
- Add `SpooledTempFile::into_file` for turning a `SpooledTempFile` into
a regular unnamed temporary file, writing it to the backing storage
("rolling" it) if it was still stored in-memory.
- Add `spooled_tempfile_in` and `SpooledTempFile::new_in` methods for
creating spooled temporary files in a specific directory. This makes it
possible to choose the backing device for your spooled temporary file
which is rather important on Linux where the default temporary directory
is likely backed by memory (defeating the entire point of having a
spooled temporary file).
Finally, this release improves documentation, especially the top-level
documentation explaining which temporary file type to use.
**BREAKING** for those with `deny(warnings)`:
- `Builder::keep` deprecated in favor of `Builder::disable_cleanup`.
- `TempDir::into_path` is deprecated in favor of `TempDir::keep`.
**BREAKING**:
</details>
This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/astral-sh/uv).
---------
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: konstin <konstin@mailbox.org>
## Summary
Update a few more tests to avoid using Python 3.8 unnecessarily. From
what I can see, neither of these really need that specific Python
version, so just update them to use 3.9 instead.
Bug #13676
## Test Plan
Uninstall Python 3.8 and run: `cargo test --no-default-features
--features git,pypi,python`
One test failure remains.
Previously if you wanted to run e.g. PyPy via `uvx`, you had to spell it
like `uvx -p pypy python`. Now we reuse some of the
`PythonRequest::parse` machinery to handle the executable, so all of the
following examples work:
- `uvx python3.8`
- `uvx 'python>3.7,<3.9'`
- `uvx --from python3.8 python` (or e.g. `bash`)
- `uvx pypy38`
- `uvx graalpy@38`
The `python` (and on Windows only, `pythonw`) special cases are
retained, which normally aren't allowed values of `-p`/`--python`.
Closes https://github.com/astral-sh/uv/issues/13536.
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
Co-authored-by: konsti <konstin@mailbox.org>
Replacement for https://github.com/astral-sh/uv/pull/13474 (clobbered by
a changed base)
Once Pythons with an explicitly requested architecture (e.g., aarch64) are
installed, they should be discoverable and usable. Currently, they are not.
In https://github.com/astral-sh/uv/pull/13721#issuecomment-2920530601 I
presumed that all the installation problems in
https://github.com/astral-sh/uv/pull/13722 were solved by
https://github.com/astral-sh/uv/pull/13709 but they were not because we
don't differentiate between implicit and explicit architecture requests
so a request for `aarch64` is considered satisfied by an existing
`x86-64` installation even if the user explicitly requested that
architecture.
Now, we track if it was explicit or implicit, requiring an exact match
in the former case, and a `supports` in the latter.
We considered doing this for other items in the request, like the
operating system but we don't have a `supports()` concept there. It
might make sense for libc in the future.
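Conceptually, the change looks like this (illustrative types, not uv's actual `Arch`/request model):

```rust
/// An explicitly requested architecture must match exactly, while an implicit
/// (environment-derived) request is satisfied by anything the host supports.
#[derive(Clone, Copy, PartialEq, Eq)]
enum Arch {
    X86_64,
    Aarch64,
}

enum ArchRequest {
    Explicit(Arch),
    Environment(Arch),
}

impl ArchRequest {
    fn satisfied_by(&self, installed: Arch) -> bool {
        match self {
            // An explicitly requested aarch64 must not be satisfied by an
            // existing x86-64 installation.
            ArchRequest::Explicit(arch) => installed == *arch,
            // Implicit requests accept any architecture the host can run,
            // e.g. x86-64 emulated on aarch64 Windows.
            ArchRequest::Environment(arch) => supports(*arch, installed),
        }
    }
}

/// Hypothetical compatibility check: aarch64 Windows can emulate x86-64.
fn supports(host: Arch, candidate: Arch) -> bool {
    host == candidate || (host == Arch::Aarch64 && candidate == Arch::X86_64)
}
```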
Resolves
https://github.com/astral-sh/uv/pull/13474#discussion_r2112586405
This kind of dynamic ordering freaks me out a little, but I think it's
probably the best solution and is static at compile-time.
Currently, we're just sorting by the stringified representation, which
is convenient for reproducibility, but we rely on these orderings
for prioritization in discovery.
Previously you could get a confusing error message like this:
```
$ docker run ghcr.io/astral-sh/uv run python
error: Could not read ELF interpreter from any of the following paths: /bin/sh, /usr/bin/env, /bin/dash, /bin/ls
```
Now the error message is better:
```
error: Failed to discover managed Python installations
Caused by: Failed to determine the libc used on the current platform
Caused by: Failed to find any common binaries to determine libc from: /bin/sh, /usr/bin/env, /bin/dash, /bin/ls
```
See https://github.com/astral-sh/uv/issues/8635.
---------
Co-authored-by: konsti <konstin@mailbox.org>
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
Related to https://github.com/astral-sh/uv/issues/6801.
Currently on Windows, uv itself will always create a console window,
even though the window could be empty if `uv run --gui-script` is used.
This is due to it using the [default `console` window
subsystem](https://rust-lang.github.io/rfcs/1665-windows-subsystem.html).
This PR introduces a wrapper `uvw` that, similar to the existing `uvx`,
invokes `uv` with the
[`CREATE_NO_WINDOW`](https://learn.microsoft.com/en-us/windows/win32/procthread/process-creation-flags#:~:text=CREATE_NO_WINDOW)
process creation flag on Windows, which creates the child process
without a console window.
Note that this PR does not alter any behaviors regarding `run --script`
and `run --gui-script`.
## Test Plan
Built and tested locally by doing something like `uvw run test.py`.
Currently, it is not possible to set a custom Python downloads JSON on
Windows, as Windows paths can be valid URLs.
```rust
use url::Url;
fn main() {
dbg!(Url::parse(r"C:\Users\Ferris\download.json"));
}
```
Tested by https://github.com/astral-sh/uv/pull/13585 (where it is
currently failing CI).
By default, uv uses only a lower bound in `uv add`, which avoids
dependency conflicts due to upper bounds. With this PR, this can be
changed by setting a different bound kind. The bound kind can be
configured in `uv.toml`, as a user preference, in `pyproject.toml`, as a
project preference, or on the CLI, when adding a specific project.
We add two options that add an upper bound on the constraint, one for
SemVer (`>=1.2.3,<2.0.0`, dubbed "major", modeled after the SemVer
caret) and another one for dependencies that make breaking changes in
minor version (`>=1.2.3,<1.3.0`, dubbed "minor", modeled after the
SemVer tilde). Intuitively, the major option bumps the most significant
version component, while the minor option bumps the second most
significant version component. There is also an exact bounds option
(`==1.2.3`), though generally we recommend setting a wider bound and
using the lockfile for pinning.
Versions can have leading zeroes, such as `0.1` or `0.0.1`. For a single
leading zero, we shift the meaning of major and minor similarly to Cargo.
For two or more leading zeroes, the difference between major and minor
becomes inapplicable; instead, both bump the most significant non-zero
component:
- major: `0.1` -> `>=0.1,<0.2`
- major: `0.0.1` -> `>=0.0.1,<0.0.2`
- major: `0.0.1.1` -> `>=0.0.1.1,<0.0.2.0`
- major: `0.0.0.1` -> `>=0.0.0.1,<0.0.0.2`
- minor: `0.1` -> `>=0.1,<0.1.1`
- minor: `0.0.1` -> `>=0.0.1,<0.0.2`
- minor: `0.0.1.1` -> `>=0.0.1.1,<0.0.2.0`
- minor: `0.0.0.1` -> `>=0.0.0.1,<0.0.0.2`
For a consistent appearance, we try to preserve the number of components
in the upper bound. For example, adding a version `2.17` with the major
option is stored as `>=2.17,<3.0`. If a version uses three components
and is greater than 0, both bounds will also use three components
(SemVer versions always have three components). Of the top 100 PyPI
packages, 8 use a non-three-component version: docutils, idna, pycparser,
and soupsieve use two components; packaging, pytz, and tzdata use
two-component CalVer; and trove-classifiers uses four-component CalVer.
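A minimal sketch of the bump rules above, operating only on release segments
(illustrative, not uv's implementation; it ignores epochs and pre-release
segments):
```rust
// Pick which component to bump: with at most one leading zero, "major" bumps
// the first non-zero component and "minor" bumps the one after it; with two
// or more leading zeroes, both bump the first non-zero component.
fn bump_index(release: &[u64], minor: bool) -> usize {
    let first_nonzero = release
        .iter()
        .position(|&c| c != 0)
        .unwrap_or(release.len().saturating_sub(1));
    if first_nonzero >= 2 || !minor {
        first_nonzero
    } else {
        first_nonzero + 1
    }
}

// Compute the exclusive upper bound, preserving the component count where
// possible and zeroing everything after the bumped component.
fn upper_bound(release: &[u64], minor: bool) -> Vec<u64> {
    let index = bump_index(release, minor);
    let mut upper = release.to_vec();
    // Appending a component covers cases like minor on `0.1` -> `<0.1.1`.
    while upper.len() <= index {
        upper.push(0);
    }
    upper[index] += 1;
    for c in &mut upper[index + 1..] {
        *c = 0;
    }
    upper
}

fn main() {
    assert_eq!(upper_bound(&[1, 2, 3], false), vec![2, 0, 0]); // major: <2.0.0
    assert_eq!(upper_bound(&[1, 2, 3], true), vec![1, 3, 0]);  // minor: <1.3.0
    assert_eq!(upper_bound(&[0, 1], false), vec![0, 2]);       // major: <0.2
    assert_eq!(upper_bound(&[0, 1], true), vec![0, 1, 1]);     // minor: <0.1.1
    assert_eq!(upper_bound(&[0, 0, 1, 1], true), vec![0, 0, 2, 0]); // <0.0.2.0
}
```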
Example `pyproject.toml` files with the top 100 packages: [`--bounds
major`](https://gist.github.com/konstin/0aaffa9ea53c4834c22759e8865409f4)
and [`--bounds
minor`](https://gist.github.com/konstin/e77f5e990a7efe8a3c8a97c5c5b76964).
While many projects follow a version scheme that roughly or directly
matches the major or minor options, these compatibility ranges are
usually not applicable to the also-popular CalVer versioning.
For pre-release versions, there are two framings we could take: One is
that pre-releases generally make no guarantees about compatibility
between them and are used to introduce breaking changes, so we should
pin them exactly. In many cases however, pre-release specifiers are used
because a project needs a bugfix or a feature that hasn't made it into a
stable release, or because a project is compatible with the next version
before a final version for that release is published. In those cases,
compatibility with other packages that depend on the same library is
more important, so the desired bound is the same as it would be for the
stable release, except with the lower bound lowered to include
pre-releases.
The names of the bounds and the name of the flag are up for bikeshedding.
Currently, the option is called `tool.uv.bounds`, but we could also move
it under `tool.uv.edit.bounds`, where it would be the first/only entry.
Fixes #6783
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>
## Summary
This PR makes a few performance improvements:
1. Reduces the need to unpack and repack a `_GLibCVersion` tuple
2. Deduplicates the call to `_is_compatible(arch, glibc_version)`
3. Moves the assignment of the `tag` variable directly into the yield,
avoiding the allocation when `_is_compatible(arch, glibc_version)` is
false and the value would never be used
4. Combines the check of the `glibc_version` being in
`_LEGACY_MANYLINUX_MAP` and the assignment to the variable together. I'm
not sure if this is actually better, but using the assignment expression
reduces this from 4 lines to 2
## Test Plan
I upstreamed these changes in
https://github.com/pypa/packaging/pull/869. Otherwise, I'm pretty
confident this is a minor change that works the same.
By default, Rust does not support a safe cast from `&U` to `&T` for
`#[repr(transparent)] T(U)`, even if the newtype opts in. The dtolnay
ref-cast crate fills this gap, allowing us to remove `DisplaySafeUrlRef`.
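For illustration, a minimal sketch of the ref-cast pattern (assuming
`ref_cast` and `url` as dependencies; this is not uv's actual
`DisplaySafeUrl` definition):
```rust
use ref_cast::RefCast;
use url::Url;

// The newtype opts in via `#[repr(transparent)]` plus the `RefCast` derive.
#[derive(RefCast)]
#[repr(transparent)]
struct DisplaySafeUrl(Url);

fn main() {
    let url = Url::parse("https://user:secret@example.com/simple").unwrap();
    // Safe, zero-cost `&Url` -> `&DisplaySafeUrl`: no clone, and no separate
    // `DisplaySafeUrlRef` borrow type needed.
    let safe: &DisplaySafeUrl = DisplaySafeUrl::ref_cast(&url);
    assert_eq!(safe.0.host_str(), Some("example.com"));
}
```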
This PR implements a few review follow-ups from #13560. In particular,
it
* Makes `DisplaySafeUrlRef` implement `Copy` so that it can be passed by
value.
* Replaces the `to_string_with_credentials` methods with
`displayable_with_credentials`, which returns an `impl Display` instead
of a `String` for greater flexibility.
* Updates the `DisplaySafeUrl` and `DisplaySafeUrlRef` `Debug`
implementations to match the underlying `Url` `Debug` implementation
(with the exception that credentials are masked).
* Replaces an unnecessary `DisplaySafeUrl::from(Url::from_file_path(...))`
with `DisplaySafeUrl::from_file_path`.
See https://github.com/astral-sh/uv/issues/13676
Python 3.8 is EOL, and we should avoid using it for tests unless we're
covering 3.8-specific behaviors.
In some cases, the Python version we require for the test is arbitrary,
for which I've added a new constant (set to 3.12). In other cases, we
require an older Python version for compatibility with various packages,
like `setuptools`. For those, we can _probably_ switch to a newer
version but it'd be more invasive as we'd need to change our constraints
or `exclude-newer`.
Adds a feature flag for cases where we still need to use the EOL
version.
There's a lot of usage in packse scenarios too; I'll update those
separately because the diff will be large.
## Summary
This change allows for `uv python install --reinstall` to include
pre-releases when reinstalling. It does this by explicitly allowing
pre-releases to be included within `PythonRequest::Any` if the user does
not specify a version to reinstall.
Fixes #13582
## Test Plan
```bash
uv python install 3.14 3.13 3.10
uv python install --no-config --reinstall
```
## Summary
Right now, if a workspace member is first created by way of being a dev
dependency on another member, we end up duplicating it in the graph.
Instead, we should create all the roots upfront; all subsequent node
creations are robust to existing nodes.
Closes
https://github.com/astral-sh/uv/issues/13673#issuecomment-2912196406.
## Summary
Resolves https://github.com/astral-sh/uv/issues/13580.
`uv self update --offline` should fail and exit early because
self-updating requires a network connection.
## Test Plan
A snapshot test is added.
---------
Co-authored-by: Aria Desires <aria.desires@gmail.com>
Co-authored-by: konsti <konstin@mailbox.org>
## Summary
This should reduce the number of filesystem operations fairly
dramatically:
- Only query actual symlinks.
- Don't recurse into package bodies (huge).
- Only traverse once (rather than twice).
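A hedged sketch of the shallow, single-pass idea using plain std code (not
uv's actual traversal or directory layout):
```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

/// Collect symlinks among the top-level entries of `dir`, without following
/// links or recursing into package bodies.
fn top_level_symlinks(dir: &Path) -> io::Result<Vec<PathBuf>> {
    let mut symlinks = Vec::new();
    for entry in fs::read_dir(dir)? {
        let entry = entry?;
        // `DirEntry::file_type` does not follow symlinks, so this is a single
        // cheap check per entry.
        if entry.file_type()?.is_symlink() {
            symlinks.push(entry.path());
        }
    }
    Ok(symlinks)
}

fn main() -> io::Result<()> {
    for link in top_level_symlinks(Path::new("."))? {
        println!("{}", link.display());
    }
    Ok(())
}
```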
Closes https://github.com/astral-sh/uv/issues/13667.
Prior to this PR, there were numerous places where uv would leak
credentials in logs. We had a way to mask credentials by calling methods
or a recently-added `redact_url` function, but this was not secure by
default. There were a number of other types (like `GitUrl`) that would
leak credentials on display.
This PR adds a `DisplaySafeUrl` newtype to prevent leaking credentials
when logging by default. It takes a maximalist approach, replacing the
use of `Url` almost everywhere. This includes when first parsing config
files, when storing URLs in types like `GitUrl`, and also when storing
URLs in types that in practice will never contain credentials (like
`DirectorySourceUrl`). The idea is to make it easy for developers to do
the right thing and for the compiler to support this (and to minimize
ever having to manually convert back and forth). Displaying credentials
now requires an active step. Note that despite this maximalist approach,
the use of the newtype should be zero cost.
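As a rough illustration of the "secure by default" display (assuming the
`url` crate; the real `DisplaySafeUrl` lives in `uv_redacted/lib.rs` and
differs in detail):
```rust
use std::fmt;
use url::Url;

struct DisplaySafeUrl(Url);

impl fmt::Display for DisplaySafeUrl {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let mut masked = self.0.clone();
        // Replace any password with a placeholder before rendering, so
        // logging the URL never leaks credentials.
        if masked.password().is_some() {
            let _ = masked.set_password(Some("****"));
        }
        fmt::Display::fmt(&masked, f)
    }
}

fn main() {
    let url = Url::parse("https://user:secret@example.com/simple").unwrap();
    // Prints the URL with the password masked; the username is kept.
    println!("{}", DisplaySafeUrl(url));
}
```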
One conspicuous place this PR does not use `DisplaySafeUrl` is in the
`uv-auth` crate. That would require new clones since there are calls to
`request.url()` that return a `&Url`. One option would have been to make
`DisplaySafeUrl` wrap a `Cow`, but this would lead to lifetime
annotations all over the codebase. I've created a separate PR based on
this one (#13576) that updates `uv-auth` to use `DisplaySafeUrl` with
one new clone. We can discuss the tradeoffs there.
Most of this PR just replaces `Url` with `DisplaySafeUrl`. The core is
`uv_redacted/lib.rs`, where the newtype is implemented. To make it
easier to review the rest, here are some points of note:
* `DisplaySafeUrl` has a `Display` implementation that masks
credentials. Currently, it will still display the username when there is
both a username and password. If we think this is the wrong choice, it can
now be changed in one place.
* `DisplaySafeUrl` has a `remove_credentials()` method and also a
`.to_string_with_credentials()` method. This allows us to use it in a
variety of scenarios.
* `IndexUrl::redacted()` was renamed to
`IndexUrl::removed_credentials()` to make it clearer that we are not
masking.
* We convert from a `DisplaySafeUrl` to a `Url` when calling `reqwest`
methods like `.get()` and `.head()`.
* We convert from a `DisplaySafeUrl` to a `Url` when creating a
`uv_auth::Index`. That is because, as mentioned above, I will be
updating the `uv_auth` crate to use this newtype in a separate PR.
* A number of tests (e.g., in `pip_install.rs`) that formerly used
filters to mask tokens in the test output no longer need those filters
since tokens in URLs are now masked automatically.
* The one place we are still knowingly writing credentials to
`pyproject.toml` is when a URL with credentials is passed to `uv add`
with `--raw`. Since displaying credentials is no longer automatic, I
have added a `to_string_with_credentials()` method to the `Pep508Url`
trait. This is used when `--raw` is passed. Adding it to that trait is a
bit weird, but it's the simplest way to achieve the goal. I'm open to
suggestions on how to improve this, but note that because of the way
we're using generic bounds, it's not as simple as just creating a
separate trait for that method.
PackageMetadata, for whatever reason, does not have a mirrored Wire type,
so it was easy to miss that it contains markers that need to be
complexified.
Fixes #13614
---------
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
We were previously rendering messages for the info level, incurring
overhead in pubgrub, which uses `log::info!`. We avoid this by only
configuring `LevelFilter::INFO` if the durations layer exists.
I've confirmed that the default `Subscriber::max_level_hint` goes from
`INFO` to `OFF` and the profile skips `Incompatibility::display`.
## Summary
Closes #13612
We check if the git error message includes `not a git repository` to
figure out if the path isn't a Git repo or if Git's broken. This PR sets
`LC_ALL=C` when invoking `git rev-parse --is-inside-work-tree` so that
the error message doesn’t change based on the user’s locale settings.
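A minimal sketch of the invocation (illustrative, not uv's exact code):
```rust
use std::process::Command;

fn main() -> std::io::Result<()> {
    let output = Command::new("git")
        .args(["rev-parse", "--is-inside-work-tree"])
        // Force the C locale so stderr is always in English and the
        // `not a git repository` check stays reliable.
        .env("LC_ALL", "C")
        .output()?;
    let stderr = String::from_utf8_lossy(&output.stderr);
    if !output.status.success() && stderr.contains("not a git repository") {
        println!("not inside a git work tree");
    }
    Ok(())
}
```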
There is a runtime issue with some of these builds.
Here is the testing I've seen (❌ has bug, ✅ works):
* uv version (pbs version)
* ❌ uv 0.7.7 (pbs 20250521)
* ❌ uv 0.7.6 (pbs 20250517)
* ✅ uv 0.7.5 (pbs 20250409)
* os
* ❌ linux
* ✅ windows
* arch
* ❌ x86_64
* ✅ aarch64
* python version
* ❌ 3.12
* ❌ 3.13
* ❌ 3.14
Fixes #13610