Add an experimental JSON format for the METADATA and WHEEL files in the uv build backend, to serve as a reference for what it would look like and to demonstrate that the actual change is just a few lines, and how much easier it is than the existing email-ish format.
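As a rough illustration of the difference (the struct and field subset below are hypothetical, not the schema this PR actually emits), a serde-based sketch shows how little code JSON serialization of a few core metadata fields needs:
```rust
use serde::Serialize;

// Hypothetical subset of core metadata fields, for illustration only; the
// real METADATA file carries many more.
#[derive(Serialize)]
#[serde(rename_all = "kebab-case")]
struct CoreMetadata {
    metadata_version: String,
    name: String,
    version: String,
    requires_dist: Vec<String>,
}

fn main() -> Result<(), serde_json::Error> {
    let metadata = CoreMetadata {
        metadata_version: "2.4".to_string(),
        name: "example".to_string(),
        version: "1.0.0".to_string(),
        requires_dist: vec!["anyio>=4".to_string()],
    };
    // One call to emit JSON, versus formatting and escaping the email-ish
    // key/value (and multi-line) fields by hand.
    println!("{}", serde_json::to_string_pretty(&metadata)?);
    Ok(())
}
```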
## Summary
After chatting with the PyTorch team, it looks like some number of
wheels were accidentally uploaded with
`no-cache,no-store,must-revalidate` due to
https://github.com/pytorch/pytorch/pull/149218. They're going to correct
this for the respective wheels. I've encouraged them to set an immutable
caching header for these files, and it might happen. But even if this
isn't set, by default we only allow these wheels to be cached for 600s,
since the other wheels don't include a `Cache-Control` header at all
(but do include a `Last-Modified`, so we cache based on our heuristic:
`Freshness lifetime heuristically assumed because of presence of
last-modified header: 600s`). This probably leads to tons of unnecessary
downloads for users over time. Andrey from the PyTorch team agreed that
we should do this.
Closes https://github.com/astral-sh/uv/issues/15480.
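For context, a sketch of the fallback behavior described above (illustrative only, not uv's actual cache code):
```rust
use std::time::Duration;

/// Illustrative only: with no `Cache-Control` header, fall back to a
/// heuristic freshness lifetime when `Last-Modified` is present, and don't
/// cache at all otherwise.
fn freshness_lifetime(
    cache_control: Option<&str>,
    last_modified: Option<&str>,
) -> Option<Duration> {
    match (cache_control, last_modified) {
        // An explicit header (e.g. `max-age=...` or `immutable`) would be
        // parsed and honored here.
        (Some(_directives), _) => todo!("parse the Cache-Control directives"),
        // "Freshness lifetime heuristically assumed because of presence of
        // last-modified header: 600s"
        (None, Some(_)) => Some(Duration::from_secs(600)),
        (None, None) => None,
    }
}
```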
## Summary
This is causing some cyclic dependency issues for me, because these
can be used in virtually _any_ crate (like `uv-install-wheel`), which
then means that all of `uv-configuration` becomes a dependency, etc. I
think this should be a leaf crate so that we can safely depend on it
anywhere.
## Summary
Move the Resolver reference into a new Internals section in the
reference. Add the new nav item, fix internal links to the new path, fix
the server-side redirect to the new path for external traffic via
`redirect_maps`, and fix a non-existent anchor in
`docs/concepts/projects/dependencies.md`.
Closes #15412.
## Summary
Closes #14866. Adds a `no-install-local` flag to the sync and export
commands that excludes locally defined packages from being installed.
This helps if you're caching your virtual environment: you can exclude
local packages, since they're more likely to change between builds.
## Test Plan
snapshot test: `sync::no_install_local`
CI
## Notes
I made an `InstallOptions` struct to avoid a crate isolation issue I was
running into while implementing.
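Roughly, the idea looks like the following sketch (the field and type names are hypothetical, not uv's actual `InstallOptions` API):
```rust
// Hypothetical sketch of the idea; not uv's actual `InstallOptions` API.
struct InstallOptions {
    no_install_local: bool,
}

struct Package {
    name: String,
    is_local: bool,
}

impl InstallOptions {
    /// Drop locally defined packages from the set to install when
    /// `--no-install-local` was passed.
    fn filter(&self, packages: Vec<Package>) -> Vec<Package> {
        packages
            .into_iter()
            .filter(|package| !(self.no_install_local && package.is_local))
            .collect()
    }
}
```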
Thanks for maintaining this project!
## Summary
I noticed that some of the line highlights in the docs are wrong.
## Test Plan
Visual verification.
## Summary
There isn't any risk here, and we have reports of at least one zip file
with more than one (but fewer than, e.g., 10) null bytes.
Closes https://github.com/astral-sh/uv/issues/15451.
## Summary
Packages like `triton` should come from the PyTorch index, but they
don't actually vary across (e.g.) the `cu128` or `cu129` indexes.
Closes https://github.com/astral-sh/uv/issues/15446.
## Test Plan
Validate that the following pins to `cu128`, rather than `cpu`:
```
echo "vllm\ntorch==2.7.1+cu128" | cargo run pip compile --torch-backend=auto --extra-index-url https://wheels.vllm.ai/b2f6c247a9b84556a8ea0e75bb4a2db765ff3315 - --python-platform linux --python-version 3.13 -v
```
## Summary
After #15395, I realized that we didn't actually need a separate struct
for this since we now pass it around as an `Option`. (The key change
from #15395 is that when combining, we treat the options as a single
unit.)
## Summary
`match-runtime` can be explicitly specified, and if it's `false` it
should behave the same way as if it's omitted.
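A minimal sketch of the intended semantics (illustrative only, not the actual setting type):
```rust
// Illustrative only: an explicit `match-runtime = false` and an omitted
// setting should resolve to the same behavior.
fn match_runtime_enabled(setting: Option<bool>) -> bool {
    setting.unwrap_or(false)
}

fn main() {
    assert_eq!(match_runtime_enabled(Some(false)), match_runtime_enabled(None));
}
```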
## Test Plan
Added snapshot test
## Summary
We should not unnecessarily leak memory. Instead, we follow the usual
pattern and use `Cow` for strings that can come from either a static or
a dynamic source.
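For example (a generic sketch of the pattern, not a specific call site in uv):
```rust
use std::borrow::Cow;

// Generic sketch of the pattern: no allocation (and no `Box::leak`) for the
// static case, and an owned `String` only when the value is computed at
// runtime.
fn label(custom: Option<String>) -> Cow<'static, str> {
    match custom {
        Some(name) => Cow::Owned(name),
        None => Cow::Borrowed("default"),
    }
}

fn main() {
    assert_eq!(label(None), "default");
    assert_eq!(label(Some("dynamic".to_string())), "dynamic");
}
```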
## Summary
Right now, if you put `upgrade = false` in a `uv.toml`, then pass
`--upgrade-package numpy` on the CLI, we won't upgrade NumPy. This PR
fixes that interaction by ensuring that when we "combine", we look at
those arguments holistically (i.e., we bundle `upgrade` and
`upgrade-package` into a single struct, which then goes through the
`.combine` logic), rather than combining `upgrade` and `upgrade-package`
independently.
If approved, I then need to add the same thing for `no-build-isolation`,
`reinstall`, `no-build`, and `no-binary`.
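The shape of the fix, as a hypothetical sketch (the names below are illustrative, not uv's actual configuration types): the pair is combined as one value, so a CLI `--upgrade-package` overrides a file-level `upgrade = false` instead of being merged field-by-field and masked by it.
```rust
// Hypothetical sketch; these are not uv's actual configuration types.
#[allow(dead_code)]
#[derive(Debug, PartialEq)]
enum Upgrade {
    /// Upgrade nothing (e.g. `upgrade = false` in `uv.toml`).
    None,
    /// Upgrade everything (`--upgrade`).
    All,
    /// Upgrade only the named packages (`--upgrade-package`).
    Packages(Vec<String>),
}

/// Combine a higher-precedence source (the CLI) with a lower-precedence one
/// (`uv.toml`), treating `upgrade`/`upgrade-package` as a single unit: if the
/// CLI expressed any preference, it wins wholesale.
fn combine(cli: Option<Upgrade>, file: Option<Upgrade>) -> Upgrade {
    cli.or(file).unwrap_or(Upgrade::None)
}

fn main() {
    // `--upgrade-package numpy` on the CLI overrides `upgrade = false` in
    // `uv.toml`, instead of being masked by it.
    let cli = Some(Upgrade::Packages(vec!["numpy".to_string()]));
    let file = Some(Upgrade::None);
    assert_eq!(combine(cli, file), Upgrade::Packages(vec!["numpy".to_string()]));
}
```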
## Summary
Add the torch CUDA 12.9 backend.
## Test Plan
---------
Signed-off-by: youkaichao <youkaichao@gmail.com>
Co-authored-by: Charlie Marsh <charlie.r.marsh@gmail.com>
## Summary
The PyTorch team publishes ARM Linux wheels for `triton` to the PyTorch
index, which aren't available on PyPI.
## Test Plan
```
echo "torch" | cargo run pip compile - --torch-backend=cu128 --python-platform aarch64-unknown-linux-gnu --python-version 3.13
```
Previously, this failed because it couldn't find a compatible `triton` wheel.
## Summary
Adds `uv format` as a frontend to Ruff's formatter.
There are some interesting choices here, some of which may just be
temporary:
1. We pin a default version of Ruff, so `uv format` is stable for a
given uv version
2. We install Ruff from GitHub instead of PyPI, which means we don't
need a Python interpreter or environment
3. We do not read the Ruff version from the dependency tree
See https://github.com/astral-sh/ruff/pull/19665 for a prototype of the
LSP integration.
## Summary
Currently, RECORD hashes are the hex-encoded SHA-256 sum. However,
they're supposed to be urlsafe-base64-nopad encoded. See
https://packaging.python.org/en/latest/specifications/recording-installed-packages/#the-record-file
Fixes #15398.
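For reference, the required encoding looks like the following sketch (assuming the `sha2` and `base64` crates; not necessarily how the build backend implements it):
```rust
use base64::Engine;
use base64::engine::general_purpose::URL_SAFE_NO_PAD;
use sha2::{Digest, Sha256};

/// A RECORD hash entry is the urlsafe-base64 (unpadded) encoding of the raw
/// SHA-256 digest, not the hex-encoded digest.
fn record_hash(contents: &[u8]) -> String {
    let digest = Sha256::digest(contents);
    format!("sha256={}", URL_SAFE_NO_PAD.encode(digest))
}

fn main() {
    // Prints `sha256=` followed by a 43-character urlsafe-base64 digest.
    println!("{}", record_hash(b"example file contents"));
}
```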
## Test Plan
Build any wheel
```
uv build --wheel
```
Unpack the wheel
```
uvx wheel unpack dist/*.whl
```
Before this change, it fails with a hash mismatch. I confirmed with a
local build that the wheel can now be unpacked with the `wheel` command.
While I don't enable hash checking when syncing, presumably that would
also currently fail.
## Summary
I've written a reasonably-long comment to explain what's going on here.
We should fix this, but it's better to continue using a
potentially-stale distribution than to panic.
Closes https://github.com/astral-sh/uv/issues/15386.
---------
Co-authored-by: Zanie Blue <contact@zanie.dev>