Commit Graph

381 Commits

Author SHA1 Message Date
konsti e9c9e9718e
Use version in `RegistryIndex` (#543)
When building up the `RegistryIndex`, index by both package name and
version to fix #537.
2023-12-04 17:26:14 +01:00
Charlie Marsh 95b8316023
Preserve seed packages for non-Puffin-created virtualenvs (#535)
## Summary

This PR modifies the install plan to avoid removing seed packages if the
virtual environment was created by anyone other than Puffin.

Closes https://github.com/astral-sh/puffin/issues/414.

## Test Plan

- Ran: `virtualenv .venv`.
- Ran: `cargo run -p puffin-cli -- pip-sync
scripts/benchmarks/requirements.txt --verbose --no-cache`.
- Verified that `pip` et al. were not removed, and that the logging
included a message around preserving seed packages.
2023-12-04 09:31:00 -05:00
konsti 77b3921b7a
Fix cargo warning (#542)
It's odd that `dev-dependencies` don't default to `dependencies` for
workspace versions.
2023-12-04 11:10:36 +00:00
Charlie Marsh 0ac4254a7e
Enforce target and interpreter `requires-python` versions (#532)
## Summary

This PR modifies the behavior of our `--python-version` override in two
ways:

1. First, we always use the "real" interpreter in the source
distribution builder. I think this is correct. We don't need to use the
fake markers for recursive builds, because all we care about is the
top-level resolution, and we already assume that a single source
distribution will always return the same metadata regardless of its
build environment.
2. Second, we require that source distributions are compatible with
_both_ the "real" interpreter version and the marker environment. This
ensures that we don't try to build source distributions that are
compatible with our interpreter, but incompatible with the target
version.

Closes https://github.com/astral-sh/puffin/issues/407.
2023-12-04 11:27:36 +01:00
Charlie Marsh d96c18b3a8
Respect `requires` for non-`build-backend` PEP 517 builds (#530)
## Summary

This PR modifies `puffin-build` to be closer in behavior to
[pip](a15dd75d98/src/pip/_internal/pyproject.py (L53))
and
[build](de5b44b0c2/src/build/__init__.py (L94)).

Specifically, if a project contains a `[build-system]` field, but no
`build-backend`, we now perform a PEP 517 build (instead of using
`setup.py` directly) _and_ respect the `requires` of the
`[build-system]`. Without this change, we were failing to build source
distributions for packages like `ujson`.

Closes #527.

---------

Co-authored-by: konstin <konstin@mailbox.org>
2023-12-04 10:13:42 +00:00
konsti 6dc8ebcb90
Test interpreter cache invalidation (#540)
Add missing test for #529/#508.
2023-12-04 10:03:43 +00:00
konsti 811c088603
Improve wheel cache docs: Unzipping is lazy (#539)
Also sneaking `fs_err::rename(staging.into_path(), &normalized_path)?`
in here, for a better resolution of
https://github.com/astral-sh/puffin/pull/524#discussion_r1412459016
2023-12-04 10:01:35 +00:00
Charlie Marsh ee009ace86
Remove target directory prior to unzipping (#538)
## Summary

This is not a _fix_ for https://github.com/astral-sh/puffin/issues/537,
but it does ensure that we avoid hard-failing on what's really an
optimization and caching case.
2023-12-04 05:18:45 +00:00
Charlie Marsh fc20d01593
Ignore empty `VIRTUAL_ENV` variables (#536)
I'm not sure how my interpreter gets into this state, but it's certainly
wrong to respect these.
2023-12-04 04:53:26 +00:00
Charlie Marsh 3b55d0b295
Deduplicate various `.dist-info/METADATA` read implementations (#531)
Closes https://github.com/astral-sh/puffin/issues/484.
2023-12-03 21:29:00 -05:00
Charlie Marsh fa3107b173
Use full Python version when determining compatibility (#528)
## Summary

When resolving with Python 3.7.13, I was failing to find a matching
distribution that required Python 3.7.9 or later.
2023-12-04 01:02:24 +00:00
Charlie Marsh 2613382747
Invalidate interpreter marker cache (#529)
In a refactor, we lost the cache invalidation behavior for interpreter
markers, leading to stale interpreter errors for me when creating
environments with different Python versions. Specifically, the
modification timestamp used to be part of the _cache key_ when we used
`cacache`. Now it's not -- but it's stored within the cache. So we need
to validate the key after-the-fact.
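
As a rough illustration of that after-the-fact validation (the type name `CachedByTimestamp` is borrowed from the later #508 refactor; the logic here is a simplified sketch, not the actual implementation):

```rust
use std::fs;
use std::path::Path;
use std::time::SystemTime;

/// A cached value tagged with the mtime of the file it was derived from.
struct CachedByTimestamp<T> {
    timestamp: SystemTime,
    data: T,
}

impl<T> CachedByTimestamp<T> {
    /// Return the cached data only if the source file hasn't been modified
    /// since the entry was written; otherwise the entry is stale.
    fn fresh(&self, source: &Path) -> std::io::Result<Option<&T>> {
        let modified = fs::metadata(source)?.modified()?;
        Ok((modified == self.timestamp).then_some(&self.data))
    }
}

fn main() -> std::io::Result<()> {
    // Hypothetical usage: validate a cached interpreter query against the
    // interpreter executable's mtime before trusting it.
    let interpreter = Path::new("/usr/bin/python3");
    let entry = CachedByTimestamp {
        timestamp: fs::metadata(interpreter)?.modified()?,
        data: "cached marker environment",
    };
    println!("fresh: {}", entry.fresh(interpreter)?.is_some());
    Ok(())
}
```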
2023-12-03 22:44:43 +00:00
Charlie Marsh ee2fca3a48
Add CACHEDIR and .gitignore tags to cache directories (#526)
## Summary

Even if this will typically be in the user's application folder (rather
than a local directory), it's still a good practice.

Closes https://github.com/astral-sh/puffin/issues/280.
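
A minimal sketch of what tagging a cache directory involves (the signature line is the standard one from the CACHEDIR.TAG specification; the function name and paths are illustrative):

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Tag a cache directory so backup tools and git ignore it.
fn tag_cache_dir(cache: &Path) -> io::Result<()> {
    fs::create_dir_all(cache)?;
    // The CACHEDIR.TAG signature is defined by https://bford.info/cachedir/.
    fs::write(
        cache.join("CACHEDIR.TAG"),
        "Signature: 8a477f597d28d172789f06886806bc55\n",
    )?;
    // Ignore everything inside the cache, including the .gitignore itself.
    fs::write(cache.join(".gitignore"), "*\n")?;
    Ok(())
}

fn main() -> io::Result<()> {
    tag_cache_dir(Path::new("/tmp/puffin-cache-example"))
}
```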
2023-12-02 00:37:51 +00:00
konsti 9806901a16
Consolidate wheel caches (#524)
After this change, two wheel caches remain: `built-wheels-v0` and
`wheels-v0`, docs screenshots below. Each contains the wheel
metadata, the cache policy, and the zipped or unzipped wheels under the same name.

The zipped/unzipped strategy is as follows: In `pip-compile`, when we
build a wheel, we store it zipped. When `pip-sync` or a source dist
build in `pip-compile` needs to install the wheel, we unzip it, remove
the file, and replace it with the unzipped wheel.

This removes `WheelCache` and `UrlIndex` in favor of `Cache` plus
`WheelCache`. The non-built wheel cache now considers index urls and the
url for url wheels.

I'm unsure if we need the `Unzipper` type; this could just be a
function.

I moved `no_index` into `IndexUrls` and started using `IndexUrl` up to
the clap level.

I left a number of TODOs in the code, namely performing the actual
invalidation of unzipped wheels and making the `InstallPlan` understand
cache invalidation (i.e. uninstall wheels when their remote changed).


![image](https://github.com/astral-sh/puffin/assets/6826232/c4d45979-485b-4954-848d-fd3347ee2510)
2023-12-01 20:16:33 +00:00
konsti 4551994b7d
Clear built wheels when remote changed (#519)
Remove built wheels alongside their metadata when their index source
dist or url source dist changed. For git source dists, we currently
don't clear the previous build but use a new directory (not sure what's
right here - are there any generic cache GC approaches out there? I've
seen that e.g. spotify keeps its cache at 10GB max, but i also haven't
seen any reusable, well tested approaches for this). Path distributions
are unchanged (#478).

I like the structure of metadata alongside the wheel for cache
invalidation, i'll try to do that for `wheels-v0`/`wheel-metadata-v0`
too. (The unzipped wheels afaik currently lack cache invalidation when
the remote changed.) This should give is roughly the same structure for
wheel and built wheels and a very similar pattern of invalidation.
2023-12-01 14:56:47 -05:00
Zanie Blue 2a8544df9e
Use a custom pubgrub report formatter (#521)
Uses https://github.com/zanieb/pubgrub/pull/10 to drastically simplify
our reporter implementation. This will allow us to make use of upstream
improvements to the reporter e.g.
https://github.com/zanieb/pubgrub/pull/8 without multiple duplicative
pull requests.
2023-12-01 13:36:12 -06:00
Zanie Blue 5f1f207628
Recursively merge existing package directories on installation (#516)
Previously, when installing a package we would delete the target
directory before copying (or linking) the contents of the package.
However, this means that we do not properly support namespace packages
which can share a target directory. Instead, the last package to be
installed would override existing packages. Since we install packages
in parallel, this could result in a race condition where the target
directory already exists which is not allowed when using `clonefile`.
See example error in #515.
c7e63d2dce
provides a regression test for this — it fails on `main`.

Here, we implement a recursive merge when the target directory already
exists. Both packages will be installed into the same directory. We no
longer delete the target directory, which seems okay since we uninstall
packages before installing now.

When files conflict, we will still likely throw an error. The correct
behavior to implement in this case is unclear: if we just take "first
write wins" or "last write wins", we could end up with some files from
one package and some from another, resulting in two broken packages. A
possible solution here is to lock the target directories while copying.
2023-11-30 10:14:51 -06:00
konsti 6841c06e2d
Show error paths in install-wheel-rs (#514)
Ensure that we consistently show a path for all IO errors in
install-wheel-rs, either (preferred) through `fs_err` or, as a fallback,
through a custom error type. For zip reading errors, we rely on the caller to
add the name and/or location of the wheel.
2023-11-29 20:14:34 +01:00
konsti 2539f00952
Better tracing span (#513)
This will help us get better insight into what is happening and how long
it takes. I'm particularly interested in how long the different source
dist steps take (download, extract, build step(s)), to make better
decisions about their caching, which I want to report through tracing.

Example output:

```console
$ RUST_LOG=puffin=info cargo run --bin puffin -q -- pip-compile -v --no-cache scripts/requirements/all-kinds.in > /dev/null
  puffin_distribution::source_dist::download_source_dist filename="werkzeug-3.0.1.tar.gz", source_dist=werkzeug @ ff1904eb5e2853bf83db817a7dd53d/werkzeug-3.0.1.tar.gz
  puffin_dispatch::build_source source_dist="werkzeug @ ff1904eb5e2853bf83db817a7dd53d/werkzeug-3.0.1.tar.gz", subdirectory=None
    puffin_build::extract_archive sdist="werkzeug-3.0.1.tar.gz"
    puffin_dispatch::resolve requirements="flit-core <4"
    puffin_dispatch::install requirements="flit-core ==3.9.0", venv="/tmp/.tmpgZAEAh/.venv"
    puffin_build::get_requires_for_build_wheel name="build_wheel", python_version=3.12
    puffin_build::build package_id="werkzeug @ ff1904eb5e2853bf83db817a7dd53d/werkzeug-3.0.1.tar.gz"
      puffin_build::run_python_script name="build_wheel", python_version=3.12
  puffin_dispatch::build_source source_dist="pydantic-extra-types @ git+https://github.com/pydantic/pydantic-extra-types.git@843b753e9e8cb74e83cac55598719b39a4d5ef1f", subdirectory=None
    puffin_dispatch::resolve requirements="hatchling"
    puffin_dispatch::install requirements="hatchling ==1.18.0, trove-classifiers ==2023.11.22, editables ==0.5, pathspec ==0.11.2, pluggy ==1.3.0, packaging ==23.2", venv="/tmp/.tmpJjweUn/.venv"
    puffin_build::get_requires_for_build_wheel name="build_wheel", python_version=3.12
    puffin_build::build package_id="pydantic-extra-types @ git+https://github.com/pydantic/pydantic-extra-types.git@843b753e9e8cb74e83cac55598719b39a4d5ef1f"
      puffin_build::run_python_script name="build_wheel", python_version=3.12
  puffin_distribution::source_dist::download_source_dist filename="django-allauth-0.51.0.tar.gz", source_dist=django-allauth==0.51.0
  puffin_dispatch::build_source source_dist="django-allauth==0.51.0", subdirectory=None
    puffin_build::extract_archive sdist="django-allauth-0.51.0.tar.gz"
    puffin_dispatch::resolve requirements="wheel, setuptools, pip"
    puffin_dispatch::install requirements="setuptools ==69.0.2, pip ==23.3.1, wheel ==0.42.0", venv="/tmp/.tmplSZisu/.venv"
    puffin_build::build package_id="django-allauth==0.51.0"
 Resolved 35 packages in 11.71s
```
2023-11-29 10:34:18 +00:00
konsti 929df586fb
Skip tf-models-nightly in resolve-many dev script for now (#510)
`tf-models-nightly` has pathological backtracking behaviour; skip it for
now so we can benchmark the rest.
2023-11-28 18:25:32 +00:00
konsti d89fbeb642
Migrate interpreter query to custom caching (#508)
This removes the last usage of cacache by replacing it with a custom,
flat JSON cache keyed by the digest of the executable path.


![image](https://github.com/astral-sh/puffin/assets/6826232/8f777c4c-1f1b-4656-ba7b-002175270556)

A step towards #478. I've made `CachedByTimestamp<T>` generic over `T`
but intentionally not moved it to `puffin-cache` yet.
2023-11-28 17:14:59 +00:00
konsti 5435d44756
Introduce `Cache`, `CacheBucket` and `CacheEntry` (#507)
This is mostly a mechanical refactor that moves 80% of our code to the
same cache abstraction.

It introduces `Cache`, which abstracts away the path of the cache
and the temp dir drop and is passed throughout the codebase. To get a
specific cache bucket, you request your `CacheBucket` from
`Cache`. `CacheBucket` centralizes the names of all cache
buckets, moving them away from the string constants spread throughout
the crates.

Specifically for working with the `CachedClient`, there is a
`CacheEntry`. I'm not sure yet if that is a strict improvement over
`cache_dir: PathBuf, cache_file: String`; I may have to rotate that
later.

The interpreter cache moved into `interpreter-v0`.
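
A rough sketch of the shape of this abstraction (the bucket names mirror the ones mentioned in these commits, but the method signatures are illustrative, not the real puffin-cache API):

```rust
use std::path::{Path, PathBuf};

/// Illustrative bucket enum; the real type has more variants.
#[derive(Debug, Clone, Copy)]
enum CacheBucket {
    Interpreter,
    Simple,
    Wheels,
}

impl CacheBucket {
    fn dir_name(self) -> &'static str {
        match self {
            CacheBucket::Interpreter => "interpreter-v0",
            CacheBucket::Simple => "simple-v0",
            CacheBucket::Wheels => "wheels-v0",
        }
    }
}

/// Owns the cache root (and, for `--no-cache`, would own a temp dir).
struct Cache {
    root: PathBuf,
}

/// A directory plus file name inside a bucket, as used with the `CachedClient`.
struct CacheEntry {
    dir: PathBuf,
    file: String,
}

impl Cache {
    fn bucket(&self, bucket: CacheBucket) -> PathBuf {
        self.root.join(bucket.dir_name())
    }

    fn entry(&self, bucket: CacheBucket, dir: impl AsRef<Path>, file: impl Into<String>) -> CacheEntry {
        CacheEntry { dir: self.bucket(bucket).join(dir), file: file.into() }
    }
}

fn main() {
    let cache = Cache { root: PathBuf::from("/tmp/puffin-cache") };
    for bucket in [CacheBucket::Interpreter, CacheBucket::Simple, CacheBucket::Wheels] {
        println!("{bucket:?} -> {}", cache.bucket(bucket).display());
    }
    let entry = cache.entry(CacheBucket::Simple, "pypi", "flask.json");
    println!("entry -> {}", entry.dir.join(entry.file).display());
}
```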

We can use the `CacheBucket` page to document the cache structure in
each bucket:


![image](https://github.com/astral-sh/puffin/assets/6826232/b023fdfb-e34d-4c2d-8663-b5f73937a539)
2023-11-28 17:11:14 +00:00
Charlie Marsh 3d47d2b1da
Error when `ldd` is not in path (#506)
Closes https://github.com/astral-sh/puffin/issues/493.
2023-11-28 05:55:04 +00:00
konsti 8855f44b5f
Move simple index queries to `CachedClient` (#504)
Replaces the usage of `http-cache-reqwest` for simple index queries with
our custom cached client, removing `http-cache-reqwest` altogether.

The new cache paths are `<cache>/simple-v0/<index>/<package_name>.json`.
I could not test with a non-PyPI index since I'm not aware of any other
JSON indices (jax and torch are both HTML indices).

In a future step, we can transform the response into a
`HashMap<Version, {source_dists: Vec<(SourceDistFilename, File)>,
wheels: Vec<(WheelFilename, File)>}>` (independent of the Python version; this
cache is shared by all environments). This should speed up cache
deserialization a bit, since we no longer need to try both source dist and wheel
parsing and drop incompatible dists, and it should make building the
`VersionMap` simpler. We can speed this up even further by splitting it
into a version list and the info for each version. I'm mentioning this
because deserialization was a major bottleneck in the Rust part of the
old Python prototype.
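
A sketch of that proposed per-version layout using plain std types (`File`, `VersionFiles`, and the filename strings here are stand-ins for the real pypi-types and distribution-filename structs):

```rust
use std::collections::HashMap;

/// Stand-in for the per-file data returned by the simple API.
struct File {
    url: String,
}

/// All files for one version, split by distribution type up front.
#[derive(Default)]
struct VersionFiles {
    source_dists: Vec<(String, File)>,
    wheels: Vec<(String, File)>,
}

/// Keyed by version; independent of the Python version of any one environment.
type SimpleMetadata = HashMap<String, VersionFiles>;

fn main() {
    let mut metadata = SimpleMetadata::new();
    let entry = metadata.entry("3.0.0".to_string()).or_default();
    entry.wheels.push((
        "flask-3.0.0-py3-none-any.whl".to_string(),
        File { url: "https://example.invalid/flask-3.0.0-py3-none-any.whl".to_string() },
    ));
    entry.source_dists.push((
        "flask-3.0.0.tar.gz".to_string(),
        File { url: "https://example.invalid/flask-3.0.0.tar.gz".to_string() },
    ));
    let wheels = &metadata["3.0.0"].wheels;
    println!("{} wheel(s), first at {}", wheels.len(), wheels[0].1.url);
}
```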

Fixes #481
2023-11-28 00:11:03 +00:00
konsti 1142a14f4d
Check compatibility for cached unzipped wheels (#501)
**Motivation** Previously, we would install any wheel with the correct
package name and version from the cache, even if it didn't match the
current Python interpreter.

**Summary** The unzipped wheel cache for registries now uses the entire
wheel filename instead of just the name and version (`editables-0.5-py3-none-any.whl`
over `editables-0.5`).

Built wheels are not stored in the `wheels-v0` unzipped wheels cache
anymore. For each source distribution, there can be multiple built
wheels (with different compatibility tags), so I argue that we need a
different cache structure for them (follow-up PR).

For `all-kinds.in` with

```bash
rm -rf cache-all-kinds
virtualenv --clear -p 3.12 .venv
cargo run --bin puffin -- pip-sync --cache-dir cache-all-kinds target/all-kinds.txt
```

we get:

**Before**
```
cache-all-kinds/wheels-v0/
├── registry
│   ├── annotated_types-0.6.0
│   ├── asgiref-3.7.2
│   ├── blinker-1.7.0
│   ├── certifi-2023.11.17
│   ├── cffi-1.16.0
│   ├── [...]
│   ├── tzdata-2023.3
│   ├── urllib3-2.1.0
│   └── wheel-0.42.0
└── url
    ├── 4b8be67c801a7ecb
    │   ├── flask
    │   └── flask-3.0.0.dist-info
    ├── 6781bd6440ae72c2
    │   ├── werkzeug
    │   └── werkzeug-3.0.1.dist-info
    └── a67db8ed076e3814
        ├── pydantic_extra_types
        └── pydantic_extra_types-2.1.0.dist-info

48 directories, 0 files
```

**After**

```
cache-all-kinds/wheels-v0/
├── registry
│   ├── annotated_types-0.6.0-py3-none-any.whl
│   ├── asgiref-3.7.2-py3-none-any.whl
│   ├── blinker-1.7.0-py3-none-any.whl
│   ├── certifi-2023.11.17-py3-none-any.whl
│   ├── cffi-1.16.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
│   ├── [...]
│   ├── tzdata-2023.3-py2.py3-none-any.whl
│   ├── urllib3-2.1.0-py3-none-any.whl
│   └── wheel-0.42.0-py3-none-any.whl
└── url
    └── 4b8be67c801a7ecb
        └── flask-3.0.0-py3-none-any.whl

39 directories, 0 files
```

**Outlook** Part of #477 "Fix wheel caching". Further tasks:
* Replace the `CacheShard` with `WheelMetadataCache`, which handles URLs properly.
* Delete unzipped wheels when their remote wheel changed.
* Store built wheels next to the `metadata.json` in the source dist directory; delete built wheels when their source dist changed (different cache bucket, but it's the same problem of fixing wheel caching).

I'll make stacked PRs for those.
2023-11-27 16:03:58 -08:00
konsti 71295702bf
Reduce pip_sync test duplication (#502)
Move venv creation and the Python invocation that checks the installation
into a function instead of copy-pasting them every time.
2023-11-27 10:21:40 +00:00
Charlie Marsh afda835544
Avoid clone for `WheelMetadataCache` (#500)
This doesn't need to own the underlying data, which allows us to remove a
number of clones.
2023-11-25 23:33:59 +00:00
Charlie Marsh 3eb0a43995
Perform a single Git fetch when building source distributions (#499)
## Summary

We need to pass in the distribution with the "precise" URL to avoid
refetching.

## Test Plan

Ran `cargo run -p puffin-cli -- pip-compile requirements.in --verbose`
with `flask @ git+https://github.com/pallets/flask.git` and verified
that we only checked out Flask once.
2023-11-25 23:29:41 +00:00
konsti d54e780843
Source dist metadata refactor (#468)
## Summary and motivation

For a given source dist, we store the metadata of each wheel built
through it in `built-wheel-metadata-v0/pypi/<source dist
filename>/metadata.json`. During resolution, we check the cache status
of the source dist. If it is fresh, we check `metadata.json` for a
matching wheel. If there is one, we use that metadata; if there isn't, we
build one. If the source dist is stale, we build a wheel and overwrite
`metadata.json` with that single wheel. This PR thereby ties the local
built wheel metadata cache to the freshness of the remote source dist.
This functionality is available through `SourceDistCachedBuilder`.
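
A simplified sketch of that fresh-cache lookup (names like `find_or_build` and the trimmed-down `BuiltWheelMetadata` are illustrative, not the actual `SourceDistCachedBuilder` API):

```rust
use std::collections::HashMap;

/// Trimmed-down stand-in for the metadata we store per built wheel.
#[derive(Clone)]
struct BuiltWheelMetadata {
    name: String,
    version: String,
}

/// Given the `metadata.json` of a fresh source dist, reuse the metadata of a
/// previously built, compatible wheel if present; otherwise build a new wheel
/// and record it alongside the existing entries.
fn find_or_build(
    metadata_json: &mut HashMap<String, BuiltWheelMetadata>,
    is_compatible: impl Fn(&str) -> bool,
    build_wheel: impl FnOnce() -> (String, BuiltWheelMetadata),
) -> BuiltWheelMetadata {
    // A previously built wheel with a compatible filename: reuse its metadata.
    if let Some((_, metadata)) = metadata_json
        .iter()
        .find(|(filename, _)| is_compatible(filename.as_str()))
    {
        return metadata.clone();
    }
    // No match: build a wheel and add it to the cache.
    let (filename, metadata) = build_wheel();
    metadata_json.insert(filename, metadata.clone());
    metadata
}

fn main() {
    let mut metadata_json = HashMap::new();
    let metadata = find_or_build(
        &mut metadata_json,
        |filename| filename.ends_with("py3-none-any.whl"),
        || {
            (
                "django_allauth-0.51.0-py3-none-any.whl".to_string(),
                BuiltWheelMetadata { name: "django-allauth".into(), version: "0.51.0".into() },
            )
        },
    );
    println!("{} {}", metadata.name, metadata.version);
}
```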

`puffin_installer::Builder`, `puffin_installer::Downloader` and
`Fetcher` are removed; instead, there is now `FetchAndBuild`, which calls
into the also-new `SourceDistCachedBuilder`. `FetchAndBuild` is the new
main high-level abstraction: it spawns parallel fetching/building; for
wheel metadata it calls into the registry client, for wheel files it
fetches them, and for source dists it calls `SourceDistCachedBuilder`. It
handles locks around builds and, newly added, also inter-process file
locking for git operations.

Fetching and building source distributions now happens in parallel in
`pip-sync`, i.e. we don't have to wait for the largest wheel to be
downloaded to start building source distributions.

In a follow-up PR, I'll also clear built wheels when they've become
stale.

Another effect is that in a fully cached resolution, we need neither zip
reading nor email parsing.

Closes #473

## Source dist cache structure 

Entries by supported sources:
* `<build wheel metadata cache>/pypi/foo-1.0.0.zip/metadata.json`
* `<build wheel metadata cache>/<sha256(index-url)>/foo-1.0.0.zip/metadata.json`
* `<build wheel metadata cache>/url/<sha256(url)>/foo-1.0.0.zip/metadata.json`

But the URL filename does not need to be a valid source dist filename
(<https://github.com/search?q=path%3A**%2Frequirements.txt+master.zip&type=code>),
so it could also be the following, and we have to accept any string as the
filename:
* `<build wheel metadata cache>/url/<sha256(url)>/master.zip/metadata.json`

Example:
```text
# git source dist
pydantic-extra-types @ git+https://github.com/pydantic/pydantic-extra-types.git
# pypi source dist
django_allauth==0.51.0
# url source dist
werkzeug @ ff1904eb5e2853bf83db817a7dd53d/werkzeug-3.0.1.tar.gz
```
will be stored as
```text
built-wheel-metadata-v0
├── git
│   └── 5c56bc1c58c34c11
│       └── 843b753e9e8cb74e83cac55598719b39a4d5ef1f
│           └── metadata.json
├── pypi
│   └── django-allauth-0.51.0.tar.gz
│       └── metadata.json
└── url
    └── 6781bd6440ae72c2
        └── werkzeug-3.0.1.tar.gz
            └── metadata.json
```

The inside of a `metadata.json`:
```json
{
  "data": {
    "django_allauth-0.51.0-py3-none-any.whl": {
      "metadata-version": "2.1",
      "name": "django-allauth",
      "version": "0.51.0",
      ...
    }
  }
}
```
2023-11-24 17:47:58 +00:00
konsti 8d247fe95b
Add `Tags::from_interpreter` (#498)
Small refactoring
2023-11-24 11:36:01 +00:00
konsti f7976ce5cc
Write docs for distribution types (#495)
Document the type hierarchy, excluding the traits.
2023-11-23 13:39:39 +00:00
konsti 1c0e03f807
puffin_interpreter cleanup ahead of #235 (#492)
Preparing for #235, some refactoring to `puffin_interpreter`.

* Added a dedicated error type instead of anyhow
* `InterpreterInfo` -> `Interpreter`
* `detect_virtual_env` now returns an option so it can be chained for
#235
2023-11-23 08:57:33 +00:00
Charlie Marsh 9d35128840
Use Clippy lint table over Cargo config (#490)
Closes https://github.com/astral-sh/puffin/issues/482.
2023-11-22 15:10:27 +00:00
Charlie Marsh 443a0a9df2
Use a sparse Metadata 2.1 representation (#488)
This is an optimization to avoid parsing the entire Metadata 2.1 when we
only need a small subset of the fields.

Closes #175.
2023-11-22 13:25:35 +00:00
konsti a030a466e6
Error before download with no_build (#487)
This fixes a performance regression where, when `--no-build` was set,
the fetcher would still download the source dist only to error
afterwards.
2023-11-22 10:38:10 +00:00
konsti e1dafe7203
Allow applying multiple fixups for version specifiers (#486)
Allow applying multiple fixups for version specifiers, remove the
duplication from the code and add another test case.
2023-11-22 10:26:12 +00:00
konsti ff1100a1ab
Fixup for `>= '2.7'` (#485)
Fixup to allow parsing
https://pypi.org/simple/shellingham/?format=application/vnd.pypi.simple.v1+json
2023-11-22 10:00:12 +00:00
konsti 7c7daa8f83
Consistent Cargo.toml syntax (#483)
Remove the last Cargo.toml inconsistencies, see
1526b3458a (r1401083681).
Now all `[dependencies]` are workspace dependencies.
2023-11-22 08:34:08 +00:00
konsti 934e32ea98
Remove outdated todos (#476) 2023-11-21 13:57:40 +00:00
Charlie Marsh 17228ba04e
Add support for path dependencies (#471)
## Summary

This PR adds support for local path dependencies. The approach mostly
just falls out of our existing approach and infrastructure for Git and
URL dependencies.

Closes https://github.com/astral-sh/puffin/issues/436. (We'll open a
separate issue for editable installs.)

## Test Plan

Added `pip-compile` tests that pre-download a wheel or source
distribution, then install it via local path.
2023-11-21 11:49:42 +00:00
Charlie Marsh f1aa70d9d3
Refactor distribution types to return `Result` (#470)
## Summary

A variety of small refactors to the distribution types crate to (1)
return `Result` if we find an invalid wheel, rather than treating it as
a source distribution with a `.whl` suffix, and (2) DRY up some repeated
code around URLs.
2023-11-20 23:08:54 +00:00
konsti f0841cdb6e
Wheel metadata refactor (#462)
A consistent cache structure for remote wheel metadata:

 * `<wheel metadata cache>/pypi/foo-1.0.0-py3-none-any.json`
* `<wheel metadata
cache>/<digest(index-url)>/foo-1.0.0-py3-none-any.json`
* `<wheel metadata cache>/url/<digest(url)>/foo-1.0.0-py3-none-any.json`

The source dist caching will use a similar structure (#468).
2023-11-20 17:26:36 +01:00
konsti d3e9e1783f
Refactor lenient parsing (#467)
Deduplicate lenient parsing code between version specifiers and
Requirement. Use `warn_once!` since the warnings did show up multiple
times in my code. Fix the macro hygiene in `warn_once!`.
2023-11-20 15:35:38 +00:00
Charlie Marsh 60f595b469
Prefer future stream over `JoinSet` in downloader (#469)
This avoids introducing a static lifetime requirement and, in my
benchmarks, is even a little faster.
2023-11-20 13:23:30 +00:00
Charlie Marsh 8decb29bad
Use a dedicated error type for `puffin-distribution` (#466) 2023-11-20 11:38:27 +00:00
Charlie Marsh 342fc628f0
Store downloaded wheels in a local cache (#463)
This PR modifies the `Fetcher` to cache remote wheels that we _already_
store to-disk. We might read these again in the future, so we might as
well store them in the cache for consistency (rather than using a
temporary directory).
2023-11-20 11:32:22 +00:00
Charlie Marsh 35fd86631b
Unify distribution operations into a single crate (#460)
## Summary

This PR unifies the behavior that lived in the resolver's `distribution`
crates with the behaviors that were spread between the various structs
in the installer crate into a single `Fetcher` struct that is intended
to manage all interactions with distributions. Specifically, the
interface of this struct is such that it can access distribution
metadata, download distributions, return those downloads, etc., all with
a common cache.

Overall, this is mostly just DRYing up code that was repeated between
the two crates, and putting it behind a reasonable shared interface.
2023-11-20 11:22:52 +00:00
konsti 45d032dd7d
Fix wheel filename serialization (#465)
We need an underscore in the wheel filename, not a dash
2023-11-20 11:21:22 +00:00
konsti 46bb18f06e
Track file index (#452)
Track the index (or at least its URL) where we got a file from across
the source code.

Fixes #448
2023-11-20 08:48:16 +00:00
Charlie Marsh 6fd582f8b9
Rename `puffin-distribution` to `distribution-types` (#458)
## Summary

This crate only contains types, and I want to introduce a new crate for
all _operations_ on distributions, so this feels like a more natural
name given we also have `pypi-types`.
2023-11-20 09:40:26 +01:00
konsti 2fed14fdc6
Optional serde feature for distribution-filename (#461)
https://github.com/astral-sh/puffin/pull/459#discussion_r1398482972
2023-11-19 19:53:32 +00:00
konsti 255edf4445
Serde support for WheelFilename through str repr (#459)
I need this later; splitting it out for PR size.
2023-11-19 19:43:14 +00:00
Charlie Marsh 3df3110800
Use shortened `anyhow::Result` everywhere (#457) 2023-11-19 19:26:21 +00:00
Charlie Marsh 380030bb5c
Pin all resolver tests using `--exclude-newer` (#456)
Uses yesterday's date, which should make it much less likely that our
tests become stale over time.

Closes https://github.com/astral-sh/puffin/issues/449.
2023-11-19 15:10:57 +00:00
konsti 24f00f5a33
Create cache dir before canonicalize (#454)
`fs::canonicalize` fails when the directory does not exist, which I
missed in #453
2023-11-19 13:49:13 +00:00
konsti ab60233131
Use absolute cache paths (#453)
Previously, git requirements would fail when setting `--cache-dir`:

```console
$ cargo run --bin puffin -- pip-compile --cache-dir cache-all-kinds scripts/benchmarks/requirements/all-kinds.in
error: Failed to build distribution from URL: git+https://github.com/pydantic/pydantic-extra-types.git
  Caused by: Invalid path URL: cache-all-kinds/git-v0/db/b49ffcfeb6c2e9d8
```

The cause is using a relative rather than an absolute path, which `Url` needs; the solution is to turn the cache dir into an absolute path.

This never showed up in the tests since the tests use absolute temp dirs for everything.
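
A minimal sketch of the resulting behavior, folding in the follow-up from #454 (create the cache dir before canonicalizing); the helper name is illustrative:

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

/// Create the cache dir if needed, then turn it into an absolute path so it
/// can be embedded in URLs such as path URLs for git checkouts.
fn absolute_cache_dir(cache_dir: &Path) -> io::Result<PathBuf> {
    fs::create_dir_all(cache_dir)?;
    fs::canonicalize(cache_dir)
}

fn main() -> io::Result<()> {
    println!("{}", absolute_cache_dir(Path::new("cache-all-kinds"))?.display());
    Ok(())
}
```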
2023-11-19 13:32:32 +00:00
konsti dd4347980a
Fix tests: Certifi got an update (#451) 2023-11-19 12:10:54 +00:00
Zanie Blue 5dedfeb097
Fix import of `CacheArgs` in `puffin-cli` (#447)
```
error[E0432]: unresolved imports `puffin_cache::CacheArgs`, `puffin_cache::CacheDir`
  --> crates/puffin-cli/src/main.rs:11:20
   |
11 | use puffin_cache::{CacheArgs, CacheDir};
   |                    ^^^^^^^^^  ^^^^^^^^ no `CacheDir` in the root
   |                    |
   |                    no `CacheArgs` in the root
   |
note: found an item that was configured out
  --> /Users/mz/eng/src/astral-sh/puffin/crates/puffin-cache/src/lib.rs:7:15
   |
7  | pub use cli::{CacheArgs, CacheDir};
   |               ^^^^^^^^^
   = note: the item is gated behind the `clap` feature
note: found an item that was configured out
  --> /Users/mz/eng/src/astral-sh/puffin/crates/puffin-cache/src/lib.rs:7:26
   |
7  | pub use cli::{CacheArgs, CacheDir};
   |                          ^^^^^^^^
   = note: the item is gated behind the `clap` feature

For more information about this error, try `rustc --explain E0432`.
error: could not compile `puffin-cli` (bin "puffin") due to previous error
```
2023-11-17 15:35:01 -05:00
Charlie Marsh 03599d2bb4
Split resolver inputs into manifest and options (#446)
## Summary

This is a refactor to address a TODO in the build context whereby we
aren't respecting the resolution options in recursive resolutions. Now,
the options are split out from the resolution _manifest_, and shared
across the build context tree.
2023-11-17 18:53:53 +00:00
konsti 9db6644be6
Test requirements script (#382)
This script can compare different requirements between pip(-compile) and
puffin across Python versions, with debug and release builds.

Examples:
```shell
scripts/compare_with_pip/compare_with_pip.py
scripts/compare_with_pip/compare_with_pip.py -p 3.10
scripts/compare_with_pip/compare_with_pip.py --release -p 3.9 --target 'transformers[deepspeed-testing,dev-tensorflow]'
```

It found a bunch of bugs that have since been fixed, e.g. the lack of yanked package handling
and source dist handling, as well as #423, which currently makes up most of
the output.

Example output:
https://gist.github.com/konstin/9ccf8dc7c2dcca737bf705429ced4892

#443 should be merged first
2023-11-17 18:26:55 +00:00
konsti bf71e7adcf
Add graphviz output to puffin-dev resolve-cli (#443)
I added output in graphviz DOT format to `puffin-dev resolve-cli` to
help with debugging resolutions. This requires tracking the requested
ranges in the graph. I also fixed the direction of the graph.

 Output for `black`:

```dot
digraph {
    0 [ label="click\n8.1.7"]
    1 [ label="black\n23.11.0"]
    2 [ label="packaging\n23.2"]
    3 [ label="mypy-extensions\n1.0.0"]
    4 [ label="tomli\n2.0.1"]
    5 [ label="pathspec\n0.11.2"]
    6 [ label="typing-extensions\n4.8.0"]
    7 [ label="platformdirs\n4.0.0"]
    1 -> 0 [ label=">=8.0.0"]
    1 -> 3 [ label=">=0.4.3"]
    1 -> 5 [ label=">=0.9.0"]
    1 -> 4 [ label=">=1.1.0"]
    1 -> 6 [ label=">=4.0.1"]
    1 -> 2 [ label=">=22.0"]
    1 -> 7 [ label=">=2"]
}
```


![image](https://github.com/astral-sh/puffin/assets/6826232/4a440fcd-6248-4349-8e1a-c3e0363e42b1)

transformers:


![image](https://github.com/astral-sh/puffin/assets/6826232/a13a693c-a8c0-4a4f-95d9-3458431c678a)

jupyter:


![graphviz](https://github.com/astral-sh/puffin/assets/6826232/ef730033-6fd9-4ea9-ac93-8c874c19a101)
2023-11-17 18:16:24 +00:00
Zanie Blue d39e9b3499
Remove duplicate `cache_dir` argument from `puffin-dev resolve-cli` (#445) 2023-11-17 17:17:00 +00:00
Zanie Blue 221751487c
Use `UnusableDependencies` for URL dependency conflicts (#425)
Extends #424 with support for URL dependency incompatibilities.

Requires changes to `miette` to prevent URLs from being word wrapped;
accepted upstream in https://github.com/zkat/miette/pull/321
2023-11-17 08:28:12 -06:00
Charlie Marsh 2094680cdd
Add a `warn_user_once!` macro (#442)
Closes https://github.com/astral-sh/puffin/issues/429.
2023-11-17 02:34:06 +00:00
Charlie Marsh 25fcee0d9f
Avoid using incompatible wheels for source distribution-less packages (#441)
We're willing to use platform-incompatible wheels during resolution, to
quicken access to metadata... But we should avoid choosing an
incompatible wheel if the package lacks a source distribution since, in
that case, we definitely won't be able to install it.

Closes https://github.com/astral-sh/puffin/issues/439.
2023-11-17 02:10:54 +00:00
Charlie Marsh b1c29447df
Use `temp_dir` casing everywhere (#440) 2023-11-16 21:04:10 +00:00
konsti 1883dbdc21
Always¹ clear temporary directories (#437)
Always¹ clear the temporary directories we create.

* Clear source dist downloads: Previously, the temporary directories
would remain in the cache dir, now they are cleared properly
* Clear wheel file downloads: Delete the `.whl` file, we only need to
cache the unpacked wheel
* Consistent handling of cache arguments: Abstract the handling for CLI
cache args away, again making sure we remove the `--no-cache` temp dir.

There are no more `into_path()` calls that persist `TempDir`s that I
could find.

¹Assuming drop is run, and deleting the directory doesn't silently
error.
2023-11-16 20:49:48 +00:00
Zanie Blue 0d9d4f9fca
Add an `UnusableDependencies` incompatibility kind and use for conflicting versions (#424)
Addresses
https://github.com/astral-sh/puffin/issues/309#issuecomment-1792648969

Similar to #338 this throws an error when merging versions results in an
empty set. Instead of propagating that error, we capture it and return a
new dependency type of `Unusable`. Unusable dependencies are a new
incompatibility kind which includes an arbitrary "reason" string that we
present to the user. Adding a new incompatibility kind requires changes
to the vendored pubgrub crate.

We could use this same incompatibility kind for conflicting URLs as in
#284, which should allow the solver to backtrack to another valid version
instead of failing (see #425).

Unlike #383 this does not require changes to PubGrub's package mapping
model. I think in the long run we'll want PubGrub to accept multiple
versions per package to solve this specific issue, but we're interested
in it being merged upstream first. This pull request is just using the
issue as a simple case to explore adding a new incompatibility type.

We may or may not be able to convince them to add this new incompatibility
type upstream. As discussed in
https://github.com/pubgrub-rs/pubgrub/issues/152, we may want a more
general incompatibility kind instead which can be used for arbitrary
problems. An upstream pull request has been opened for discussion at
https://github.com/pubgrub-rs/pubgrub/pull/153.

Related to:
- https://github.com/pubgrub-rs/pubgrub/issues/152
- #338 
- #383

---------

Co-authored-by: konsti <konstin@mailbox.org>
2023-11-16 20:02:06 +00:00
Zanie Blue 832058dbba
Switch from vendored PubGrub to a fork (#438)
A fork will let us stay up to date with the upstream while replaying our
work on top of it.

I expect a similar workflow to the RustPython-Parser fork we maintained,
except that I wrote an automation to create tags for each commit on the
fork (https://github.com/zanieb/pubgrub/pull/2) so we do not need to
manually tag and document each commit.

To update with the upstream:

- Rebase our fork's `main` branch on top of the latest changes in
upstream's `dev` branch
- Force push, overwriting our `main` branch history
- Change the commit hash here to the last commit on `main` in our fork

Since we automatically tag each commit on the fork, we should never lose
the commits that are dropped from `main` during rebase.
2023-11-16 13:49:19 -06:00
konsti e41ec12239
Option to resolve at a fixed timestamp with `pip-compile --exclude-newer YYYY-MM-DD` (#434)
This works by filtering out files with a more recent upload time, so if
the index you use does not provide upload times, the results might be
inaccurate. PyPI provides upload times for all files. That is, the field
is non-nullable in the warehouse schema, but the simple API PEP does not
know about this field.

If you have only PyPI dependencies, this means deterministic,
reproducible(!) resolution. We could try doing the same for git repos,
but it doesn't seem worth the effort; I'd recommend pinning commits,
since git histories are arbitrarily malleable, and if you care about
reproducibility you should not use git dependencies but a custom index.

Timestamps are given either as RFC 3339 timestamps such as
`2006-12-02T02:07:43Z` or as UTC dates in the same format such as
`2006-12-02`. Dates are interpreted as including that day, i.e. until
midnight UTC that day. Accepting a date alone is needed to make this
ergonomic, and midnight seems like an ergonomic choice.
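
As a rough sketch of the filtering step (the `File` struct and the epoch-second cutoff below are illustrative stand-ins, not the real types):

```rust
/// Stand-in for the per-file data from the index; the real type carries more.
struct File {
    filename: String,
    upload_time: Option<u64>, // seconds since the Unix epoch
}

/// Drop every file uploaded after the cutoff. Files without an upload time are
/// kept, which is why results can be inaccurate for indexes that omit it.
fn exclude_newer(files: Vec<File>, cutoff: u64) -> Vec<File> {
    files
        .into_iter()
        .filter(|file| file.upload_time.map_or(true, |upload_time| upload_time <= cutoff))
        .collect()
}

fn main() {
    let files = vec![
        File { filename: "pandas-1.5.1.tar.gz".to_string(), upload_time: Some(1_666_000_000) },
        File { filename: "pandas-2.1.3.tar.gz".to_string(), upload_time: Some(1_700_000_000) },
    ];
    // Hypothetical cutoff roughly corresponding to `--exclude-newer 2022-11-16`.
    for file in exclude_newer(files, 1_668_556_800) {
        println!("kept {}", file.filename);
    }
}
```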

In action for `pandas`:

```console
$ target/debug/puffin pip-compile --exclude-newer 2023-11-16 target/pandas.in
Resolved 6 packages in 679ms
# This file was autogenerated by Puffin v0.0.1 via the following command:
#    target/debug/puffin pip-compile --exclude-newer 2023-11-16 target/pandas.in
numpy==1.26.2
    # via pandas
pandas==2.1.3
python-dateutil==2.8.2
    # via pandas
pytz==2023.3.post1
    # via pandas
six==1.16.0
    # via python-dateutil
tzdata==2023.3
    # via pandas
$ target/debug/puffin pip-compile --exclude-newer 2022-11-16 target/pandas.in
Resolved 5 packages in 655ms
# This file was autogenerated by Puffin v0.0.1 via the following command:
#    target/debug/puffin pip-compile --exclude-newer 2022-11-16 target/pandas.in
numpy==1.23.4
    # via pandas
pandas==1.5.1
python-dateutil==2.8.2
    # via pandas
pytz==2022.6
    # via pandas
six==1.16.0
    # via python-dateutil
$ target/debug/puffin pip-compile --exclude-newer 2021-11-16 target/pandas.in
Resolved 5 packages in 594ms
# This file was autogenerated by Puffin v0.0.1 via the following command:
#    target/debug/puffin pip-compile --exclude-newer 2021-11-16 target/pandas.in
numpy==1.21.4
    # via pandas
pandas==1.3.4
python-dateutil==2.8.2
    # via pandas
pytz==2021.3
    # via pandas
six==1.16.0
    # via python-dateutil
```
2023-11-16 19:46:17 +00:00
konsti 0d455ebd06
Always use puffin as binary name (#435)
It doesn't matter how exactly the user called puffin; the lockfile
should look the same either way.
2023-11-16 19:05:46 +01:00
konsti 751f7fa9c6
Improve PEP 691 compatibility (#428)
[PEP 691](https://peps.python.org/pep-0691/#project-detail) has slightly
different, more relaxed rules around file metadata. These changes are
now reflected in the `File` struct. This will make it easier to support
alternative indices.

I had expected that I'd need to introduce a separate type for that, so I'm
happy it's just two more `Option`s and an alias.

Part of #412
2023-11-16 19:03:44 +01:00
konsti 3a4988f999
Small test cleanup after #431 (#433)
Remove unused filters after #431
2023-11-16 11:22:47 +00:00
konsti c0339893e7
Use `sys.executable` as python root path (#431)
Previously, we were assuming that `which <python>` returns the path to
the Python executable. This is not true when using pyenv shims, which
are bash scripts. Instead, we have to use `sys.executable`. Luckily,
we're already querying the Python interpreter and can do it in that
pass.

We are also not allowed to cache the execution of the Python interpreter
through the shim because pyenv might change the target. As a heuristic,
we check whether `sys.executable`, the real binary, is the same as our
canonicalized `which` result.

---------

Co-authored-by: Zanie Blue <contact@zanie.dev>
2023-11-16 12:16:49 +01:00
Charlie Marsh d3caf9ae86
Choose most-compatible wheel in resolver and installer (#422)
## Summary

This PR implements logic to sort wheels by priority, where priority is
defined as preferring more "specific" wheels over less "specific"
wheels. For example, in the case of Black, my machine now selects
`black-23.11.0-cp311-cp311-macosx_11_0_arm64.whl`, whereas sorting by
lowest priority instead gives me `black-23.11.0-py3-none-any.whl`.
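
A sketch of the "most specific wheel wins" selection (the data and function names are illustrative; the real code works on parsed `WheelFilename`s and the full supported-tag set):

```rust
/// Score each candidate by the position of its tag in the ordered list of
/// supported tags (most specific first); the lowest score wins, and wheels
/// with unsupported tags are filtered out entirely.
fn best_wheel<'a>(candidates: &[(&'a str, &str)], supported_tags: &[&str]) -> Option<&'a str> {
    candidates
        .iter()
        .filter_map(|(filename, tag)| {
            let priority = supported_tags.iter().position(|supported| supported == tag)?;
            Some((priority, *filename))
        })
        .min_by_key(|(priority, _)| *priority)
        .map(|(_, filename)| filename)
}

fn main() {
    // Supported tags ordered from most to least specific, e.g. on an arm64 Mac.
    let supported = ["cp311-cp311-macosx_11_0_arm64", "py3-none-any"];
    let candidates = [
        ("black-23.11.0-py3-none-any.whl", "py3-none-any"),
        ("black-23.11.0-cp311-cp311-macosx_11_0_arm64.whl", "cp311-cp311-macosx_11_0_arm64"),
    ];
    println!("{:?}", best_wheel(&candidates, &supported));
}
```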

As part of this change, I've also modified the resolver to fallback to
using incompatible wheels when determining package metadata, if no
compatible wheels are available.

The `VersionMap` was also moved out of `resolver.rs` and into its own
file with a wrapper type, for clarity.

Closes https://github.com/astral-sh/puffin/issues/380.
Closes https://github.com/astral-sh/puffin/issues/421.
2023-11-15 18:22:11 +00:00
konsti 1147a4de14
Simpler and more resilient pip compile tests (#426)
The pip-compile tests now explicitly set their Python version, and `puffin
venv` now resolves e.g. `python3.12` correctly. The venv creation is
moved to a shared method.
2023-11-15 18:32:33 +01:00
Charlie Marsh a20325f184
Remove unnecessary clones in resolver (#420) 2023-11-13 21:00:52 -05:00
Charlie Marsh 13ba4405aa
Update README and crates manifest (#419) 2023-11-14 01:20:07 +00:00
konsti bacf1dc911
Filter out yanked files (#413)
Implement two behaviors for yanked versions:

* During `pip-compile`, yanked versions are filtered out entirely; we
currently treat them as if they don't exist. This leads to confusing
error messages because a version that does exist seems to have suddenly
disappeared.
* During `pip-sync`, we warn when we fetch a remote distribution and it
has been yanked. We currently don't warn on cached or installed
distributions that have been yanked.
2023-11-13 20:58:50 +00:00
Charlie Marsh 28ec4e79f0
Co-locate lenient requirement parsing (#418)
No behavior changes.
2023-11-13 15:46:21 -05:00
Charlie Marsh 437d4fb87e
Add trailing-comma fix to lenient requirements (#417)
Closes https://github.com/astral-sh/puffin/issues/408.
2023-11-13 20:20:57 +00:00
Charlie Marsh 582c94cec3
Add missing-dot fix to lenient requirements (#416)
Part of https://github.com/astral-sh/puffin/issues/408.
2023-11-13 20:17:01 +00:00
Charlie Marsh 0af2f7e39f
Use `anstream` to avoid writing colorized output (#415)
A more robust solution for avoiding colorized output: ensure we write
to `stdout` and `stderr` via the
[`anstream`](https://docs.rs/anstream/latest/anstream/) crate.

Closes https://github.com/astral-sh/puffin/issues/393.
2023-11-13 20:00:12 +00:00
konsti 76a41066ac
Filter out incompatible dists (#398)
Filter out source dists and wheels whose `requires-python` from the
simple API is incompatible with the current Python version.

This change showed an important problem: When we use a fake Python
version for resolving, building source distributions breaks down because
we can only build with versions we actually have.

This change became surprisingly big. The tests now require Python 3.7 to
be installed, but changing that would mean an even bigger change.

Fixes #388
2023-11-13 17:14:07 +01:00
konsti 81c9cd0d4a
Print url for bad json error (#409)
Split out from #382
2023-11-13 11:41:20 +00:00
konsti fa423b8751
Backend path is supported (#405)
The check is outdated now
2023-11-13 08:19:20 +00:00
Zanie Blue beadd3274a
Improve debug log version display (#403)
Follow-up to https://github.com/astral-sh/puffin/pull/346 for some debug
messages
2023-11-10 17:07:29 -06:00
Charlie Marsh 06b312de7e
Overwrite existing files when hardlinking (#402)
## Summary

Closes https://github.com/astral-sh/puffin/issues/390.

## Test Plan

Installed `jupyter_core==5.5.0`, then removed the `jupyter_core` and
`jupyter_core-5.5.0.dist-info` directories from my virtualenv manually,
but left `jupyter.py`. I then re-ran `puffin pip-compile`, and verified
that it errored on `main` but succeeded here.
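
A sketch of the overwrite-on-hardlink behavior (assumed logic, not the actual install-wheel-rs code):

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Link `src` to `dst`, removing a pre-existing `dst` first, since
/// `fs::hard_link` refuses to overwrite an existing file.
fn hard_link_overwriting(src: &Path, dst: &Path) -> io::Result<()> {
    match fs::hard_link(src, dst) {
        Err(err) if err.kind() == io::ErrorKind::AlreadyExists => {
            fs::remove_file(dst)?;
            fs::hard_link(src, dst)
        }
        other => other,
    }
}

fn main() -> io::Result<()> {
    fs::write("/tmp/puffin-link-src.txt", "example")?;
    hard_link_overwriting(
        Path::new("/tmp/puffin-link-src.txt"),
        Path::new("/tmp/puffin-link-dst.txt"),
    )
}
```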
2023-11-10 20:24:19 +00:00
Charlie Marsh 56a4b51eb6
Refactor hardlink fallback to use an enum (#401)
Makes an invalid state unrepresentable (`first_try_hard_linking = true`,
`use_copy_fallback = true`).
2023-11-10 15:18:51 -05:00
Andrew Gallant ff4d079dc9
pep508-rs: remove \x20 trailing whitespace hack (#400)
... we just remove the trailing whitespace from the input and that
resolves things.

Thanks @konstin for pointing this out!

Ref https://github.com/astral-sh/puffin/pull/399#discussion_r1389854823
2023-11-10 15:11:29 -05:00
Charlie Marsh e8108cb28b
Remove `__pycache__` directories when uninstalling (#397)
According to the [packaging
documentation](https://packaging.python.org/en/latest/specifications/binary-distribution-format/#binary-distribution-format),
"uninstallers should be smart enough to remove .pyc even if it is not
mentioned in RECORD". Previously, we weren't handling this case, so if
you installed via Puffin, then imported a file (to trigger bytecode
compilation), then uninstalled, we'd leave spare `__pycache__`
directories around.
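
A sketch of that extra cleanup step (the helper is illustrative, not the actual uninstaller code):

```rust
use std::fs;
use std::io;
use std::path::Path;

/// After removing a `.py` file listed in RECORD, also drop the `__pycache__`
/// directory next to it, since the interpreter creates it without updating RECORD.
fn remove_pycache_for(record_entry: &Path) -> io::Result<()> {
    if record_entry.extension().and_then(|ext| ext.to_str()) == Some("py") {
        if let Some(parent) = record_entry.parent() {
            let pycache = parent.join("__pycache__");
            if pycache.is_dir() {
                fs::remove_dir_all(&pycache)?;
            }
        }
    }
    Ok(())
}

fn main() -> io::Result<()> {
    remove_pycache_for(Path::new("/tmp/site-packages/example/module.py"))
}
```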

Closes https://github.com/astral-sh/puffin/issues/395.
2023-11-10 14:55:33 -05:00
Andrew Gallant 63f7f65190
change global allocator to jemalloc (and mimalloc on Windows) (#399)
This copies the allocator configuration used in the Ruff project. In
particular, this gives us an instant 10% win when resolving the top 1K
PyPI packages:

```console
$ hyperfine \
    "./target/profiling/puffin-dev-main resolve-many --cache-dir cache-docker-no-build --no-build pypi_top_8k_flat.txt --limit 1000 2> /dev/null" \
    "./target/profiling/puffin-dev resolve-many --cache-dir cache-docker-no-build --no-build pypi_top_8k_flat.txt --limit 1000 2> /dev/null"
Benchmark 1: ./target/profiling/puffin-dev-main resolve-many --cache-dir cache-docker-no-build --no-build pypi_top_8k_flat.txt --limit 1000 2> /dev/null
  Time (mean ± σ):     974.2 ms ±  26.4 ms    [User: 17503.3 ms, System: 2205.3 ms]
  Range (min … max):   943.5 ms … 1015.9 ms    10 runs

Benchmark 2: ./target/profiling/puffin-dev resolve-many --cache-dir cache-docker-no-build --no-build pypi_top_8k_flat.txt --limit 1000 2> /dev/null
  Time (mean ± σ):     883.1 ms ±  23.3 ms    [User: 14626.1 ms, System: 2542.2 ms]
  Range (min … max):   849.5 ms … 916.9 ms    10 runs

Summary
  './target/profiling/puffin-dev resolve-many --cache-dir cache-docker-no-build --no-build pypi_top_8k_flat.txt --limit 1000 2> /dev/null' ran
    1.10 ± 0.04 times faster than './target/profiling/puffin-dev-main resolve-many --cache-dir cache-docker-no-build --no-build pypi_top_8k_flat.txt --limit 1000 2> /dev/null'
```

I was moved to do this because I noticed `malloc`/`free` taking up a
fairly sizeable percentage of time during light profiling.
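
For reference, a sketch of this kind of allocator wiring; the exact crates and `cfg` conditions below are assumptions (mirroring Ruff's setup and requiring the `tikv-jemallocator` and `mimalloc` crates as dependencies), not copied from this commit:

```rust
// Use jemalloc everywhere except Windows, where mimalloc is used instead.
#[cfg(not(target_os = "windows"))]
#[global_allocator]
static GLOBAL: tikv_jemallocator::Jemalloc = tikv_jemallocator::Jemalloc;

#[cfg(target_os = "windows")]
#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;

fn main() {
    // Every allocation in the program now goes through the selected allocator.
    let resolutions: Vec<String> = (0..1000).map(|i| format!("package-{i}")).collect();
    println!("allocated {} strings", resolutions.len());
}
```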

As is becoming a pattern, it will be easier to review this
commit-by-commit.

Ref #396 (wouldn't call this issue fixed)

-----

I did also try adding a `smallvec` optimization to the
`Version::release` field, but it didn't bear any fruit. I still think
there is more to explore since the results I observed don't quite line
up with what I expect. (So probably either my mental model is off or my
measurement process is flawed.) You can see that attempt with a little
more explanation here:
f9528b4ecd

In the course of adding the `smallvec` optimization, I also shrunk the
`Version` fields from a `usize` to a `u32`. They should at least be a
fixed size integer since version numbers aren't used to index memory,
and I shrunk it to `u32` since it seems reasonable to assume that all
version numbers will be smaller than `2^32`.
2023-11-10 14:48:59 -05:00
konsti d8408b1783
Add source to failing metadata parsing (#387)
Before:
```
cargo run --bin puffin-dev -q -- resolve-cli "transformers[accelerate, agents, all, audio, codecarbon, deepspeed, deepspeed-testing, dev, dev-tensorflow, dev-torch, docs, docs_specific, flax, flax-speech, ftfy, integrations, ja, modelcreation, onnx, onnxruntime, optuna, quality, ray, retrieval, sagemaker, sentencepiece, serving, sigopt, sklearn, speech, testing, tf, tf-cpu, tf-speech, timm, tokenizers, torch, torch-speech, torch-vision, torchhub, video, vision]"
puffin-dev failed
  Caused by: No solution found when resolving: transformers[accelerate,agents,all,audio,codecarbon,deepspeed,deepspeed-testing,dev,dev-tensorflow,dev-torch,docs,docs-specific,flax,flax-speech,ftfy,integrations,ja,modelcreation,onnx,onnxruntime,optuna,quality,ray,retrieval,sagemaker,sentencepiece,serving,sigopt,sklearn,speech,testing,tf,tf-cpu,tf-speech,timm,tokenizers,torch,torch-speech,torch-vision,torchhub,video,vision]
  Caused by: Not a valid package or extra name: ".none". Names must start and end with a letter or digit and may only contain -, _, ., and alphanumeric characters
```
After:
```
cargo run --bin puffin-dev -q -- resolve-cli "transformers[accelerate, agents, all, audio, codecarbon, deepspeed, deepspeed-testing, dev, dev-tensorflow, dev-torch, docs, docs_specific, flax, flax-speech, ftfy, integrations, ja, modelcreation, onnx, onnxruntime, optuna, quality, ray, retrieval, sagemaker, sentencepiece, serving, sigopt, sklearn, speech, testing, tf, tf-cpu, tf-speech, timm, tokenizers, torch, torch-speech, torch-vision, torchhub, video, vision]"
puffin-dev failed
  Caused by: No solution found when resolving: transformers[accelerate,agents,all,audio,codecarbon,deepspeed,deepspeed-testing,dev,dev-tensorflow,dev-torch,docs,docs-specific,flax,flax-speech,ftfy,integrations,ja,modelcreation,onnx,onnxruntime,optuna,quality,ray,retrieval,sagemaker,sentencepiece,serving,sigopt,sklearn,speech,testing,tf,tf-cpu,tf-speech,timm,tokenizers,torch,torch-speech,torch-vision,torchhub,video,vision]
  Caused by: Couldn't parse metadata in fastapi-0.10.1-py3-none-any.whl (97ac91cb7cd2baab1a50b0c7a17d83/fastapi-0.10.1-py3-none-any.whl)
  Caused by: Not a valid package or extra name: ".none". Names must start and end with a letter or digit and may only contain -, _, ., and alphanumeric characters
```
2023-11-10 18:33:49 +00:00
Charlie Marsh b3edf7c2b2
Delete any directories listed in the RECORD file (#394)
## Summary

It looks like, when you install `pip`, it includes a bunch of
`__pycache__` directories in the RECORD file (although these directories
don't exist until you run `pip`). Our uninstaller assumed that the
RECORD file only contained _files_.

Closes https://github.com/astral-sh/puffin/issues/389.
2023-11-10 18:17:52 +00:00
Charlie Marsh 6a15950cb5
Rename `Distribution` to `Dist` in all structs and traits (#384)
We tend to avoid abbreviations, but this one is just so long and
absolutely ubiquitous.
2023-11-10 14:55:11 +00:00
konsti 5cef40d87a
Add proper caching for pypi metadata fetching kinds (#368)
I intend this to become the main form of caching for Puffin: You can
make HTTP requests, you transform the data to what you really need, you
have control over the cache key, and the cache is always JSON (or
anything else much faster we want to replace it with, as long as it's
serde!)
2023-11-10 11:03:40 +00:00
konsti d1b57acaa8
Implement PEP 517 backend-path (#385)
Closes #192
2023-11-10 11:54:23 +01:00
Charlie Marsh a148f9d0be
Refactor distribution types to adhere to a clear hierarchy (#369)
## Summary

This PR refactors our `RemoteDistribution` type such that it now follows
a clear hierarchy that matches the actual variants, and encodes the
differences between source and built distributions:

```rust
pub enum Distribution {
    Built(BuiltDistribution),
    Source(SourceDistribution),
}

pub enum BuiltDistribution {
    Registry(RegistryBuiltDistribution),
    DirectUrl(DirectUrlBuiltDistribution),
}

pub enum SourceDistribution {
    Registry(RegistrySourceDistribution),
    DirectUrl(DirectUrlSourceDistribution),
    Git(GitSourceDistribution),
}

/// A built distribution (wheel) that exists in a registry, like `PyPI`.
pub struct RegistryBuiltDistribution {
    pub name: PackageName,
    pub version: Version,
    pub file: File,
}

/// A built distribution (wheel) that exists at an arbitrary URL.
pub struct DirectUrlBuiltDistribution {
    pub name: PackageName,
    pub url: Url,
}

/// A source distribution that exists in a registry, like `PyPI`.
pub struct RegistrySourceDistribution {
    pub name: PackageName,
    pub version: Version,
    pub file: File,
}

/// A source distribution that exists at an arbitrary URL.
pub struct DirectUrlSourceDistribution {
    pub name: PackageName,
    pub url: Url,
}

/// A source distribution that exists in a Git repository.
pub struct GitSourceDistribution {
    pub name: PackageName,
    pub url: Url,
}
```

Most of the PR just stems downstream from this change. There are no
behavioral changes, so I'm largely relying on lint, tests, and the
compiler for correctness.
2023-11-10 02:45:41 +00:00
Andrew Gallant 33c0901a28
distribution-filename: speed up is_compatible (#367)
This PR tweaks the representation of `Tags` in order to offer a
faster implementation of `WheelFilename::is_compatible`. We now use a
nested map of tags that lets us avoid looping over every supported
platform tag. As the code comments suggest, that is the essential gain.
We still do not mind looping over the tags in each wheel name since they
tend to be quite small. And pushing our thumb on that side of things can
make things worse overall since it would likely slow down WheelFilename
construction itself.
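
A sketch of the nested-map idea (tag values and the exact map shape are illustrative; the real `Tags` type differs in detail):

```rust
use std::collections::HashMap;

/// Supported tags indexed as python tag -> abi tag -> platform tag -> priority,
/// so checking a wheel's (small) tag set is a few hash lookups instead of a
/// scan over every supported platform tag.
struct Tags {
    map: HashMap<String, HashMap<String, HashMap<String, usize>>>,
}

impl Tags {
    fn new(supported: &[(&str, &str, &str)]) -> Self {
        let mut map: HashMap<String, HashMap<String, HashMap<String, usize>>> = HashMap::new();
        // Lower priority value = more specific/preferred tag.
        for (priority, (py, abi, platform)) in supported.iter().enumerate() {
            map.entry((*py).to_string())
                .or_default()
                .entry((*abi).to_string())
                .or_default()
                .insert((*platform).to_string(), priority);
        }
        Self { map }
    }

    fn is_compatible(&self, py: &str, abi: &str, platform: &str) -> bool {
        self.map
            .get(py)
            .and_then(|abis| abis.get(abi))
            .and_then(|platforms| platforms.get(platform))
            .is_some()
    }
}

fn main() {
    let tags = Tags::new(&[
        ("cp311", "cp311", "macosx_11_0_arm64"),
        ("py3", "none", "any"),
    ]);
    assert!(tags.is_compatible("py3", "none", "any"));
    assert!(!tags.is_compatible("cp311", "cp311", "manylinux_2_17_x86_64"));
    println!("ok");
}
```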

For micro-benchmarks, we improve considerably for compatibility
checking:

```console
$ critcmp base test3
group                                                  base                                   test3
-----                                                  ----                                   -----
build_platform_tags/burntsushi-archlinux               1.00     46.2±0.28µs        ? ?/sec    2.48    114.8±0.45µs        ? ?/sec
wheelname_parsing/flyte-long-compatible                1.00    624.8±3.31ns   174.0 MB/sec    1.01    629.4±4.30ns   172.7 MB/sec
wheelname_parsing/flyte-long-incompatible              1.00    743.6±4.23ns   165.4 MB/sec    1.00    746.9±4.62ns   164.7 MB/sec
wheelname_parsing/flyte-short-compatible               1.00    526.7±4.76ns    54.3 MB/sec    1.01    530.2±5.81ns    54.0 MB/sec
wheelname_parsing/flyte-short-incompatible             1.00    540.4±4.93ns    60.0 MB/sec    1.01    545.7±5.31ns    59.4 MB/sec
wheelname_parsing_failure/flyte-long-extension         1.00     13.6±0.13ns     3.2 GB/sec    1.01     13.7±0.14ns     3.2 GB/sec
wheelname_parsing_failure/flyte-short-extension        1.00     14.0±0.20ns  1160.4 MB/sec    1.01     14.1±0.14ns  1146.5 MB/sec
wheelname_tag_compatibility/flyte-long-compatible      11.33   159.8±2.79ns   680.5 MB/sec    1.00     14.1±0.23ns     7.5 GB/sec
wheelname_tag_compatibility/flyte-long-incompatible    237.60  1671.8±37.99ns  73.6 MB/sec    1.00      7.0±0.08ns    17.1 GB/sec
wheelname_tag_compatibility/flyte-short-compatible     16.07   223.5±8.60ns   128.0 MB/sec    1.00     13.9±0.30ns     2.0 GB/sec
wheelname_tag_compatibility/flyte-short-incompatible   149.83  628.3±2.13ns    51.6 MB/sec    1.00      4.2±0.10ns     7.6 GB/sec
```

We do regress slightly on the time it takes for `Tags::new` to run, but
this is somewhat expected. And in absolute terms, 114us is perfectly
acceptable given that it's only executed ~once for each `puffin`
invocation.

Ad hoc benchmarks indicate an overall 25% perf improvement in `puffin
pip-compile` times. This roughly corresponds with how much time
`is_compatible` was taking. Indeed, profiling confirms that it has
virtually disappeared from the profile.

Fixes #157
2023-11-09 09:01:03 -05:00
konsti bdb89b4072
Allow setting num tasks in puffin-dev parallel resolve (#374) 2023-11-09 13:12:03 +00:00