mirror of https://github.com/astral-sh/ruff
Merge remote-tracking branch 'origin/main' into gggg

* origin/main: (22 commits)
  - [ty] Allow gradual lower/upper bounds in a constraint set (#21957)
  - [ty] disallow explicit specialization of type variables themselves (#21938)
  - [ty] Improve diagnostics for unsupported binary operations and unsupported augmented assignments (#21947)
  - [ty] update implicit root docs (#21955)
  - [ty] Enable even more goto-definition on inlay hints (#21950)
  - Document known lambda formatting deviations from Black (#21954)
  - [ty] fix hover type on named expression target (#21952)
  - Bump benchmark dependencies (#21951)
  - Keep lambda parameters on one line and parenthesize the body if it expands (#21385)
  - [ty] Improve resolution of absolute imports in tests (#21817)
  - [ty] Support `__all__ += submodule.__all__`
  - [ty] Change frequency of invalid `__all__` debug message
  - [ty] Add `KnownUnion::to_type()` (#21948)
  - [ty] Classify `cls` as class parameter (#21944)
  - [ty] Stabilize rename (#21940)
  - [ty] Ignore `__all__` for document and workspace symbol requests
  - [ty] Attach db to background request handler task (#21941)
  - [ty] Fix outdated version in publish diagnostics after `didChange` (#21943)
  - [ty] avoid fixpoint unioning of types containing current-cycle Divergent (#21910)
  - [ty] improve bad specialization results & error messages (#21840)
  - ...
This commit is contained in:

commit c94fbe20a2

42	CHANGELOG.md
CHANGELOG.md:

```diff
@@ -1,5 +1,47 @@
 # Changelog
 
+## 0.14.9
+
+Released on 2025-12-11.
+
+### Preview features
+
+- \[`ruff`\] New `RUF100` diagnostics for unused range suppressions ([#21783](https://github.com/astral-sh/ruff/pull/21783))
+- \[`pylint`\] Detect subclasses of builtin exceptions (`PLW0133`) ([#21382](https://github.com/astral-sh/ruff/pull/21382))
+
+### Bug fixes
+
+- Fix comment placement in lambda parameters ([#21868](https://github.com/astral-sh/ruff/pull/21868))
+- Skip over trivia tokens after re-lexing ([#21895](https://github.com/astral-sh/ruff/pull/21895))
+- \[`flake8-bandit`\] Fix false positive when using non-standard `CSafeLoader` path (S506). ([#21830](https://github.com/astral-sh/ruff/pull/21830))
+- \[`flake8-bugbear`\] Accept immutable slice default arguments (`B008`) ([#21823](https://github.com/astral-sh/ruff/pull/21823))
+
+### Rule changes
+
+- \[`pydocstyle`\] Suppress `D417` for parameters with `Unpack` annotations ([#21816](https://github.com/astral-sh/ruff/pull/21816))
+
+### Performance
+
+- Use `memchr` for computing line indexes ([#21838](https://github.com/astral-sh/ruff/pull/21838))
+
+### Documentation
+
+- Document `*.pyw` is included by default in preview ([#21885](https://github.com/astral-sh/ruff/pull/21885))
+- Document range suppressions, reorganize suppression docs ([#21884](https://github.com/astral-sh/ruff/pull/21884))
+- Update mkdocs-material to 9.7.0 (Insiders now free) ([#21797](https://github.com/astral-sh/ruff/pull/21797))
+
+### Contributors
+
+- [@Avasam](https://github.com/Avasam)
+- [@MichaReiser](https://github.com/MichaReiser)
+- [@charliermarsh](https://github.com/charliermarsh)
+- [@amyreese](https://github.com/amyreese)
+- [@phongddo](https://github.com/phongddo)
+- [@prakhar1144](https://github.com/prakhar1144)
+- [@mahiro72](https://github.com/mahiro72)
+- [@ntBre](https://github.com/ntBre)
+- [@LoicRiegel](https://github.com/LoicRiegel)
+
 ## 0.14.8
 
 Released on 2025-12-04.
```
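The `memchr` change listed under Performance speeds up locating newlines when building a line index. What a line index buys you can be sketched in pure Python (illustrative only, not ruff's Rust implementation; the helper names are invented):

```python
from bisect import bisect_right


def line_index(source: str) -> list[int]:
    """Offset of the first character of each line."""
    starts = [0]
    for i, ch in enumerate(source):
        if ch == "\n":
            # memchr is the fast way to find these '\n' positions in Rust.
            starts.append(i + 1)
    return starts


def offset_to_position(starts: list[int], offset: int) -> tuple[int, int]:
    """Map an offset into the source to a 1-based (line, column) pair."""
    line = bisect_right(starts, offset) - 1
    return line + 1, offset - starts[line] + 1


starts = line_index("a\nbc\ndef\n")
assert starts == [0, 2, 5, 9]
assert offset_to_position(starts, 0) == (1, 1)
assert offset_to_position(starts, 3) == (2, 2)
assert offset_to_position(starts, 5) == (3, 1)
```

The index is built once per file; every subsequent offset-to-position query is then a binary search.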
Cargo.lock:

```diff
@@ -2860,7 +2860,7 @@ dependencies = [
 
 [[package]]
 name = "ruff"
-version = "0.14.8"
+version = "0.14.9"
 dependencies = [
  "anyhow",
  "argfile",
@@ -3118,7 +3118,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_linter"
-version = "0.14.8"
+version = "0.14.9"
 dependencies = [
  "aho-corasick",
  "anyhow",
@@ -3475,7 +3475,7 @@ dependencies = [
 
 [[package]]
 name = "ruff_wasm"
-version = "0.14.8"
+version = "0.14.9"
 dependencies = [
  "console_error_panic_hook",
  "console_log",
```
README:

````diff
@@ -147,8 +147,8 @@ curl -LsSf https://astral.sh/ruff/install.sh | sh
 powershell -c "irm https://astral.sh/ruff/install.ps1 | iex"
 
 # For a specific version.
-curl -LsSf https://astral.sh/ruff/0.14.8/install.sh | sh
-powershell -c "irm https://astral.sh/ruff/0.14.8/install.ps1 | iex"
+curl -LsSf https://astral.sh/ruff/0.14.9/install.sh | sh
+powershell -c "irm https://astral.sh/ruff/0.14.9/install.ps1 | iex"
 ```
 
 You can also install Ruff via [Homebrew](https://formulae.brew.sh/formula/ruff), [Conda](https://anaconda.org/conda-forge/ruff),
@@ -181,7 +181,7 @@ Ruff can also be used as a [pre-commit](https://pre-commit.com/) hook via [`ruff
 ```yaml
 - repo: https://github.com/astral-sh/ruff-pre-commit
   # Ruff version.
-  rev: v0.14.8
+  rev: v0.14.9
   hooks:
     # Run the linter.
     - id: ruff-check
````
Crate manifests (`Cargo.toml` for `ruff` and `ruff_linter`) get the same version bump:

```diff
@@ -1,6 +1,6 @@
 [package]
 name = "ruff"
-version = "0.14.8"
+version = "0.14.9"
 publish = true
 authors = { workspace = true }
 edition = { workspace = true }
@@ -1,6 +1,6 @@
 [package]
 name = "ruff_linter"
-version = "0.14.8"
+version = "0.14.9"
 publish = false
 authors = { workspace = true }
 edition = { workspace = true }
```
Formatter test fixture for lambda expressions (new cases; indentation restored from the flattened page):

```diff
@@ -125,6 +125,13 @@ lambda a, /, c: a
     *x: x
 )
 
+(
+    lambda
+    # comment
+    *x,
+    **y: x
+)
+
 (
     lambda
     # comment 1
@@ -196,6 +203,17 @@ lambda: ( # comment
     x
 )
 
+(
+    lambda # 1
+    # 2
+    x, # 3
+    # 4
+    y
+    : # 5
+    # 6
+    x
+)
+
 (
     lambda
    x,
@@ -204,6 +222,71 @@ lambda: ( # comment
     z
 )
 
+
+# Leading
+lambda x: (
+    lambda y: lambda z: x
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + y
+    + z # Trailing
+) # Trailing
+
+
+# Leading
+lambda x: lambda y: lambda z: [
+    x,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    y,
+    z
+] # Trailing
+# Trailing
+
+lambda self, araa, kkkwargs=aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs), e=1, f=2, g=2: d
+
 # Regression tests for https://github.com/astral-sh/ruff/issues/8179
@@ -228,6 +311,441 @@ def a():
         g = 10
     )
 
+def a():
+    return b(
+        c,
+        d,
+        e,
+        f=lambda self, *args, **kwargs: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(
+            *args, **kwargs
+        ) + 1,
+    )
+
+# Additional ecosystem cases from https://github.com/astral-sh/ruff/pull/21385
+class C:
+    def foo():
+        mock_service.return_value.bucket.side_effect = lambda name: (
+            source_bucket
+            if name == source_bucket_name
+            else storage.Bucket(mock_service, destination_bucket_name)
+        )
+
+class C:
+    function_dict: Dict[Text, Callable[[CRFToken], Any]] = {
+        CRFEntityExtractorOptions.POS2: lambda crf_token: crf_token.pos_tag[:2]
+        if crf_token.pos_tag is not None
+        else None,
+    }
+
+name = re.sub(r"[^\x21\x23-\x5b\x5d-\x7e]...............", lambda m: f"\\{m.group(0)}", p["name"])
+
+def foo():
+    if True:
+        if True:
+            return (
+                lambda x: np.exp(cs(np.log(x.to(u.MeV).value))) * u.MeV * u.cm**2 / u.g
+            )
+
+class C:
+    _is_recognized_dtype: Callable[[DtypeObj], bool] = lambda x: lib.is_np_dtype(
+        x, "M"
+    ) or isinstance(x, DatetimeTZDtype)
+
+class C:
+    def foo():
+        if True:
+            transaction_count = self._query_txs_for_range(
+                get_count_fn=lambda from_ts, to_ts, _chain_id=chain_id: db_evmtx.count_transactions_in_range(
+                    chain_id=_chain_id,
+                    from_ts=from_ts,
+                    to_ts=to_ts,
+                ),
+            )
+
+transaction_count = self._query_txs_for_range(
+    get_count_fn=lambda from_ts, to_ts, _chain_id=chain_id: db_evmtx.count_transactions_in_range[_chain_id, from_ts, to_ts],
+)
+
+def ddb():
+    sql = (
+        lambda var, table, n=N: f"""
+    CREATE TABLE {table} AS
+    SELECT ROW_NUMBER() OVER () AS id, {var}
+    FROM (
+        SELECT {var}
+        FROM RANGE({n}) _ ({var})
+        ORDER BY RANDOM()
+    )
+    """
+    )
+
+long_assignment_target.with_attribute.and_a_slice[with_an_index] = ( # 1
+    # 2
+    lambda x, y, z: # 3
+    # 4
+    x + y + z # 5
+    # 6
+)
+
+long_assignment_target.with_attribute.and_a_slice[with_an_index] = (
+    lambda x, y, z: x + y + z
+)
+
+long_assignment_target.with_attribute.and_a_slice[with_an_index] = lambda x, y, z: x + y + z
+
+very_long_variable_name_x, very_long_variable_name_y = lambda a: a + some_very_long_expression, lambda b: b * another_very_long_expression_here
+
+very_long_variable_name_for_result += lambda x: very_long_function_call_that_should_definitely_be_parenthesized_now(x, more_args, additional_parameters)
+
+
+if 1:
+    if 2:
+        if 3:
+            if self.location in EVM_EVMLIKE_LOCATIONS and database is not None:
+                exported_dict["notes"] = EVM_ADDRESS_REGEX.sub(
+                    repl=lambda matched_address: self._maybe_add_label_with_address(
+                        database=database,
+                        matched_address=matched_address,
+                    ),
+                    string=exported_dict["notes"],
+                )
+
+class C:
+    def f():
+        return dict(
+            filter(
+                lambda intent_response: self.is_retrieval_intent_response(
+                    intent_response
+                ),
+                self.responses.items(),
+            )
+        )
+
+@pytest.mark.parametrize(
+    "op",
+    [
+        # Not fluent
+        param(
+            lambda left, right: (
+                ibis.timestamp("2017-04-01")
+            ),
+        ),
+        # These four are fluent and fit on one line inside the parenthesized
+        # lambda body
+        param(
+            lambda left, right: (
+                ibis.timestamp("2017-04-01").cast(dt.date)
+            ),
+        ),
+        param(
+            lambda left, right: (
+                ibis.timestamp("2017-04-01").cast(dt.date).between(left, right)
+            ),
+        ),
+        param(lambda left, right: ibis.timestamp("2017-04-01").cast(dt.date)),
+        param(lambda left, right: ibis.timestamp("2017-04-01").cast(dt.date).between(left, right)),
+        # This is too long on one line in the lambda body and gets wrapped
+        # inside the body.
+        param(
+            lambda left, right: (
+                ibis.timestamp("2017-04-01").cast(dt.date).between(left, right).between(left, right)
+            ),
+        ),
+    ],
+)
+def test_string_temporal_compare_between(con, op, left, right): ...
+
+[
+    (
+        lambda eval_df, _: MetricValue(
+            scores=eval_df["prediction"].tolist(),
+            aggregate_results={"prediction_sum": sum(eval_df["prediction"])},
+        )
+    ),
+]
+
+# reuses the list parentheses
+lambda xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz: [xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz]
+
+# adds parentheses around the body
+lambda xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz: xxxxxxxxxxxxxxxxxxxx + yyyyyyyyyyyyyyyyyyyy + zzzzzzzzzzzzzzzzzzzz
+
+# removes parentheses around the body
+lambda xxxxxxxxxxxxxxxxxxxx: (xxxxxxxxxxxxxxxxxxxx + 1)
+
+mapper = lambda x: dict_with_default[np.nan if isinstance(x, float) and np.isnan(x) else x]
+
+lambda x, y, z: (
+    x + y + z
+)
+
+lambda x, y, z: (
+    x + y + z
+    # trailing body
+)
+
+lambda x, y, z: (
+    x + y + z # trailing eol body
+)
+
+lambda x, y, z: (
+    x + y + z
+) # trailing lambda
+
+lambda x, y, z: (
+    # leading body
+    x + y + z
+)
+
+lambda x, y, z: ( # leading eol body
+    x + y + z
+)
+
+(
+    lambda name:
+    source_bucket # trailing eol comment
+    if name == source_bucket_name
+    else storage.Bucket(mock_service, destination_bucket_name)
+)
+
+(
+    lambda name:
+    # dangling header comment
+    source_bucket
+    if name == source_bucket_name
+    else storage.Bucket(mock_service, destination_bucket_name)
+)
+
+x = (
+    lambda name:
+    # dangling header comment
+    source_bucket
+    if name == source_bucket_name
+    else storage.Bucket(mock_service, destination_bucket_name)
+)
+
+(
+    lambda name: # dangling header comment
+    (
+        source_bucket
+        if name == source_bucket_name
+        else storage.Bucket(mock_service, destination_bucket_name)
+    )
+)
+
+(
+    lambda from_ts, to_ts, _chain_id=chain_id: # dangling eol header comment
+    db_evmtx.count_transactions_in_range(
+        chain_id=_chain_id,
+        from_ts=from_ts,
+        to_ts=to_ts,
+    )
+)
+
+(
+    lambda from_ts, to_ts, _chain_id=chain_id:
+    # dangling header comment before call
+    db_evmtx.count_transactions_in_range(
+        chain_id=_chain_id,
+        from_ts=from_ts,
+        to_ts=to_ts,
+    )
+)
+
+(
+    lambda left, right:
+    # comment
+    ibis.timestamp("2017-04-01").cast(dt.date).between(left, right)
+)
+
+(
+    lambda left, right:
+    ibis.timestamp("2017-04-01") # comment
+    .cast(dt.date)
+    .between(left, right)
+)
+
+(
+    lambda xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy:
+    # comment
+    [xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz]
+)
+
+(
+    lambda x, y:
+    # comment
+    {
+        "key": x,
+        "another": y,
+    }
+)
+
+(
+    lambda x, y:
+    # comment
+    (
+        x,
+        y,
+        z
+    )
+)
+
+(
+    lambda x:
+    # comment
+    dict_with_default[np.nan if isinstance(x, float) and np.isnan(x) else x]
+)
+
+(
+    lambda from_ts, to_ts, _chain_id=chain_id:
+    db_evmtx.count_transactions_in_range[
+        # comment
+        _chain_id, from_ts, to_ts
+    ]
+)
+
+(
+    lambda
+    # comment
+    *args, **kwargs:
+    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs) + 1
+)
+
+(
+    lambda # comment
+    *args, **kwargs:
+    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs) + 1
+)
+
+(
+    lambda # comment 1
+    # comment 2
+    *args, **kwargs: # comment 3
+    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs) + 1
+)
+
+(
+    lambda # comment 1
+    *args, **kwargs: # comment 3
+    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs) + 1
+)
+
+(
+    lambda *args, **kwargs:
+    # comment 1
+    ( # comment 2
+        # comment 3
+        aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs) + 1 # comment 4
+        # comment 5
+    ) # comment 6
+)
+
+(
+    lambda *brgs, **kwargs:
+    # comment 1
+    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa( # comment 2
+        # comment 3
+        *brgs, **kwargs) + 1 # comment 4
+    # comment 5
+)
+
+(
+    lambda *crgs, **kwargs: # comment 1
+    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*crgs, **kwargs) + 1
+)
+
+(
+    lambda *drgs, **kwargs: # comment 1
+    (
+        aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*drgs, **kwargs) + 1
+    )
+)
+
+(
+    lambda * # comment 1
+    ergs, **
+    # comment 2
+    kwargs # comment 3
+    : # comment 4
+    (
+        aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*ergs, **kwargs) + 1
+    )
+)
+
+(
+    lambda # 1
+    # 2
+    left, # 3
+    # 4
+    right: # 5
+    # 6
+    ibis.timestamp("2017-04-01").cast(dt.date).between(left, right)
+)
+
+(
+    lambda x: # outer comment 1
+    (
+        lambda y: # inner comment 1
+        # inner comment 2
+        lambda z: (
+            # innermost comment
+            x + y + z
+        )
+    )
+)
+
+foo(
+    lambda from_ts, # comment prevents collapsing the parameters to one line
+    to_ts, _chain_id=chain_id: db_evmtx.count_transactions_in_range(
+        chain_id=_chain_id,
+        from_ts=from_ts,
+        to_ts=to_ts,
+    )
+)
+
+foo(
+    lambda from_ts, # but still wrap the body if it gets too long
+    to_ts,
+    _chain_id=chain_id: db_evmtx.count_transactions_in_rangeeeeeeeeeeeeeeeeeeeeeeeeeeeee(
+        chain_id=_chain_id,
+        from_ts=from_ts,
+        to_ts=to_ts,
+    )
+)
+
+transform = lambda left, right: ibis.timestamp("2017-04-01").cast(dt.date).between(left, right).between(left, right) # trailing comment
+
+(
+    lambda: # comment
+    1
+)
+
+(
+    lambda # comment
+    :
+    1
+)
+
+(
+    lambda:
+    # comment
+    1
+)
+
+(
+    lambda: # comment 1
+    # comment 2
+    1
+)
+
+(
+    lambda # comment 1
+    # comment 2
+    : # comment 3
+    # comment 4
+    1
+)
+
 (
     lambda
     * # comment 2
@@ -271,3 +789,18 @@ def a():
     x:
     x
 )
+
+(
+    lambda: # dangling-end-of-line
+    # dangling-own-line
+    ( # leading-body-end-of-line
+        x
+    )
+)
+
+(
+    lambda: # dangling-end-of-line
+    ( # leading-body-end-of-line
+        x
+    )
+)
```
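The fixture cases above exercise the preview style that parenthesizes a lambda body when it expands across lines. Wrapping a lambda body in parentheses is semantics-preserving, which a small Python check illustrates (illustrative only, not part of ruff's test suite):

```python
import ast

flat = "lambda x, y, z: x + y + z"
wrapped = "(\n    lambda x, y, z: (\n        x + y + z\n    )\n)"

# Parentheses leave no trace in the AST, so both sources parse to the same
# expression tree.
assert ast.dump(ast.parse(flat, mode="eval")) == ast.dump(ast.parse(wrapped, mode="eval"))

# And both evaluate identically.
assert eval(flat)(1, 2, 3) == eval(wrapped)(1, 2, 3) == 6
```

This invariance is what lets the formatter freely add or remove body parentheses purely for layout.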
Formatter builders:

```diff
@@ -1,4 +1,4 @@
-use ruff_formatter::{Argument, Arguments, write};
+use ruff_formatter::{Argument, Arguments, format_args, write};
 use ruff_text_size::{Ranged, TextRange, TextSize};
 
 use crate::context::{NodeLevel, WithNodeLevel};
```
```diff
@@ -33,20 +33,27 @@ impl<'ast> Format<PyFormatContext<'ast>> for ParenthesizeIfExpands<'_, 'ast> {
     {
         let mut f = WithNodeLevel::new(NodeLevel::ParenthesizedExpression, f);
 
-        write!(
-            f,
-            [group(&format_with(|f| {
-                if_group_breaks(&token("(")).fmt(f)?;
-
-                if self.indent {
-                    soft_block_indent(&Arguments::from(&self.inner)).fmt(f)?;
-                } else {
-                    Arguments::from(&self.inner).fmt(f)?;
-                }
-
-                if_group_breaks(&token(")")).fmt(f)
-            }))]
-        )
+        if self.indent {
+            let parens_id = f.group_id("indented_parenthesize_if_expands");
+            group(&format_args![
+                if_group_breaks(&token("(")),
+                indent_if_group_breaks(
+                    &format_args![soft_line_break(), &Arguments::from(&self.inner)],
+                    parens_id
+                ),
+                soft_line_break(),
+                if_group_breaks(&token(")"))
+            ])
+            .with_id(Some(parens_id))
+            .fmt(&mut f)
+        } else {
+            group(&format_args![
+                if_group_breaks(&token("(")),
+                Arguments::from(&self.inner),
+                if_group_breaks(&token(")")),
+            ])
+            .fmt(&mut f)
+        }
     }
 }
```
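The `ParenthesizeIfExpands` builder above emits parentheses only when its content actually breaks across lines. Reduced to a plain width check, the idea can be sketched in Python (a hypothetical helper, far simpler than the real IR-based implementation):

```python
def parenthesize_if_expands(content: str, limit: int = 88, indent: str = "    ") -> str:
    """Wrap `content` in parentheses with an indented body only when it would
    exceed the line limit; otherwise emit it unchanged (flat)."""
    if len(content) <= limit:
        return content
    return "(\n" + indent + content + "\n)"


# Fits: emitted flat, no parentheses added.
assert parenthesize_if_expands("x + y", limit=10) == "x + y"

# Too long: parenthesized with an indented body.
assert parenthesize_if_expands("x + y + z", limit=5) == "(\n    x + y + z\n)"
```

In the real formatter the decision is made by the group/`if_group_breaks` machinery rather than a character count, but the observable behavior is the same: parentheses appear only in the expanded layout.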
Lambda expression formatter:

```diff
@@ -1,15 +1,21 @@
-use ruff_formatter::write;
-use ruff_python_ast::AnyNodeRef;
-use ruff_python_ast::ExprLambda;
+use ruff_formatter::{FormatRuleWithOptions, RemoveSoftLinesBuffer, format_args, write};
+use ruff_python_ast::{AnyNodeRef, Expr, ExprLambda};
 use ruff_text_size::Ranged;
 
-use crate::comments::dangling_comments;
-use crate::expression::parentheses::{NeedsParentheses, OptionalParentheses};
+use crate::builders::parenthesize_if_expands;
+use crate::comments::{SourceComment, dangling_comments, leading_comments, trailing_comments};
+use crate::expression::parentheses::{
+    NeedsParentheses, OptionalParentheses, Parentheses, is_expression_parenthesized,
+};
+use crate::expression::{CallChainLayout, has_own_parentheses};
 use crate::other::parameters::ParametersParentheses;
 use crate::prelude::*;
+use crate::preview::is_parenthesize_lambda_bodies_enabled;
 
 #[derive(Default)]
-pub struct FormatExprLambda;
+pub struct FormatExprLambda {
+    layout: ExprLambdaLayout,
+}
 
 impl FormatNodeRule<ExprLambda> for FormatExprLambda {
     fn fmt_fields(&self, item: &ExprLambda, f: &mut PyFormatter) -> FormatResult<()> {
```
```diff
@@ -20,13 +26,19 @@ impl FormatNodeRule<ExprLambda> for FormatExprLambda {
             body,
         } = item;
 
+        let body = &**body;
+        let parameters = parameters.as_deref();
+
         let comments = f.context().comments().clone();
         let dangling = comments.dangling(item);
+        let preview = is_parenthesize_lambda_bodies_enabled(f.context());
 
         write!(f, [token("lambda")])?;
 
-        if let Some(parameters) = parameters {
-            // In this context, a dangling comment can either be a comment between the `lambda` the
+        // Format any dangling comments before the parameters, but save any dangling comments after
+        // the parameters/after the header to be formatted with the body below.
+        let dangling_header_comments = if let Some(parameters) = parameters {
+            // In this context, a dangling comment can either be a comment between the `lambda` and the
             // parameters, or a comment between the parameters and the body.
             let (dangling_before_parameters, dangling_after_parameters) = dangling
                 .split_at(dangling.partition_point(|comment| comment.end() < parameters.start()));
```
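The `partition_point` call above splits the position-sorted dangling comments into those that end before the parameters and the rest. The same split can be written in Python (the `Comment` shape here is hypothetical, invented for illustration):

```python
from dataclasses import dataclass


@dataclass
class Comment:
    start: int
    end: int
    text: str


def split_dangling(comments: list[Comment], parameters_start: int):
    """Partition position-sorted comments at the first one that does not end
    before the parameters, mirroring `dangling.split_at(partition_point(...))`."""
    i = 0
    while i < len(comments) and comments[i].end < parameters_start:
        i += 1
    return comments[:i], comments[i:]


comments = [Comment(7, 16, "# before"), Comment(30, 38, "# after")]
before, after = split_dangling(comments, parameters_start=20)
assert [c.text for c in before] == ["# before"]
assert [c.text for c in after] == ["# after"]
```

Because the comment list is sorted by position, a single linear (or binary) scan suffices; no comment needs to be re-examined.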
```diff
@@ -86,7 +98,7 @@ impl FormatNodeRule<ExprLambda> for FormatExprLambda {
             // *x: x
             // )
             // ```
-            if comments.has_leading(&**parameters) {
+            if comments.has_leading(parameters) {
                 hard_line_break().fmt(f)?;
             } else {
                 write!(f, [space()])?;
```
```diff
@@ -95,32 +107,90 @@ impl FormatNodeRule<ExprLambda> for FormatExprLambda {
                 write!(f, [dangling_comments(dangling_before_parameters)])?;
             }
 
-            write!(
-                f,
-                [parameters
-                    .format()
-                    .with_options(ParametersParentheses::Never)]
-            )?;
-
-            write!(f, [token(":")])?;
-
-            if dangling_after_parameters.is_empty() {
-                write!(f, [space()])?;
-            } else {
-                write!(f, [dangling_comments(dangling_after_parameters)])?;
-            }
-        } else {
-            write!(f, [token(":")])?;
-
-            // In this context, a dangling comment is a comment between the `lambda` and the body.
-            if dangling.is_empty() {
-                write!(f, [space()])?;
-            } else {
-                write!(f, [dangling_comments(dangling)])?;
-            }
-        }
+            // Try to keep the parameters on a single line, unless there are intervening comments.
+            if preview && !comments.contains_comments(parameters.into()) {
+                let mut buffer = RemoveSoftLinesBuffer::new(f);
+                write!(
+                    buffer,
+                    [parameters
+                        .format()
+                        .with_options(ParametersParentheses::Never)]
+                )?;
+            } else {
+                write!(
+                    f,
+                    [parameters
+                        .format()
+                        .with_options(ParametersParentheses::Never)]
+                )?;
+            }
+
+            dangling_after_parameters
+        } else {
+            dangling
+        };
+
+        write!(f, [token(":")])?;
+
+        if dangling_header_comments.is_empty() {
+            write!(f, [space()])?;
+        } else if !preview {
+            write!(f, [dangling_comments(dangling_header_comments)])?;
+        }
 
-        write!(f, [body.format()])
+        if !preview {
+            return body.format().fmt(f);
+        }
+
+        let fmt_body = FormatBody {
+            body,
+            dangling_header_comments,
+        };
+
+        match self.layout {
+            ExprLambdaLayout::Assignment => fits_expanded(&fmt_body).fmt(f),
+            ExprLambdaLayout::Default => fmt_body.fmt(f),
+        }
     }
 }
+
+#[derive(Debug, Default, Copy, Clone)]
+pub enum ExprLambdaLayout {
+    #[default]
+    Default,
+
+    /// The [`ExprLambda`] is the direct child of an assignment expression, so it needs to use
+    /// `fits_expanded` to prefer parenthesizing its own body before the assignment tries to
+    /// parenthesize the whole lambda. For example, we want this formatting:
+    ///
+    /// ```py
+    /// long_assignment_target = lambda x, y, z: (
+    ///     x + y + z
+    /// )
+    /// ```
+    ///
+    /// instead of either of these:
+    ///
+    /// ```py
+    /// long_assignment_target = (
+    ///     lambda x, y, z: (
+    ///         x + y + z
+    ///     )
+    /// )
+    ///
+    /// long_assignment_target = (
+    ///     lambda x, y, z: x + y + z
+    /// )
+    /// ```
+    Assignment,
+}
+
+impl FormatRuleWithOptions<ExprLambda, PyFormatContext<'_>> for FormatExprLambda {
+    type Options = ExprLambdaLayout;
+
+    fn with_options(mut self, options: Self::Options) -> Self {
+        self.layout = options;
+        self
+    }
+}
```
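The body-formatting code that follows splits the remaining header comments at the first own-line comment: end-of-line comments stay with the lambda header, and everything from the first own-line comment onward leads the parenthesized body. That split, `position(|c| c.line_position().is_own_line()).unwrap_or(len)`, can be sketched in Python (the comment shape is hypothetical):

```python
def split_header_comments(comments: list[dict]):
    """End-of-line comments stay with the lambda header; everything from the
    first own-line comment onward leads the parenthesized body."""
    split = next(
        (i for i, c in enumerate(comments) if c["own_line"]),
        len(comments),  # no own-line comment: everything trails the header
    )
    return comments[:split], comments[split:]


comments = [
    {"text": "# 1", "own_line": False},
    {"text": "# 2", "own_line": True},
    {"text": "# 3", "own_line": False},
]
trailing, leading = split_header_comments(comments)
assert [c["text"] for c in trailing] == ["# 1"]
assert [c["text"] for c in leading] == ["# 2", "# 3"]
```

Note the split is by line position, not by source range: once a comment sits on its own line, every later comment moves into the body with it, which matches the behavior described in the Rust comments below.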
```diff
@@ -137,3 +207,266 @@ impl NeedsParentheses for ExprLambda {
         }
     }
 }
+
+struct FormatBody<'a> {
+    body: &'a Expr,
+
+    /// Dangling comments attached to the lambda header that should be formatted with the body.
+    ///
+    /// These can include both own-line and end-of-line comments. For lambdas with parameters, this
+    /// means comments after the parameters:
+    ///
+    /// ```py
+    /// (
+    ///     lambda x, y # 1
+    ///     # 2
+    ///     : # 3
+    ///     # 4
+    ///     x + y
+    /// )
+    /// ```
+    ///
+    /// Or all dangling comments for lambdas without parameters:
+    ///
+    /// ```py
+    /// (
+    ///     lambda # 1
+    ///     # 2
+    ///     : # 3
+    ///     # 4
+    ///     1
+    /// )
+    /// ```
+    ///
+    /// In most cases these should be formatted within the parenthesized body, as in:
+    ///
+    /// ```py
+    /// (
+    ///     lambda: ( # 1
+    ///         # 2
+    ///         # 3
+    ///         # 4
+    ///         1
+    ///     )
+    /// )
+    /// ```
+    ///
+    /// or without `# 2`:
+    ///
+    /// ```py
+    /// (
+    ///     lambda: ( # 1 # 3
+    ///         # 4
+    ///         1
+    ///     )
+    /// )
+    /// ```
+    dangling_header_comments: &'a [SourceComment],
+}
+
+impl Format<PyFormatContext<'_>> for FormatBody<'_> {
+    fn fmt(&self, f: &mut PyFormatter) -> FormatResult<()> {
+        let FormatBody {
+            dangling_header_comments,
+            body,
+        } = self;
+
+        let body = *body;
+        let comments = f.context().comments().clone();
+        let body_comments = comments.leading_dangling_trailing(body);
+
+        if !dangling_header_comments.is_empty() {
+            // Split the dangling header comments into trailing comments formatted with the lambda
+            // header (1) and leading comments formatted with the body (2, 3, 4).
+            //
+            // ```python
+            // (
+            //     lambda # 1
+            //     # 2
+            //     : # 3
+            //     # 4
+            //     y
+            // )
+            // ```
+            //
+            // Note that these are split based on their line position rather than using
+            // `partition_point` based on a range, for example.
+            let (trailing_header_comments, leading_body_comments) = dangling_header_comments
+                .split_at(
+                    dangling_header_comments
+                        .iter()
+                        .position(|comment| comment.line_position().is_own_line())
+                        .unwrap_or(dangling_header_comments.len()),
+                );
+
+            // If the body is parenthesized and has its own leading comments, preserve the
+            // separation between the dangling lambda comments and the body comments. For
+            // example, preserve this comment positioning:
+            //
+            // ```python
+            // (
+            //     lambda: # 1
+            //     # 2
+            //     ( # 3
+            //         x
+            //     )
+            // )
+            // ```
+            //
+            // 1 and 2 are dangling on the lambda and emitted first, followed by a hard line
+            // break and the parenthesized body with its leading comments.
+            //
+            // However, when removing 2, 1 and 3 can instead be formatted on the same line:
+            //
+            // ```python
+            // (
+            //     lambda: ( # 1 # 3
+            //         x
+            //     )
+            // )
+            // ```
+            let comments = f.context().comments();
+            if is_expression_parenthesized(body.into(), comments.ranges(), f.context().source())
+                && comments.has_leading(body)
+            {
+                trailing_comments(dangling_header_comments).fmt(f)?;
+
+                // Note that `leading_body_comments` have already been formatted as part of
+                // `dangling_header_comments` above, but their presence still determines the spacing
+                // here.
+                if leading_body_comments.is_empty() {
+                    space().fmt(f)?;
+                } else {
+                    hard_line_break().fmt(f)?;
+                }
+
+                body.format().with_options(Parentheses::Always).fmt(f)
+            } else {
+                write!(
+                    f,
+                    [
+                        space(),
+                        token("("),
+                        trailing_comments(trailing_header_comments),
+                        block_indent(&format_args!(
+                            leading_comments(leading_body_comments),
+                            body.format().with_options(Parentheses::Never)
+                        )),
+                        token(")")
+                    ]
+                )
+            }
+        }
+        // If the body has comments, we always want to preserve the parentheses. This also
+        // ensures that we correctly handle parenthesized comments, and don't need to worry
+        // about them in the implementation below.
+        else if body_comments.has_leading() || body_comments.has_trailing_own_line() {
+            body.format().with_options(Parentheses::Always).fmt(f)
+        }
+        // Calls and subscripts require special formatting because they have their own
+        // parentheses, but they can also have an arbitrary amount of text before the
+        // opening parenthesis. We want to avoid cases where we keep a long callable on the
+        // same line as the lambda parameters. For example, `db_evmtx...` in:
+        //
+        // ```py
+        // transaction_count = self._query_txs_for_range(
+        //     get_count_fn=lambda from_ts, to_ts, _chain_id=chain_id: db_evmtx.count_transactions_in_range(
+        //         chain_id=_chain_id,
+        //         from_ts=from_ts,
+        //         to_ts=to_ts,
+        //     ),
+        // )
+        // ```
+        //
+        // should cause the whole lambda body to be parenthesized instead:
+        //
+        // ```py
+        // transaction_count = self._query_txs_for_range(
+        //     get_count_fn=lambda from_ts, to_ts, _chain_id=chain_id: (
+        //         db_evmtx.count_transactions_in_range(
+        //             chain_id=_chain_id,
+        //             from_ts=from_ts,
+        //             to_ts=to_ts,
+        //         )
+        //     ),
+        // )
+        // ```
+        else if matches!(body, Expr::Call(_) | Expr::Subscript(_)) {
+            let unparenthesized = body.format().with_options(Parentheses::Never);
+            if CallChainLayout::from_expression(
+                body.into(),
+                comments.ranges(),
+                f.context().source(),
+            ) == CallChainLayout::Fluent
+            {
+                parenthesize_if_expands(&unparenthesized).fmt(f)
+            } else {
+                let unparenthesized = unparenthesized.memoized();
+                if unparenthesized.inspect(f)?.will_break() {
+                    expand_parent().fmt(f)?;
+                }
+
+                best_fitting![
+                    // body all flat
+                    unparenthesized,
+                    // body expanded
+                    group(&unparenthesized).should_expand(true),
+                    // parenthesized
+                    format_args![token("("), block_indent(&unparenthesized), token(")")]
+                ]
+                .fmt(f)
+            }
+        }
+        // For other cases with their own parentheses, such as lists, sets, dicts, tuples,
+        // etc., we can just format the body directly. Their own formatting results in the
+        // lambda being formatted well too. For example:
+        //
+        // ```py
+        // lambda xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz: [xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz]
+        // ```
+        //
+        // gets formatted as:
+        //
+        // ```py
+        // lambda xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz: [
+        //     xxxxxxxxxxxxxxxxxxxx,
+        //     yyyyyyyyyyyyyyyyyyyy,
+        //     zzzzzzzzzzzzzzzzzzzz
+        // ]
+        // ```
+        else if has_own_parentheses(body, f.context()).is_some() {
+            body.format().fmt(f)
+        }
+        // Finally, for expressions without their own parentheses, use
+        // `parenthesize_if_expands` to add parentheses around the body, only if it expands
+        // across multiple lines. The `Parentheses::Never` here also removes unnecessary
+        // parentheses around lambda bodies that fit on one line. For example:
+        //
+        // ```py
+        // lambda xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz: xxxxxxxxxxxxxxxxxxxx + yyyyyyyyyyyyyyyyyyyy + zzzzzzzzzzzzzzzzzzzz
+        // ```
+        //
+        // is formatted as:
+        //
+        // ```py
+        // lambda xxxxxxxxxxxxxxxxxxxx, yyyyyyyyyyyyyyyyyyyy, zzzzzzzzzzzzzzzzzzzz: (
+        //     xxxxxxxxxxxxxxxxxxxx + yyyyyyyyyyyyyyyyyyyy + zzzzzzzzzzzzzzzzzzzz
```
|
||||
// )
|
||||
// ```
|
||||
//
|
||||
// while
|
||||
//
|
||||
// ```py
|
||||
// lambda xxxxxxxxxxxxxxxxxxxx: (xxxxxxxxxxxxxxxxxxxx + 1)
|
||||
// ```
|
||||
//
|
||||
// is formatted as:
|
||||
//
|
||||
// ```py
|
||||
// lambda xxxxxxxxxxxxxxxxxxxx: xxxxxxxxxxxxxxxxxxxx + 1
|
||||
// ```
|
||||
else {
|
||||
parenthesize_if_expands(&body.format().with_options(Parentheses::Never)).fmt(f)
|
||||
}
|
||||
}
|
||||
}
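The if/else chain above can be summarized as a small decision table. The sketch below is ours, with hypothetical names (it is not ruff's actual API); each branch mirrors one arm of the Rust chain.

```python
# Sketch of the lambda-body layout decision chain (hypothetical names).
def lambda_body_layout(body_kind, has_body_comments, fits_on_line):
    if has_body_comments:
        # Comments force the parentheses to be preserved.
        return "always-parenthesize"
    if body_kind in ("call", "subscript"):
        # Calls/subscripts pick the best-fitting of the flat, expanded,
        # and fully parenthesized layouts.
        return "best-fitting"
    if body_kind in ("list", "set", "dict", "tuple"):
        # Bracketed expressions format themselves.
        return "format-directly"
    # Everything else is parenthesized only if it expands.
    return "flat" if fits_on_line else "parenthesize-if-expands"
```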
@@ -52,3 +52,10 @@ pub(crate) const fn is_avoid_parens_for_long_as_captures_enabled(
) -> bool {
    context.is_preview()
}

/// Returns `true` if the
/// [`parenthesize_lambda_bodies`](https://github.com/astral-sh/ruff/pull/21385) preview style is
/// enabled.
pub(crate) const fn is_parenthesize_lambda_bodies_enabled(context: &PyFormatContext) -> bool {
    context.is_preview()
}
@@ -9,6 +9,7 @@ use crate::comments::{
    Comments, LeadingDanglingTrailingComments, SourceComment, trailing_comments,
};
use crate::context::{NodeLevel, WithNodeLevel};
use crate::expression::expr_lambda::ExprLambdaLayout;
use crate::expression::parentheses::{
    NeedsParentheses, OptionalParentheses, Parentheses, Parenthesize, is_expression_parenthesized,
    optional_parentheses,

@@ -18,6 +19,7 @@ use crate::expression::{
    maybe_parenthesize_expression,
};
use crate::other::interpolated_string::InterpolatedStringLayout;
use crate::preview::is_parenthesize_lambda_bodies_enabled;
use crate::statement::trailing_semicolon;
use crate::string::StringLikeExtensions;
use crate::string::implicit::{
|
||||
|
|
@@ -303,12 +305,7 @@ impl Format<PyFormatContext<'_>> for FormatStatementsLastExpression<'_> {
            && format_implicit_flat.is_none()
            && format_interpolated_string.is_none()
        {
-            return maybe_parenthesize_expression(
-                value,
-                *statement,
-                Parenthesize::IfBreaks,
-            )
-            .fmt(f);
+            return maybe_parenthesize_value(value, *statement).fmt(f);
        }

        let comments = f.context().comments().clone();
|
||||
|
|
@@ -586,11 +583,7 @@ impl Format<PyFormatContext<'_>> for FormatStatementsLastExpression<'_> {
                space(),
                operator,
                space(),
-                maybe_parenthesize_expression(
-                    value,
-                    *statement,
-                    Parenthesize::IfBreaks
-                )
+                maybe_parenthesize_value(value, *statement)
            ]
        );
    }
|
||||
|
|
@@ -1369,3 +1362,32 @@ fn is_attribute_with_parenthesized_value(target: &Expr, context: &PyFormatContex
        _ => false,
    }
}

+/// Like [`maybe_parenthesize_expression`] but with special handling for lambdas in preview.
+fn maybe_parenthesize_value<'a>(
+    expression: &'a Expr,
+    parent: AnyNodeRef<'a>,
+) -> MaybeParenthesizeValue<'a> {
+    MaybeParenthesizeValue { expression, parent }
+}
+
+struct MaybeParenthesizeValue<'a> {
+    expression: &'a Expr,
+    parent: AnyNodeRef<'a>,
+}
+
+impl Format<PyFormatContext<'_>> for MaybeParenthesizeValue<'_> {
+    fn fmt(&self, f: &mut PyFormatter) -> FormatResult<()> {
+        let MaybeParenthesizeValue { expression, parent } = self;
+
+        if is_parenthesize_lambda_bodies_enabled(f.context())
+            && let Expr::Lambda(lambda) = expression
+            && !f.context().comments().has_leading(lambda)
+        {
+            parenthesize_if_expands(&lambda.format().with_options(ExprLambdaLayout::Assignment))
+                .fmt(f)
+        } else {
+            maybe_parenthesize_expression(expression, *parent, Parenthesize::IfBreaks).fmt(f)
+        }
+    }
+}
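An illustration of what this change means for users (ours, not taken from the diff): the stable style wraps the whole lambda in parentheses, while the preview style parenthesizes only the body. Both spellings define the same function; only the formatter's layout differs.

```python
# Stable layout: the entire lambda is parenthesized.
msg_stable = (
    lambda x: f"this is a very very very very long lambda value {x} that doesn't fit on a single line"
)

# Preview layout: the lambda header stays on one line, the body is parenthesized.
msg_preview = lambda x: (
    f"this is a very very very very long lambda value {x} that doesn't fit on a single line"
)

assert msg_stable(1) == msg_preview(1)
```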
|
||||
|
|
|
|||
|
|
@@ -906,11 +906,10 @@ x = {
-)
+string_with_escaped_nameescape = "........................................................................... \\N{LAO KO LA}"

-msg = lambda x: (
msg = lambda x: (
- f"this is a very very very very long lambda value {x} that doesn't fit on a"
- " single line"
+msg = (
+ lambda x: f"this is a very very very very long lambda value {x} that doesn't fit on a single line"
+ f"this is a very very very very long lambda value {x} that doesn't fit on a single line"
)

dict_with_lambda_values = {
|
||||
|
|
@@ -1403,8 +1402,8 @@ string_with_escaped_nameescape = "...

string_with_escaped_nameescape = "........................................................................... \\N{LAO KO LA}"

msg = (
    lambda x: f"this is a very very very very long lambda value {x} that doesn't fit on a single line"
msg = lambda x: (
    f"this is a very very very very long lambda value {x} that doesn't fit on a single line"
)

dict_with_lambda_values = {
|
||||
|
|
|
|||
|
|
@@ -375,7 +375,7 @@ a = b if """
# Another use case
data = yaml.load("""\
a: 1
-@@ -77,19 +106,23 @@
+@@ -77,10 +106,12 @@
b: 2
""",
)

@@ -390,19 +390,7 @@ a = b if """

MULTILINE = """
foo
""".replace("\n", "")
-generated_readme = lambda project_name: """
+generated_readme = (
+    lambda project_name: """
{}

<Add content here!>
""".strip().format(project_name)
+)
parser.usage += """
Custom extra help summary.

-@@ -156,16 +189,24 @@
+@@ -156,16 +187,24 @@
10 LOAD_CONST 0 (None)
12 RETURN_VALUE
""" % (_C.__init__.__code__.co_firstlineno + 1,)

@@ -433,7 +421,7 @@ a = b if """
[
"""cow
moos""",
-@@ -206,7 +247,9 @@
+@@ -206,7 +245,9 @@
"c"
)

@@ -444,7 +432,7 @@ a = b if """

assert some_var == expected_result, """
test
-@@ -224,10 +267,8 @@
+@@ -224,10 +265,8 @@
"""Sxxxxxxx xxxxxxxx, xxxxxxx xx xxxxxxxxx
xxxxxxxxxxxxx xxxxxxx xxxxxxxxx xxx-xxxxxxxxxx xxxxxx xx xxx-xxxxxx"""
),

@@ -457,7 +445,7 @@ a = b if """
},
}

-@@ -246,14 +287,12 @@
+@@ -246,14 +285,12 @@
a
a"""
),

@@ -597,13 +585,11 @@ data = yaml.load(
MULTILINE = """
foo
""".replace("\n", "")
-generated_readme = (
-    lambda project_name: """
+generated_readme = lambda project_name: """
{}

<Add content here!>
""".strip().format(project_name)
-)
parser.usage += """
Custom extra help summary.
File diff suppressed because it is too large

@@ -1,7 +1,6 @@
---
source: crates/ruff_python_formatter/tests/fixtures.rs
input_file: crates/ruff_python_formatter/resources/test/fixtures/ruff/multiline_string_deviations.py
-snapshot_kind: text
---
## Input
```python

@@ -106,3 +105,22 @@ generated_readme = (
""".strip().format(project_name)
)
```

## Preview changes
```diff
--- Stable
+++ Preview
@@ -44,10 +44,8 @@
 # this by changing `Lambda::needs_parentheses` to return `BestFit` but it causes
 # issues when the lambda has comments.
 # Let's keep this as a known deviation for now.
-generated_readme = (
-    lambda project_name: """
+generated_readme = lambda project_name: """
 {}

 <Add content here!>
 """.strip().format(project_name)
-)
```
@@ -1,6 +1,6 @@
[package]
name = "ruff_wasm"
-version = "0.14.8"
+version = "0.14.9"
publish = false
authors = { workspace = true }
edition = { workspace = true }
|
||||
|
|
|
|||
|
|
@@ -158,7 +158,7 @@ If left unspecified, ty will try to detect common project layouts and initialize
* if a `./<project-name>/<project-name>` directory exists, include `.` and `./<project-name>` in the first party search path
* otherwise, default to `.` (flat layout)

-Besides, if a `./python` or `./tests` directory exists and is not a package (i.e. it does not contain an `__init__.py` or `__init__.pyi` file),
+Additionally, if a `./python` directory exists and is not a package (i.e. it does not contain an `__init__.py` or `__init__.pyi` file),
it will also be included in the first party search path.

**Default value**: `null`

@@ -443,7 +443,7 @@ If left unspecified, ty will try to detect common project layouts and initialize
* if a `./<project-name>/<project-name>` directory exists, include `.` and `./<project-name>` in the first party search path
* otherwise, default to `.` (flat layout)

-Besides, if a `./tests` directory exists and is not a package (i.e. it does not contain an `__init__.py` file),
+Additionally, if a `./python` directory exists and is not a package (i.e. it does not contain an `__init__.py` file),
it will also be included in the first party search path.

**Default value**: `null`
|
||||
|
|
|
|||
|
|
@@ -2390,14 +2390,14 @@ fn default_root_flat_layout() -> anyhow::Result<()> {
fn default_root_tests_folder() -> anyhow::Result<()> {
    let case = CliTest::with_files([
        ("src/foo.py", "foo = 10"),
-        ("tests/bar.py", "bar = 20"),
+        ("tests/bar.py", "baz = 20"),
        (
            "tests/test_bar.py",
            r#"
from foo import foo
-from bar import bar
+from bar import baz

-print(f"{foo} {bar}")
+print(f"{foo} {baz}")
"#,
        ),
    ])?;
|
||||
|
|
|
|||
|
|
@@ -3624,6 +3624,37 @@ def function():
    assert_snapshot!(test.hover(), @"Hover provided no content");
}

#[test]
fn hover_named_expression_target() {
    let test = CursorTest::builder()
        .source(
            "mymod.py",
            r#"
if a<CURSOR> := 10:
    pass
"#,
        )
        .build();

    assert_snapshot!(test.hover(), @r###"
    Literal[10]
    ---------------------------------------------
    ```python
    Literal[10]
    ```
    ---------------------------------------------
    info[hover]: Hovered content is
     --> mymod.py:2:4
      |
    2 | if a := 10:
      |    ^- Cursor offset
      |    |
      |    source
    3 |     pass
      |
    "###);
}

impl CursorTest {
    fn hover(&self) -> String {
        use std::fmt::Write;
|
||||
|
|
|
|||
|
|
@ -6017,9 +6017,9 @@ mod tests {
|
|||
fn test_function_signature_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
def foo(x: int, *y: bool, z: str | int | list[str]): ...
|
||||
def foo(x: int, *y: bool, z: str | int | list[str]): ...
|
||||
|
||||
a = foo",
|
||||
a = foo",
|
||||
);
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r#"
|
||||
|
|
@ -6158,18 +6158,35 @@ mod tests {
|
|||
fn test_module_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
import foo
|
||||
import foo
|
||||
|
||||
a = foo",
|
||||
a = foo",
|
||||
);
|
||||
|
||||
test.with_extra_file("foo.py", "'''Foo module'''");
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r"
|
||||
assert_snapshot!(test.inlay_hints(), @r#"
|
||||
import foo
|
||||
|
||||
a[: <module 'foo'>] = foo
|
||||
---------------------------------------------
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/types.pyi:423:7
|
||||
|
|
||||
422 | @disjoint_base
|
||||
423 | class ModuleType:
|
||||
| ^^^^^^^^^^
|
||||
424 | """Create a module object.
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:6
|
||||
|
|
||||
2 | import foo
|
||||
3 |
|
||||
4 | a[: <module 'foo'>] = foo
|
||||
| ^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> foo.py:1:1
|
||||
|
|
||||
|
|
@ -6177,32 +6194,531 @@ mod tests {
|
|||
| ^^^^^^^^^^^^^^^^
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:5
|
||||
--> main2.py:4:14
|
||||
|
|
||||
2 | import foo
|
||||
3 |
|
||||
4 | a[: <module 'foo'>] = foo
|
||||
| ^^^^^^^^^^^^^^
|
||||
| ^^^
|
||||
|
|
||||
");
|
||||
"#);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_literal_type_alias_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
from typing import Literal
|
||||
from typing import Literal
|
||||
|
||||
a = Literal['a', 'b', 'c']",
|
||||
a = Literal['a', 'b', 'c']",
|
||||
);
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r#"
|
||||
from typing import Literal
|
||||
|
||||
a[: <special form 'Literal["a", "b", "c"]'>] = Literal['a', 'b', 'c']
|
||||
---------------------------------------------
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/typing.pyi:351:1
|
||||
|
|
||||
349 | Final: _SpecialForm
|
||||
350 |
|
||||
351 | Literal: _SpecialForm
|
||||
| ^^^^^^^
|
||||
352 | TypedDict: _SpecialForm
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:20
|
||||
|
|
||||
2 | from typing import Literal
|
||||
3 |
|
||||
4 | a[: <special form 'Literal["a", "b", "c"]'>] = Literal['a', 'b', 'c']
|
||||
| ^^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/builtins.pyi:915:7
|
||||
|
|
||||
914 | @disjoint_base
|
||||
915 | class str(Sequence[str]):
|
||||
| ^^^
|
||||
916 | """str(object='') -> str
|
||||
917 | str(bytes_or_buffer[, encoding[, errors]]) -> str
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:28
|
||||
|
|
||||
2 | from typing import Literal
|
||||
3 |
|
||||
4 | a[: <special form 'Literal["a", "b", "c"]'>] = Literal['a', 'b', 'c']
|
||||
| ^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/builtins.pyi:915:7
|
||||
|
|
||||
914 | @disjoint_base
|
||||
915 | class str(Sequence[str]):
|
||||
| ^^^
|
||||
916 | """str(object='') -> str
|
||||
917 | str(bytes_or_buffer[, encoding[, errors]]) -> str
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:33
|
||||
|
|
||||
2 | from typing import Literal
|
||||
3 |
|
||||
4 | a[: <special form 'Literal["a", "b", "c"]'>] = Literal['a', 'b', 'c']
|
||||
| ^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/builtins.pyi:915:7
|
||||
|
|
||||
914 | @disjoint_base
|
||||
915 | class str(Sequence[str]):
|
||||
| ^^^
|
||||
916 | """str(object='') -> str
|
||||
917 | str(bytes_or_buffer[, encoding[, errors]]) -> str
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:38
|
||||
|
|
||||
2 | from typing import Literal
|
||||
3 |
|
||||
4 | a[: <special form 'Literal["a", "b", "c"]'>] = Literal['a', 'b', 'c']
|
||||
| ^^^
|
||||
|
|
||||
"#);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_wrapper_descriptor_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
from types import FunctionType
|
||||
|
||||
a = FunctionType.__get__",
|
||||
);
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r#"
|
||||
from types import FunctionType
|
||||
|
||||
a[: <wrapper-descriptor '__get__' of 'function' objects>] = FunctionType.__get__
|
||||
---------------------------------------------
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/types.pyi:670:7
|
||||
|
|
||||
669 | @final
|
||||
670 | class WrapperDescriptorType:
|
||||
| ^^^^^^^^^^^^^^^^^^^^^
|
||||
671 | @property
|
||||
672 | def __name__(self) -> str: ...
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:6
|
||||
|
|
||||
2 | from types import FunctionType
|
||||
3 |
|
||||
4 | a[: <wrapper-descriptor '__get__' of 'function' objects>] = FunctionType.__get__
|
||||
| ^^^^^^^^^^^^^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/types.pyi:77:7
|
||||
|
|
||||
75 | # Make sure this class definition stays roughly in line with `builtins.function`
|
||||
76 | @final
|
||||
77 | class FunctionType:
|
||||
| ^^^^^^^^^^^^
|
||||
78 | """Create a function object.
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:39
|
||||
|
|
||||
2 | from types import FunctionType
|
||||
3 |
|
||||
4 | a[: <wrapper-descriptor '__get__' of 'function' objects>] = FunctionType.__get__
|
||||
| ^^^^^^^^
|
||||
|
|
||||
"#);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_method_wrapper_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
def f(): ...
|
||||
|
||||
a = f.__call__",
|
||||
);
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r#"
|
||||
def f(): ...
|
||||
|
||||
a[: <method-wrapper '__call__' of function 'f'>] = f.__call__
|
||||
---------------------------------------------
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/types.pyi:684:7
|
||||
|
|
||||
683 | @final
|
||||
684 | class MethodWrapperType:
|
||||
| ^^^^^^^^^^^^^^^^^
|
||||
685 | @property
|
||||
686 | def __self__(self) -> object: ...
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:6
|
||||
|
|
||||
2 | def f(): ...
|
||||
3 |
|
||||
4 | a[: <method-wrapper '__call__' of function 'f'>] = f.__call__
|
||||
| ^^^^^^^^^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/types.pyi:134:9
|
||||
|
|
||||
132 | ) -> Self: ...
|
||||
133 |
|
||||
134 | def __call__(self, *args: Any, **kwargs: Any) -> Any:
|
||||
| ^^^^^^^^
|
||||
135 | """Call self as a function."""
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:22
|
||||
|
|
||||
2 | def f(): ...
|
||||
3 |
|
||||
4 | a[: <method-wrapper '__call__' of function 'f'>] = f.__call__
|
||||
| ^^^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/types.pyi:77:7
|
||||
|
|
||||
75 | # Make sure this class definition stays roughly in line with `builtins.function`
|
||||
76 | @final
|
||||
77 | class FunctionType:
|
||||
| ^^^^^^^^^^^^
|
||||
78 | """Create a function object.
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:35
|
||||
|
|
||||
2 | def f(): ...
|
||||
3 |
|
||||
4 | a[: <method-wrapper '__call__' of function 'f'>] = f.__call__
|
||||
| ^^^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> main.py:2:5
|
||||
|
|
||||
2 | def f(): ...
|
||||
| ^
|
||||
3 |
|
||||
4 | a = f.__call__
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:45
|
||||
|
|
||||
2 | def f(): ...
|
||||
3 |
|
||||
4 | a[: <method-wrapper '__call__' of function 'f'>] = f.__call__
|
||||
| ^
|
||||
|
|
||||
"#);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_newtype_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
from typing import NewType
|
||||
|
||||
N = NewType('N', str)
|
||||
|
||||
Y = N",
|
||||
);
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r#"
|
||||
from typing import NewType
|
||||
|
||||
N[: <NewType pseudo-class 'N'>] = NewType([name=]'N', [tp=]str)
|
||||
|
||||
Y[: <NewType pseudo-class 'N'>] = N
|
||||
---------------------------------------------
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/typing.pyi:615:11
|
||||
|
|
||||
613 | TypeGuard: _SpecialForm
|
||||
614 |
|
||||
615 | class NewType:
|
||||
| ^^^^^^^
|
||||
616 | """NewType creates simple unique types with almost zero runtime overhead.
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:6
|
||||
|
|
||||
2 | from typing import NewType
|
||||
3 |
|
||||
4 | N[: <NewType pseudo-class 'N'>] = NewType([name=]'N', [tp=]str)
|
||||
| ^^^^^^^
|
||||
5 |
|
||||
6 | Y[: <NewType pseudo-class 'N'>] = N
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> main.py:4:1
|
||||
|
|
||||
2 | from typing import NewType
|
||||
3 |
|
||||
4 | N = NewType('N', str)
|
||||
| ^
|
||||
5 |
|
||||
6 | Y = N
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:28
|
||||
|
|
||||
2 | from typing import NewType
|
||||
3 |
|
||||
4 | N[: <NewType pseudo-class 'N'>] = NewType([name=]'N', [tp=]str)
|
||||
| ^
|
||||
5 |
|
||||
6 | Y[: <NewType pseudo-class 'N'>] = N
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/typing.pyi:637:28
|
||||
|
|
||||
635 | """
|
||||
636 |
|
||||
637 | def __init__(self, name: str, tp: Any) -> None: ... # AnnotationForm
|
||||
| ^^^^
|
||||
638 | if sys.version_info >= (3, 11):
|
||||
639 | @staticmethod
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:44
|
||||
|
|
||||
2 | from typing import NewType
|
||||
3 |
|
||||
4 | N[: <NewType pseudo-class 'N'>] = NewType([name=]'N', [tp=]str)
|
||||
| ^^^^
|
||||
5 |
|
||||
6 | Y[: <NewType pseudo-class 'N'>] = N
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/typing.pyi:637:39
|
||||
|
|
||||
635 | """
|
||||
636 |
|
||||
637 | def __init__(self, name: str, tp: Any) -> None: ... # AnnotationForm
|
||||
| ^^
|
||||
638 | if sys.version_info >= (3, 11):
|
||||
639 | @staticmethod
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:56
|
||||
|
|
||||
2 | from typing import NewType
|
||||
3 |
|
||||
4 | N[: <NewType pseudo-class 'N'>] = NewType([name=]'N', [tp=]str)
|
||||
| ^^
|
||||
5 |
|
||||
6 | Y[: <NewType pseudo-class 'N'>] = N
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/typing.pyi:615:11
|
||||
|
|
||||
613 | TypeGuard: _SpecialForm
|
||||
614 |
|
||||
615 | class NewType:
|
||||
| ^^^^^^^
|
||||
616 | """NewType creates simple unique types with almost zero runtime overhead.
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:6:6
|
||||
|
|
||||
4 | N[: <NewType pseudo-class 'N'>] = NewType([name=]'N', [tp=]str)
|
||||
5 |
|
||||
6 | Y[: <NewType pseudo-class 'N'>] = N
|
||||
| ^^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> main.py:4:1
|
||||
|
|
||||
2 | from typing import NewType
|
||||
3 |
|
||||
4 | N = NewType('N', str)
|
||||
| ^
|
||||
5 |
|
||||
6 | Y = N
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:6:28
|
||||
|
|
||||
4 | N[: <NewType pseudo-class 'N'>] = NewType([name=]'N', [tp=]str)
|
||||
5 |
|
||||
6 | Y[: <NewType pseudo-class 'N'>] = N
|
||||
| ^
|
||||
|
|
||||
"#);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_meta_typevar_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
def f[T](x: type[T]):
|
||||
y = x",
|
||||
);
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r#"
|
||||
def f[T](x: type[T]):
|
||||
y[: type[T@f]] = x
|
||||
---------------------------------------------
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/builtins.pyi:247:7
|
||||
|
|
||||
246 | @disjoint_base
|
||||
247 | class type:
|
||||
| ^^^^
|
||||
248 | """type(object) -> the object's type
|
||||
249 | type(name, bases, dict, **kwds) -> a new type
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:3:9
|
||||
|
|
||||
2 | def f[T](x: type[T]):
|
||||
3 | y[: type[T@f]] = x
|
||||
| ^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> main.py:2:7
|
||||
|
|
||||
2 | def f[T](x: type[T]):
|
||||
| ^
|
||||
3 | y = x
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:3:14
|
||||
|
|
||||
2 | def f[T](x: type[T]):
|
||||
3 | y[: type[T@f]] = x
|
||||
| ^^^
|
||||
|
|
||||
|
||||
---------------------------------------------
|
||||
info[inlay-hint-edit]: File after edits
|
||||
info: Source
|
||||
|
||||
def f[T](x: type[T]):
|
||||
y: type[T@f] = x
|
||||
"#);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_subscripted_protocol_inlay_hint() {
|
||||
let mut test = inlay_hint_test(
|
||||
"
|
||||
from typing import Protocol, TypeVar
|
||||
T = TypeVar('T')
|
||||
Strange = Protocol[T]",
|
||||
);
|
||||
|
||||
assert_snapshot!(test.inlay_hints(), @r"
|
||||
from typing import Protocol, TypeVar
|
||||
T[: typing.TypeVar] = TypeVar([name=]'T')
|
||||
Strange[: <special form 'typing.Protocol[T]'>] = Protocol[T]
|
||||
---------------------------------------------
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> main.py:3:1
|
||||
|
|
||||
2 | from typing import Protocol, TypeVar
|
||||
3 | T = TypeVar('T')
|
||||
| ^
|
||||
4 | Strange = Protocol[T]
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:3:5
|
||||
|
|
||||
2 | from typing import Protocol, TypeVar
|
||||
3 | T[: typing.TypeVar] = TypeVar([name=]'T')
|
||||
| ^^^^^^^^^^^^^^
|
||||
4 | Strange[: <special form 'typing.Protocol[T]'>] = Protocol[T]
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/typing.pyi:276:13
|
||||
|
|
||||
274 | def __new__(
|
||||
275 | cls,
|
||||
276 | name: str,
|
||||
| ^^^^
|
||||
277 | *constraints: Any, # AnnotationForm
|
||||
278 | bound: Any | None = None, # AnnotationForm
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:3:32
|
||||
|
|
||||
2 | from typing import Protocol, TypeVar
|
||||
3 | T[: typing.TypeVar] = TypeVar([name=]'T')
|
||||
| ^^^^
|
||||
4 | Strange[: <special form 'typing.Protocol[T]'>] = Protocol[T]
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> stdlib/typing.pyi:341:1
|
||||
|
|
||||
340 | Union: _SpecialForm
|
||||
341 | Protocol: _SpecialForm
|
||||
| ^^^^^^^^
|
||||
342 | Callable: _SpecialForm
|
||||
343 | Type: _SpecialForm
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:26
|
||||
|
|
||||
2 | from typing import Protocol, TypeVar
|
||||
3 | T[: typing.TypeVar] = TypeVar([name=]'T')
|
||||
4 | Strange[: <special form 'typing.Protocol[T]'>] = Protocol[T]
|
||||
| ^^^^^^^^^^^^^^^
|
||||
|
|
||||
|
||||
info[inlay-hint-location]: Inlay Hint Target
|
||||
--> main.py:3:1
|
||||
|
|
||||
2 | from typing import Protocol, TypeVar
|
||||
3 | T = TypeVar('T')
|
||||
| ^
|
||||
4 | Strange = Protocol[T]
|
||||
|
|
||||
info: Source
|
||||
--> main2.py:4:42
|
||||
|
|
||||
2 | from typing import Protocol, TypeVar
|
||||
3 | T[: typing.TypeVar] = TypeVar([name=]'T')
|
||||
4 | Strange[: <special form 'typing.Protocol[T]'>] = Protocol[T]
|
||||
| ^
|
||||
|
|
||||
|
||||
---------------------------------------------
|
||||
info[inlay-hint-edit]: File after edits
|
||||
info: Source
|
||||
|
||||
from typing import Protocol, TypeVar
|
||||
T: typing.TypeVar = TypeVar('T')
|
||||
Strange = Protocol[T]
|
||||
");
|
||||
}
|
||||
|
||||
struct InlayHintLocationDiagnostic {
|
||||
source: FileRange,
|
||||
target: FileRange,
|
||||
|
|
|
|||
|
|
@@ -84,7 +84,7 @@ pub fn rename(

/// Helper function to check if a file is included in the project.
fn is_file_in_project(db: &dyn Db, file: File) -> bool {
-    db.project().files(db).contains(&file)
+    file.path(db).is_system_virtual_path() || db.project().files(db).contains(&file)
}

#[cfg(test)]
|
||||
|
|
|
|||
|
|
@@ -302,17 +302,25 @@ impl<'db> SemanticTokenVisitor<'db> {
        let parsed = parsed_module(db, definition.file(db));
        let ty = parameter.node(&parsed.load(db)).inferred_type(&model);

-        if let Some(ty) = ty
-            && let Type::TypeVar(type_var) = ty
-        {
-            match type_var.typevar(db).kind(db) {
-                TypeVarKind::TypingSelf => {
-                    return Some((SemanticTokenType::SelfParameter, modifiers));
+        if let Some(ty) = ty {
+            let type_var = match ty {
+                Type::TypeVar(type_var) => Some((type_var, false)),
+                Type::SubclassOf(subclass_of) => {
+                    subclass_of.into_type_var().map(|var| (var, true))
                }
-                TypeVarKind::Legacy
-                | TypeVarKind::ParamSpec
-                | TypeVarKind::Pep695ParamSpec
-                | TypeVarKind::Pep695 => {}
+                _ => None,
+            };
+
+            if let Some((type_var, is_cls)) = type_var
+                && matches!(type_var.typevar(db).kind(db), TypeVarKind::TypingSelf)
+            {
+                let kind = if is_cls {
+                    SemanticTokenType::ClsParameter
+                } else {
+                    SemanticTokenType::SelfParameter
+                };
+
+                return Some((kind, modifiers));
            }
        }
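The change above widens the first-parameter classification: a parameter typed `Self` is highlighted as a self parameter, and the `type[...]` (SubclassOf) wrapping of `Self` is now classified as a cls parameter. A rough sketch of that mapping, with names of our own choosing (not ty's API):

```python
# Hypothetical sketch of the first-parameter classification added above.
def classify_parameter(ty: str) -> str:
    if ty == "Self":
        # Plain `Self` on an instance method.
        return "SelfParameter"
    if ty == "type[Self]":
        # `type[Self]`, i.e. the implicit type of `cls` on a classmethod.
        return "ClsParameter"
    # Anything else is an ordinary parameter.
    return "Parameter"
```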
|
||||
|
||||
|
|
@@ -1203,7 +1211,7 @@ class MyClass:
"
class MyClass:
    @classmethod
-    def method(cls, x): pass
+    def method(cls, x): print(cls)
",
);

@@ -1215,6 +1223,8 @@ class MyClass:
"method" @ 41..47: Method [definition]
"cls" @ 48..51: ClsParameter [definition]
"x" @ 53..54: Parameter [definition]
+"print" @ 57..62: Function
+"cls" @ 63..66: ClsParameter
"#);
}
|
||||
|
||||
|
|
@@ -1246,7 +1256,7 @@ class MyClass:
class MyClass:
    def method(instance, x): pass
    @classmethod
-    def other(klass, y): pass
+    def other(klass, y): print(klass)
    def complex_method(instance, posonly, /, regular, *args, kwonly, **kwargs): pass
",
);

@@ -1262,13 +1272,15 @@ class MyClass:
"other" @ 75..80: Method [definition]
"klass" @ 81..86: ClsParameter [definition]
"y" @ 88..89: Parameter [definition]
-"complex_method" @ 105..119: Method [definition]
-"instance" @ 120..128: SelfParameter [definition]
-"posonly" @ 130..137: Parameter [definition]
-"regular" @ 142..149: Parameter [definition]
-"args" @ 152..156: Parameter [definition]
-"kwonly" @ 158..164: Parameter [definition]
-"kwargs" @ 168..174: Parameter [definition]
+"print" @ 92..97: Function
+"klass" @ 98..103: ClsParameter
+"complex_method" @ 113..127: Method [definition]
+"instance" @ 128..136: SelfParameter [definition]
+"posonly" @ 138..145: Parameter [definition]
+"regular" @ 150..157: Parameter [definition]
+"args" @ 160..164: Parameter [definition]
+"kwonly" @ 166..172: Parameter [definition]
+"kwargs" @ 176..182: Parameter [definition]
"#);
}
|
||||
|
||||
|
|
|
|||
|
|
@@ -10,10 +10,10 @@ use ruff_db::files::File;
use ruff_db::parsed::parsed_module;
use ruff_index::{IndexVec, newtype_index};
use ruff_python_ast as ast;
-use ruff_python_ast::name::Name;
+use ruff_python_ast::name::{Name, UnqualifiedName};
use ruff_python_ast::visitor::source_order::{self, SourceOrderVisitor};
use ruff_text_size::{Ranged, TextRange};
-use rustc_hash::FxHashSet;
+use rustc_hash::{FxHashMap, FxHashSet};
use ty_project::Db;
use ty_python_semantic::{ModuleName, resolve_module};
|
||||
|
||||
|
|
@ -375,7 +375,11 @@ pub(crate) fn symbols_for_file(db: &dyn Db, file: File) -> FlatSymbols {
|
|||
/// While callers can convert this into a hierarchical collection of
|
||||
/// symbols, it won't result in anything meaningful since the flat list
|
||||
/// returned doesn't include children.
|
||||
#[salsa::tracked(returns(ref), heap_size=ruff_memory_usage::heap_size)]
|
||||
#[salsa::tracked(
|
||||
returns(ref),
|
||||
cycle_initial=symbols_for_file_global_only_cycle_initial,
|
||||
heap_size=ruff_memory_usage::heap_size,
|
||||
)]
|
||||
pub(crate) fn symbols_for_file_global_only(db: &dyn Db, file: File) -> FlatSymbols {
|
||||
let parsed = parsed_module(db, file);
|
||||
let module = parsed.load(db);
|
||||
|
|
@ -394,6 +398,14 @@ pub(crate) fn symbols_for_file_global_only(db: &dyn Db, file: File) -> FlatSymbo
|
|||
visitor.into_flat_symbols()
|
||||
}
|
||||
|
||||
fn symbols_for_file_global_only_cycle_initial(
|
||||
_db: &dyn Db,
|
||||
_id: salsa::Id,
|
||||
_file: File,
|
||||
) -> FlatSymbols {
|
||||
FlatSymbols::default()
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, PartialEq, Eq, get_size2::GetSize)]
|
||||
struct SymbolTree {
|
||||
parent: Option<SymbolId>,
|
||||
|
|
@ -411,6 +423,189 @@ enum ImportKind {
|
|||
Wildcard,
|
||||
}
|
||||
|
||||
/// An abstraction for managing module scope imports.
|
||||
///
|
||||
/// This is meant to recognize the following idioms for updating
|
||||
/// `__all__` in module scope:
|
||||
///
|
||||
/// ```ignore
|
||||
/// __all__ += submodule.__all__
|
||||
/// __all__.extend(submodule.__all__)
|
||||
/// ```
|
||||
///
|
||||
/// # Correctness
|
||||
///
|
||||
/// The approach used here is not correct 100% of the time.
|
||||
/// For example, it is fairly easy to defeat:
|
||||
///
|
||||
/// ```ignore
|
||||
/// from numpy import *
|
||||
/// from importlib import resources
|
||||
/// import numpy as np
|
||||
/// np = resources
|
||||
/// __all__ = []
|
||||
/// __all__ += np.__all__
|
||||
/// ```
|
||||
///
|
||||
/// In this example, `np` will still be resolved to the `numpy`
|
||||
/// module instead of the `importlib.resources` module, because this
|
||||
/// abstraction doesn't track all definitions of a name. The result
|
||||
/// is a silently incorrect `__all__`.
|
||||
///
|
||||
/// This abstraction does handle the case where submodules are imported.
|
||||
/// Namely, we do get this case correct:
|
||||
///
|
||||
/// ```ignore
|
||||
/// from importlib.resources import *
|
||||
/// from importlib import resources
|
||||
/// __all__ = []
|
||||
/// __all__ += resources.__all__
|
||||
/// ```
|
||||
///
|
||||
/// We do this by treating all imports in a `from ... import ...`
|
||||
/// statement as *possible* modules. Then when we look up `resources`,
|
||||
/// we attempt to resolve it to an actual module. If that fails, then
|
||||
/// we consider `__all__` invalid.
|
||||
///
|
||||
/// There are likely many other cases that we don't handle as
|
||||
/// well, but which ty does (it has its own `__all__` parsing that
|
||||
/// uses types to deal with such cases). We can add handling for
|
||||
/// those as they come up in real-world examples.
|
||||
///
|
||||
/// # Performance
|
||||
///
|
||||
/// This abstraction recognizes that, out of all possible imports,
|
||||
/// it is very rare for one to be used to update `__all__`. Therefore,
|
||||
/// we are careful not to do too much work up-front (like eagerly
|
||||
/// manifesting `ModuleName` values).
|
||||
#[derive(Clone, Debug, Default, get_size2::GetSize)]
|
||||
struct Imports<'db> {
|
||||
/// A map from the name that a module is available
|
||||
/// under to its actual module name (and our level
|
||||
/// of certainty that it ought to be treated as a module).
|
||||
module_names: FxHashMap<&'db str, ImportModuleKind<'db>>,
|
||||
}
|
||||
|
||||
impl<'db> Imports<'db> {
|
||||
/// Track the imports from the given `import ...` statement.
|
||||
fn add_import(&mut self, import: &'db ast::StmtImport) {
|
||||
for alias in &import.names {
|
||||
let asname = alias
|
||||
.asname
|
||||
.as_ref()
|
||||
.map(|ident| &ident.id)
|
||||
.unwrap_or(&alias.name.id);
|
||||
let module_name = ImportModuleName::Import(&alias.name.id);
|
||||
self.module_names
|
||||
.insert(asname, ImportModuleKind::Definitive(module_name));
|
||||
}
|
||||
}
|
||||
|
||||
/// Track the imports from the given `from ... import ...` statement.
|
||||
fn add_import_from(&mut self, import_from: &'db ast::StmtImportFrom) {
|
||||
for alias in &import_from.names {
|
||||
if &alias.name == "*" {
|
||||
// FIXME: We'd ideally include the names
|
||||
// imported from the module, but we don't
|
||||
// want to do this eagerly. So supporting
|
||||
// this requires more infrastructure in
|
||||
// `Imports`.
|
||||
continue;
|
||||
}
|
||||
|
||||
let asname = alias
|
||||
.asname
|
||||
.as_ref()
|
||||
.map(|ident| &ident.id)
|
||||
.unwrap_or(&alias.name.id);
|
||||
let module_name = ImportModuleName::ImportFrom {
|
||||
parent: import_from,
|
||||
child: &alias.name.id,
|
||||
};
|
||||
self.module_names
|
||||
.insert(asname, ImportModuleKind::Possible(module_name));
|
||||
}
|
||||
}
|
||||
|
||||
/// Return the symbols exported by the module referred to by `name`.
|
||||
///
|
||||
/// e.g., This can be used to resolve `__all__ += submodule.__all__`,
|
||||
/// where `name` is `submodule`.
|
||||
fn get_module_symbols(
|
||||
&self,
|
||||
db: &'db dyn Db,
|
||||
importing_file: File,
|
||||
name: &Name,
|
||||
) -> Option<&'db FlatSymbols> {
|
||||
let module_name = match self.module_names.get(name.as_str())? {
|
||||
ImportModuleKind::Definitive(name) | ImportModuleKind::Possible(name) => {
|
||||
name.to_module_name(db, importing_file)?
|
||||
}
|
||||
};
|
||||
let module = resolve_module(db, importing_file, &module_name)?;
|
||||
Some(symbols_for_file_global_only(db, module.file(db)?))
|
||||
}
|
||||
}
|
||||
|
||||
/// Describes the level of certainty that an import is a module.
|
||||
///
|
||||
/// For example, given `import foo`, `foo` is definitively a module. But
|
||||
/// given `from quux import foo`, `foo` (i.e., `quux.foo`) is only possibly a module.
|
||||
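To make the "definitive vs. possible" distinction concrete, here is a runnable Python sketch using stdlib names (the particular modules chosen are illustrative, not from the diff):

```python
import types

import json                      # `json` is definitively a module
from importlib import resources  # `resources` is only *possibly* a module (here it is one)
from os import getcwd            # ...while `getcwd` turns out not to be

print(isinstance(json, types.ModuleType))       # True
print(isinstance(resources, types.ModuleType))  # True
print(isinstance(getcwd, types.ModuleType))     # False
```

A `from ... import ...` name can only be classified by actually attempting module resolution, which is why such names are recorded as `Possible` and resolved lazily.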
#[derive(Debug, Clone, Copy, get_size2::GetSize)]
|
||||
enum ImportModuleKind<'db> {
|
||||
Definitive(ImportModuleName<'db>),
|
||||
Possible(ImportModuleName<'db>),
|
||||
}
|
||||
|
||||
/// A representation of something that can be turned into a
|
||||
/// `ModuleName`.
|
||||
///
|
||||
/// We don't do this eagerly, and instead represent the constituent
|
||||
/// pieces, in order to avoid the work needed to build a `ModuleName`.
|
||||
/// In particular, it is somewhat rare for the visitor to need
|
||||
/// to access the imports found in a module. At time of writing
|
||||
/// (2025-12-10), this only happens when referencing a submodule
|
||||
/// to augment an `__all__` definition. For example, as found in
|
||||
/// `matplotlib`:
|
||||
///
|
||||
/// ```ignore
|
||||
/// import numpy as np
|
||||
/// __all__ = ['rand', 'randn', 'repmat']
|
||||
/// __all__ += np.__all__
|
||||
/// ```
|
||||
///
|
||||
/// This construct is somewhat rare, and it would be wasteful to allocate
|
||||
/// a `ModuleName` for every imported item unnecessarily.
|
||||
#[derive(Debug, Clone, Copy, get_size2::GetSize)]
|
||||
enum ImportModuleName<'db> {
|
||||
/// The `foo` in `import quux, foo as blah, baz`.
|
||||
Import(&'db Name),
|
||||
/// A possible module in a `from ... import ...` statement.
|
||||
ImportFrom {
|
||||
/// The `..foo` in `from ..foo import quux`.
|
||||
parent: &'db ast::StmtImportFrom,
|
||||
/// The `foo` in `from quux import foo`.
|
||||
child: &'db Name,
|
||||
},
|
||||
}
|
||||
|
||||
impl<'db> ImportModuleName<'db> {
|
||||
/// Converts the lazy representation of a module name into an
|
||||
/// actual `ModuleName` that can be used for module resolution.
|
||||
fn to_module_name(self, db: &'db dyn Db, importing_file: File) -> Option<ModuleName> {
|
||||
match self {
|
||||
ImportModuleName::Import(name) => ModuleName::new(name),
|
||||
ImportModuleName::ImportFrom { parent, child } => {
|
||||
let mut module_name =
|
||||
ModuleName::from_import_statement(db, importing_file, parent).ok()?;
|
||||
let child_module_name = ModuleName::new(child)?;
|
||||
module_name.extend(&child_module_name);
|
||||
Some(module_name)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// A visitor over all symbols in a single file.
|
||||
///
|
||||
/// This guarantees that child symbols have a symbol ID greater
|
||||
|
|
@ -431,7 +626,11 @@ struct SymbolVisitor<'db> {
|
|||
/// This is true even when we're inside a function definition
|
||||
/// that is inside a class.
|
||||
in_class: bool,
|
||||
global_only: bool,
|
||||
/// When enabled, the visitor should only try to extract
|
||||
/// symbols from a module that we believe form the "exported"
|
||||
/// interface for that module. That is, `__all__` is only respected
|
||||
/// when this is enabled. It's otherwise ignored.
|
||||
exports_only: bool,
|
||||
/// The origin of an `__all__` variable, if found.
|
||||
all_origin: Option<DunderAllOrigin>,
|
||||
/// A set of names extracted from `__all__`.
|
||||
|
|
@ -440,6 +639,11 @@ struct SymbolVisitor<'db> {
|
|||
/// `__all__` idioms or there are any invalid elements in
|
||||
/// `__all__`.
|
||||
all_invalid: bool,
|
||||
/// A collection of imports found while visiting the AST.
|
||||
///
|
||||
/// These are used to help resolve references to modules
|
||||
/// in some limited cases.
|
||||
imports: Imports<'db>,
|
||||
}
|
||||
|
||||
impl<'db> SymbolVisitor<'db> {
|
||||
|
|
@ -451,21 +655,27 @@ impl<'db> SymbolVisitor<'db> {
|
|||
symbol_stack: vec![],
|
||||
in_function: false,
|
||||
in_class: false,
|
||||
global_only: false,
|
||||
exports_only: false,
|
||||
all_origin: None,
|
||||
all_names: FxHashSet::default(),
|
||||
all_invalid: false,
|
||||
imports: Imports::default(),
|
||||
}
|
||||
}
|
||||
|
||||
fn globals(db: &'db dyn Db, file: File) -> Self {
|
||||
Self {
|
||||
global_only: true,
|
||||
exports_only: true,
|
||||
..Self::tree(db, file)
|
||||
}
|
||||
}
|
||||
|
||||
fn into_flat_symbols(mut self) -> FlatSymbols {
|
||||
// If `__all__` was found but wasn't recognized,
|
||||
// then we emit a debug message to that effect.
|
||||
if self.all_invalid {
|
||||
tracing::debug!("Invalid `__all__` in `{}`", self.file.path(self.db));
|
||||
}
|
||||
// We want to filter out some of the symbols we collected.
|
||||
// Specifically, to respect conventions around a library's
|
||||
// interface.
|
||||
|
|
@ -474,12 +684,28 @@ impl<'db> SymbolVisitor<'db> {
|
|||
// their position in a sequence. So when we filter some
|
||||
// out, we need to remap the identifiers.
|
||||
//
|
||||
// N.B. The remapping could be skipped when `global_only` is
|
||||
// We also want to deduplicate when `exports_only` is
|
||||
// `true`. In particular, dealing with `__all__` can
|
||||
// result in cycles, and we need to make sure our output
|
||||
// is stable for that reason.
|
||||
//
|
||||
// N.B. The remapping could be skipped when `exports_only` is
|
||||
// true, since in that case, none of the symbols have a parent
|
||||
// ID by construction.
|
||||
let mut remap = IndexVec::with_capacity(self.symbols.len());
|
||||
let mut seen = self.exports_only.then(FxHashSet::default);
|
||||
let mut new = IndexVec::with_capacity(self.symbols.len());
|
||||
for mut symbol in std::mem::take(&mut self.symbols) {
|
||||
// If we're deduplicating and we've already seen
|
||||
// this symbol, then skip it.
|
||||
//
|
||||
// FIXME: We should do this without copying every
|
||||
// symbol name. ---AG
|
||||
if let Some(ref mut seen) = seen {
|
||||
if !seen.insert(symbol.name.clone()) {
|
||||
continue;
|
||||
}
|
||||
}
|
||||
if !self.is_part_of_library_interface(&symbol) {
|
||||
remap.push(None);
|
||||
continue;
|
||||
|
|
@ -510,7 +736,7 @@ impl<'db> SymbolVisitor<'db> {
|
|||
}
|
||||
}
|
||||
|
||||
fn visit_body(&mut self, body: &[ast::Stmt]) {
|
||||
fn visit_body(&mut self, body: &'db [ast::Stmt]) {
|
||||
for stmt in body {
|
||||
self.visit_stmt(stmt);
|
||||
}
|
||||
|
|
@ -585,6 +811,11 @@ impl<'db> SymbolVisitor<'db> {
|
|||
///
|
||||
/// If the assignment isn't for `__all__`, then this is a no-op.
|
||||
fn add_all_assignment(&mut self, targets: &[ast::Expr], value: Option<&ast::Expr>) {
|
||||
// We don't care about `__all__` unless we're
|
||||
// specifically looking for exported symbols.
|
||||
if !self.exports_only {
|
||||
return;
|
||||
}
|
||||
if self.in_function || self.in_class {
|
||||
return;
|
||||
}
|
||||
|
|
@ -635,6 +866,31 @@ impl<'db> SymbolVisitor<'db> {
|
|||
ast::Expr::List(ast::ExprList { elts, .. })
|
||||
| ast::Expr::Tuple(ast::ExprTuple { elts, .. })
|
||||
| ast::Expr::Set(ast::ExprSet { elts, .. }) => self.add_all_names(elts),
|
||||
// `__all__ += module.__all__`
|
||||
// `__all__.extend(module.__all__)`
|
||||
ast::Expr::Attribute(ast::ExprAttribute { .. }) => {
|
||||
let Some(unqualified) = UnqualifiedName::from_expr(expr) else {
|
||||
return false;
|
||||
};
|
||||
let Some((&attr, rest)) = unqualified.segments().split_last() else {
|
||||
return false;
|
||||
};
|
||||
if attr != "__all__" {
|
||||
return false;
|
||||
}
|
||||
let possible_module_name = Name::new(rest.join("."));
|
||||
let Some(symbols) =
|
||||
self.imports
|
||||
.get_module_symbols(self.db, self.file, &possible_module_name)
|
||||
else {
|
||||
return false;
|
||||
};
|
||||
let Some(ref all) = symbols.all_names else {
|
||||
return false;
|
||||
};
|
||||
self.all_names.extend(all.iter().cloned());
|
||||
true
|
||||
}
|
||||
_ => false,
|
||||
}
|
||||
}
|
||||
|
|
@ -801,14 +1057,11 @@ impl<'db> SymbolVisitor<'db> {
|
|||
// if a name should be part of the exported API of a module
|
||||
// or not. When there is `__all__`, we currently follow it
|
||||
// strictly.
|
||||
if self.all_origin.is_some() {
|
||||
// If `__all__` is somehow invalid, ignore it and fall
|
||||
// through as-if `__all__` didn't exist.
|
||||
if self.all_invalid {
|
||||
tracing::debug!("Invalid `__all__` in `{}`", self.file.path(self.db));
|
||||
} else {
|
||||
return self.all_names.contains(&*symbol.name);
|
||||
}
|
||||
//
|
||||
// If `__all__` is somehow invalid, ignore it and fall
|
||||
// through as if `__all__` didn't exist.
|
||||
if self.all_origin.is_some() && !self.all_invalid {
|
||||
return self.all_names.contains(&*symbol.name);
|
||||
}
|
||||
|
||||
// "Imported symbols are considered private by default. A fixed
|
||||
|
|
@ -839,8 +1092,8 @@ impl<'db> SymbolVisitor<'db> {
|
|||
}
|
||||
}
|
||||
|
||||
impl SourceOrderVisitor<'_> for SymbolVisitor<'_> {
|
||||
fn visit_stmt(&mut self, stmt: &ast::Stmt) {
|
||||
impl<'db> SourceOrderVisitor<'db> for SymbolVisitor<'db> {
|
||||
fn visit_stmt(&mut self, stmt: &'db ast::Stmt) {
|
||||
match stmt {
|
||||
ast::Stmt::FunctionDef(func_def) => {
|
||||
let kind = if self
|
||||
|
|
@ -865,7 +1118,7 @@ impl SourceOrderVisitor<'_> for SymbolVisitor<'_> {
|
|||
import_kind: None,
|
||||
};
|
||||
|
||||
if self.global_only {
|
||||
if self.exports_only {
|
||||
self.add_symbol(symbol);
|
||||
// If exports_only, don't walk function bodies
|
||||
return;
|
||||
|
|
@ -894,7 +1147,7 @@ impl SourceOrderVisitor<'_> for SymbolVisitor<'_> {
|
|||
import_kind: None,
|
||||
};
|
||||
|
||||
if self.global_only {
|
||||
if self.exports_only {
|
||||
self.add_symbol(symbol);
|
||||
// If exports_only, don't walk class bodies
|
||||
return;
|
||||
|
|
@ -943,6 +1196,12 @@ impl SourceOrderVisitor<'_> for SymbolVisitor<'_> {
|
|||
ast::Stmt::AugAssign(ast::StmtAugAssign {
|
||||
target, op, value, ..
|
||||
}) => {
|
||||
// We don't care about `__all__` unless we're
|
||||
// specifically looking for exported symbols.
|
||||
if !self.exports_only {
|
||||
return;
|
||||
}
|
||||
|
||||
if self.all_origin.is_none() {
|
||||
// We can't update `__all__` if it doesn't already
|
||||
// exist.
|
||||
|
|
@ -961,6 +1220,12 @@ impl SourceOrderVisitor<'_> for SymbolVisitor<'_> {
|
|||
}
|
||||
}
|
||||
ast::Stmt::Expr(expr) => {
|
||||
// We don't care about `__all__` unless we're
|
||||
// specifically looking for exported symbols.
|
||||
if !self.exports_only {
|
||||
return;
|
||||
}
|
||||
|
||||
if self.all_origin.is_none() {
|
||||
// We can't update `__all__` if it doesn't already exist.
|
||||
return;
|
||||
|
|
@ -990,19 +1255,33 @@ impl SourceOrderVisitor<'_> for SymbolVisitor<'_> {
|
|||
source_order::walk_stmt(self, stmt);
|
||||
}
|
||||
ast::Stmt::Import(import) => {
|
||||
// We ignore any names introduced by imports
|
||||
// unless we're specifically looking for the
|
||||
// set of exported symbols.
|
||||
if !self.exports_only {
|
||||
return;
|
||||
}
|
||||
// We only consider imports in global scope.
|
||||
if self.in_function {
|
||||
return;
|
||||
}
|
||||
self.imports.add_import(import);
|
||||
for alias in &import.names {
|
||||
self.add_import_alias(stmt, alias);
|
||||
}
|
||||
}
|
||||
ast::Stmt::ImportFrom(import_from) => {
|
||||
// We ignore any names introduced by imports
|
||||
// unless we're specifically looking for the
|
||||
// set of exported symbols.
|
||||
if !self.exports_only {
|
||||
return;
|
||||
}
|
||||
// We only consider imports in global scope.
|
||||
if self.in_function {
|
||||
return;
|
||||
}
|
||||
self.imports.add_import_from(import_from);
|
||||
for alias in &import_from.names {
|
||||
if &alias.name == "*" {
|
||||
self.add_exported_from_wildcard(import_from);
|
||||
|
|
@ -1975,6 +2254,363 @@ class X:
|
|||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_statement_plus_equals() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source(
|
||||
"foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"import foo
|
||||
from foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__ += foo.__all__
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_statement_extend() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source(
|
||||
"foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"import foo
|
||||
from foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__.extend(foo.__all__)
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_statement_alias() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source(
|
||||
"foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"import foo as blah
|
||||
from foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__ += blah.__all__
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_statement_nested_alias() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source("parent/__init__.py", "")
|
||||
.source(
|
||||
"parent/foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"import parent.foo as blah
|
||||
from parent.foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__ += blah.__all__
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_from_statement_plus_equals() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source("parent/__init__.py", "")
|
||||
.source(
|
||||
"parent/foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"from parent import foo
|
||||
from parent.foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__ += foo.__all__
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_from_statement_nested_module_reference() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source("parent/__init__.py", "")
|
||||
.source(
|
||||
"parent/foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"import parent.foo
|
||||
from parent.foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__ += parent.foo.__all__
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_from_statement_extend() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source("parent/__init__.py", "")
|
||||
.source(
|
||||
"parent/foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"import parent.foo
|
||||
from parent.foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__.extend(parent.foo.__all__)
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_from_statement_alias() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source("parent/__init__.py", "")
|
||||
.source(
|
||||
"parent/foo.py",
|
||||
"
|
||||
_ZQZQZQ = 1
|
||||
__all__ = ['_ZQZQZQ']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"from parent import foo as blah
|
||||
from parent.foo import *
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__ += blah.__all__
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZQZQZQ :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_cycle1() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source(
|
||||
"a.py",
|
||||
"from b import *
|
||||
import b
|
||||
_ZAZAZA = 1
|
||||
__all__ = ['_ZAZAZA']
|
||||
__all__ += b.__all__
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"b.py",
|
||||
"
|
||||
from a import *
|
||||
import a
|
||||
_ZBZBZB = 1
|
||||
__all__ = ['_ZBZBZB']
|
||||
__all__ += a.__all__
|
||||
",
|
||||
)
|
||||
.build();
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("a.py"),
|
||||
@r"
|
||||
_ZBZBZB :: Constant
|
||||
_ZAZAZA :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_statement_failure1() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source(
|
||||
"foo.py",
|
||||
"
|
||||
_ZFZFZF = 1
|
||||
__all__ = ['_ZFZFZF']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"bar.py",
|
||||
"
|
||||
_ZBZBZB = 1
|
||||
__all__ = ['_ZBZBZB']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"import foo
|
||||
import bar
|
||||
from foo import *
|
||||
from bar import *
|
||||
|
||||
foo = bar
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__.extend(foo.__all__)
|
||||
",
|
||||
)
|
||||
.build();
|
||||
// In this test, we resolve `foo.__all__` to the `__all__`
|
||||
// attribute in module `foo` instead of in `bar`. This is
|
||||
// because we don't track redefinitions of imports (as of
|
||||
// 2025-12-11). Handling this correctly would mean exporting
|
||||
// `_ZBZBZB` instead of `_ZFZFZF`.
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZFZFZF :: Constant
|
||||
_ZYZYZY :: Constant
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reexport_and_extend_from_submodule_import_statement_failure2() {
|
||||
let test = PublicTestBuilder::default()
|
||||
.source(
|
||||
"parent/__init__.py",
|
||||
"import parent.foo as foo
|
||||
__all__ = ['foo']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"parent/foo.py",
|
||||
"
|
||||
_ZFZFZF = 1
|
||||
__all__ = ['_ZFZFZF']
|
||||
",
|
||||
)
|
||||
.source(
|
||||
"test.py",
|
||||
"from parent.foo import *
|
||||
from parent import *
|
||||
|
||||
_ZYZYZY = 1
|
||||
__all__ = ['_ZYZYZY']
|
||||
__all__.extend(foo.__all__)
|
||||
",
|
||||
)
|
||||
.build();
|
||||
// This is not quite right either because we end up
|
||||
// considering the `__all__` in `test.py` to be invalid.
|
||||
// Namely, we don't pick up the `foo` that is in scope
|
||||
// from the `from parent import *` import. The correct
|
||||
// answer should just be `_ZFZFZF` and `_ZYZYZY`.
|
||||
insta::assert_snapshot!(
|
||||
test.exports_for("test.py"),
|
||||
@r"
|
||||
_ZFZFZF :: Constant
|
||||
foo :: Module
|
||||
_ZYZYZY :: Constant
|
||||
__all__ :: Variable
|
||||
",
|
||||
);
|
||||
}
|
||||
|
||||
fn matches(query: &str, symbol: &str) -> bool {
|
||||
super::QueryPattern::fuzzy(query).is_match_symbol_name(symbol)
|
||||
}
|
||||
|
|
|
|||
|
|
@ -150,6 +150,62 @@ class Test:
|
|||
");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn ignore_all() {
|
||||
let test = CursorTest::builder()
|
||||
.source(
|
||||
"utils.py",
|
||||
"
|
||||
__all__ = []
|
||||
class Test:
|
||||
def from_path(): ...
|
||||
<CURSOR>",
|
||||
)
|
||||
.build();
|
||||
|
||||
assert_snapshot!(test.workspace_symbols("from"), @r"
|
||||
info[workspace-symbols]: WorkspaceSymbolInfo
|
||||
--> utils.py:4:9
|
||||
|
|
||||
2 | __all__ = []
|
||||
3 | class Test:
|
||||
4 | def from_path(): ...
|
||||
| ^^^^^^^^^
|
||||
|
|
||||
info: Method from_path
|
||||
");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn ignore_imports() {
|
||||
let test = CursorTest::builder()
|
||||
.source(
|
||||
"utils.py",
|
||||
"
|
||||
import re
|
||||
import json as json
|
||||
from collections import defaultdict
|
||||
foo = 1
|
||||
<CURSOR>",
|
||||
)
|
||||
.build();
|
||||
|
||||
assert_snapshot!(test.workspace_symbols("foo"), @r"
|
||||
info[workspace-symbols]: WorkspaceSymbolInfo
|
||||
--> utils.py:5:1
|
||||
|
|
||||
3 | import json as json
|
||||
4 | from collections import defaultdict
|
||||
5 | foo = 1
|
||||
| ^^^
|
||||
|
|
||||
info: Variable foo
|
||||
");
|
||||
assert_snapshot!(test.workspace_symbols("re"), @"No symbols found");
|
||||
assert_snapshot!(test.workspace_symbols("json"), @"No symbols found");
|
||||
assert_snapshot!(test.workspace_symbols("default"), @"No symbols found");
|
||||
}
|
||||
|
||||
impl CursorTest {
|
||||
fn workspace_symbols(&self, query: &str) -> String {
|
||||
let symbols = workspace_symbols(&self.db, query);
|
||||
|
|
|
|||
|
|
@ -285,22 +285,6 @@ impl Options {
|
|||
roots.push(python);
|
||||
}
|
||||
|
||||
// Considering pytest test discovery conventions,
|
||||
// we also include the `tests` directory if it exists and is not a package.
|
||||
let tests_dir = project_root.join("tests");
|
||||
if system.is_directory(&tests_dir)
|
||||
&& !system.is_file(&tests_dir.join("__init__.py"))
|
||||
&& !system.is_file(&tests_dir.join("__init__.pyi"))
|
||||
&& !roots.contains(&tests_dir)
|
||||
{
|
||||
// If the `tests` directory exists and is not a package, include it as a source root.
|
||||
tracing::debug!(
|
||||
"Including `./tests` in `environment.root` because a `./tests` directory exists"
|
||||
);
|
||||
|
||||
roots.push(tests_dir);
|
||||
}
|
||||
|
||||
// The project root should always be included, and should always come
|
||||
// after any subdirectories such as `./src`, `./tests` and/or `./python`.
|
||||
roots.push(project_root.to_path_buf());
|
||||
|
|
@ -532,7 +516,7 @@ pub struct EnvironmentOptions {
|
|||
/// * if a `./<project-name>/<project-name>` directory exists, include `.` and `./<project-name>` in the first party search path
|
||||
/// * otherwise, default to `.` (flat layout)
|
||||
///
|
||||
/// Besides, if a `./python` or `./tests` directory exists and is not a package (i.e. it does not contain an `__init__.py` or `__init__.pyi` file),
|
||||
/// Additionally, if a `./python` directory exists and is not a package (i.e. it does not contain an `__init__.py` or `__init__.pyi` file),
|
||||
/// it will also be included in the first party search path.
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
#[option(
|
||||
|
|
@ -674,7 +658,7 @@ pub struct SrcOptions {
|
|||
/// * if a `./<project-name>/<project-name>` directory exists, include `.` and `./<project-name>` in the first party search path
|
||||
/// * otherwise, default to `.` (flat layout)
|
||||
///
|
||||
/// Besides, if a `./tests` directory exists and is not a package (i.e. it does not contain an `__init__.py` file),
|
||||
/// Additionally, if a `./python` directory exists and is not a package (i.e. it does not contain an `__init__.py` file),
|
||||
/// it will also be included in the first party search path.
|
||||
#[serde(skip_serializing_if = "Option::is_none")]
|
||||
#[option(
|
||||
|
|
|
|||
|
|
@ -0,0 +1,7 @@
|
|||
from typing import TypeAlias, TypeVar
|
||||
|
||||
T = TypeVar("T", bound="A[0]")
|
||||
A: TypeAlias = T
|
||||
def _(x: A):
|
||||
if x:
|
||||
pass
|
||||
|
|
@ -0,0 +1 @@
|
|||
def _[T: (T if cond else U)[0], U](): pass
|
||||
|
|
@ -0,0 +1,3 @@
|
|||
def _[T: T[0]](x: T):
|
||||
if x:
|
||||
pass
|
||||
|
|
@ -0,0 +1,4 @@
|
|||
class _[T: (0, T[0])]:
|
||||
def _(x: T):
|
||||
if x:
|
||||
pass
|
||||
|
|
@ -146,9 +146,10 @@ Foo = NewType(name, int)
|
|||
reveal_type(Foo) # revealed: <NewType pseudo-class 'Foo'>
|
||||
```
|
||||
|
||||
## The second argument must be a class type or another newtype
|
||||
## The base must be a class type or another newtype
|
||||
|
||||
Other typing constructs like `Union` are not allowed.
|
||||
Other typing constructs like `Union` are not _generally_ allowed. (However, see the next section for
|
||||
a couple of special cases.)
|
||||
|
||||
```py
|
||||
from typing_extensions import NewType
|
||||
|
|
@ -167,6 +168,61 @@ on top of that:
|
|||
Foo = NewType("Foo", 42)
|
||||
```
|
||||
|
||||
## `float` and `complex` special cases
|
||||
|
||||
`float` and `complex` are subject to a special case in the typing spec, which we currently interpret
|
||||
to mean that `float` in type position is `int | float`, and `complex` in type position is
|
||||
`int | float | complex`. This is awkward for `NewType`, because as we just tested above, unions
|
||||
aren't generally valid `NewType` bases. However, `float` and `complex` _are_ valid `NewType` bases,
|
||||
and we accept the unions they expand into.
|
||||
|
||||
```py
|
||||
from typing import NewType
|
||||
|
||||
Foo = NewType("Foo", float)
|
||||
Foo(3.14)
|
||||
Foo(42)
|
||||
Foo("hello") # error: [invalid-argument-type] "Argument is incorrect: Expected `int | float`, found `Literal["hello"]`"
|
||||
|
||||
reveal_type(Foo(3.14).__class__) # revealed: type[int] | type[float]
|
||||
reveal_type(Foo(42).__class__) # revealed: type[int] | type[float]
|
||||
|
||||
Bar = NewType("Bar", complex)
|
||||
Bar(1 + 2j)
|
||||
Bar(3.14)
|
||||
Bar(42)
|
||||
Bar("goodbye") # error: [invalid-argument-type]
|
||||
|
||||
reveal_type(Bar(1 + 2j).__class__) # revealed: type[int] | type[float] | type[complex]
|
||||
reveal_type(Bar(3.14).__class__) # revealed: type[int] | type[float] | type[complex]
|
||||
reveal_type(Bar(42).__class__) # revealed: type[int] | type[float] | type[complex]
|
||||
```
|
||||
|
||||
We don't currently try to distinguish between an implicit union (e.g. `float`) and the equivalent
|
||||
explicit union (e.g. `int | float`), so these two explicit unions are also allowed. But again, most
|
||||
unions are not allowed:
|
||||
|
||||
```py
|
||||
Baz = NewType("Baz", int | float)
|
||||
Baz = NewType("Baz", int | float | complex)
|
||||
Baz = NewType("Baz", int | str) # error: [invalid-newtype] "invalid base for `typing.NewType`"
|
||||
```
|
||||
|
||||
Similarly, a `NewType` of `float` or `complex` is valid as a `Callable` of the corresponding union
|
||||
type:
|
||||
|
||||
```py
|
||||
from collections.abc import Callable
|
||||
|
||||
def f(_: Callable[[int | float], Foo]): ...
|
||||
|
||||
f(Foo)
|
||||
|
||||
def g(_: Callable[[int | float | complex], Bar]): ...
|
||||
|
||||
g(Bar)
|
||||
```
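As a side note on the `float` special case exercised above: the `int | float` expansion is purely a static-typing convention, with no runtime counterpart. The sketch below (illustrative only, not part of this commit's test files) shows that the callable returned by `typing.NewType` is just an identity function, so the declared `float` base never converts anything at runtime.

```python
from typing import NewType

# NewType's returned callable is an identity function at runtime, so the
# static `int | float` widening never shows up in actual values: passing an
# int through Foo yields the same int object, unconverted.
Foo = NewType("Foo", float)

x = Foo(42)
assert x == 42 and type(x) is int

y = Foo(3.14)
assert type(y) is float
```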

## A `NewType` definition must be a simple variable assignment

```py
@@ -179,7 +235,7 @@ N: NewType = NewType("N", int) # error: [invalid-newtype] "A `NewType` definiti

Cyclic newtypes are kind of silly, but it's possible for the user to express them, and it's
important that we don't go into infinite recursive loops and crash with a stack overflow. In fact,
this is *why* base type evaluation is deferred; otherwise Salsa itself would crash.
this is _why_ base type evaluation is deferred; otherwise Salsa itself would crash.

```py
from typing_extensions import NewType, reveal_type, cast

@@ -38,6 +38,8 @@ reveal_type(x) # revealed: int

## Unsupported types

<!-- snapshot-diagnostics -->

```py
class C:
    def __isub__(self, other: str) -> int:

@@ -2162,8 +2162,8 @@ Some attributes are special-cased, however:
import types
from ty_extensions import static_assert, TypeOf, is_subtype_of

reveal_type(f.__get__) # revealed: <method-wrapper `__get__` of `f`>
reveal_type(f.__call__) # revealed: <method-wrapper `__call__` of `f`>
reveal_type(f.__get__) # revealed: <method-wrapper '__get__' of function 'f'>
reveal_type(f.__call__) # revealed: <method-wrapper '__call__' of function 'f'>
static_assert(is_subtype_of(TypeOf[f.__get__], types.MethodWrapperType))
static_assert(is_subtype_of(TypeOf[f.__call__], types.MethodWrapperType))
```
@@ -79,31 +79,31 @@ reveal_type(Sub() & Sub()) # revealed: Literal["&"]
reveal_type(Sub() // Sub()) # revealed: Literal["//"]

# No does not implement any of the dunder methods.
# error: [unsupported-operator] "Operator `+` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `No`"
reveal_type(No() + No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `-` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `-` is not supported between two objects of type `No`"
reveal_type(No() - No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `*` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `*` is not supported between two objects of type `No`"
reveal_type(No() * No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `@` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `@` is not supported between two objects of type `No`"
reveal_type(No() @ No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `/` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `/` is not supported between two objects of type `No`"
reveal_type(No() / No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `%` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `%` is not supported between two objects of type `No`"
reveal_type(No() % No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `**` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `**` is not supported between two objects of type `No`"
reveal_type(No() ** No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `<<` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `<<` is not supported between two objects of type `No`"
reveal_type(No() << No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `>>` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `>>` is not supported between two objects of type `No`"
reveal_type(No() >> No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `|` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `|` is not supported between two objects of type `No`"
reveal_type(No() | No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `^` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `^` is not supported between two objects of type `No`"
reveal_type(No() ^ No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `&` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `&` is not supported between two objects of type `No`"
reveal_type(No() & No()) # revealed: Unknown
# error: [unsupported-operator] "Operator `//` is not supported between objects of type `No` and `No`"
# error: [unsupported-operator] "Operator `//` is not supported between two objects of type `No`"
reveal_type(No() // No()) # revealed: Unknown

# Yes does not implement any of the reflected dunder methods.

@@ -293,6 +293,8 @@ reveal_type(Yes() // No()) # revealed: Literal["//"]

## Classes

<!-- snapshot-diagnostics -->

Dunder methods defined in a class are available to instances of that class, but not to the class
itself. (For these operators to work on the class itself, they would have to be defined on the
class's type, i.e. `type`.)

@@ -307,11 +309,11 @@ class Yes:
class Sub(Yes): ...
class No: ...

# error: [unsupported-operator] "Operator `+` is not supported between objects of type `<class 'Yes'>` and `<class 'Yes'>`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'Yes'>`"
reveal_type(Yes + Yes) # revealed: Unknown
# error: [unsupported-operator] "Operator `+` is not supported between objects of type `<class 'Sub'>` and `<class 'Sub'>`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'Sub'>`"
reveal_type(Sub + Sub) # revealed: Unknown
# error: [unsupported-operator] "Operator `+` is not supported between objects of type `<class 'No'>` and `<class 'No'>`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'No'>`"
reveal_type(No + No) # revealed: Unknown
```
@@ -336,11 +338,11 @@ def sub() -> type[Sub]:
def no() -> type[No]:
    return No

# error: [unsupported-operator] "Operator `+` is not supported between objects of type `type[Yes]` and `type[Yes]`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `type[Yes]`"
reveal_type(yes() + yes()) # revealed: Unknown
# error: [unsupported-operator] "Operator `+` is not supported between objects of type `type[Sub]` and `type[Sub]`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `type[Sub]`"
reveal_type(sub() + sub()) # revealed: Unknown
# error: [unsupported-operator] "Operator `+` is not supported between objects of type `type[No]` and `type[No]`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `type[No]`"
reveal_type(no() + no()) # revealed: Unknown
```

@@ -350,30 +352,54 @@ reveal_type(no() + no()) # revealed: Unknown
def f():
    pass

# error: [unsupported-operator] "Operator `+` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f + f) # revealed: Unknown
# error: [unsupported-operator] "Operator `-` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `-` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f - f) # revealed: Unknown
# error: [unsupported-operator] "Operator `*` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `*` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f * f) # revealed: Unknown
# error: [unsupported-operator] "Operator `@` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `@` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f @ f) # revealed: Unknown
# error: [unsupported-operator] "Operator `/` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `/` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f / f) # revealed: Unknown
# error: [unsupported-operator] "Operator `%` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `%` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f % f) # revealed: Unknown
# error: [unsupported-operator] "Operator `**` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `**` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f**f) # revealed: Unknown
# error: [unsupported-operator] "Operator `<<` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `<<` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f << f) # revealed: Unknown
# error: [unsupported-operator] "Operator `>>` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `>>` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f >> f) # revealed: Unknown
# error: [unsupported-operator] "Operator `|` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `|` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f | f) # revealed: Unknown
# error: [unsupported-operator] "Operator `^` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `^` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f ^ f) # revealed: Unknown
# error: [unsupported-operator] "Operator `&` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `&` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f & f) # revealed: Unknown
# error: [unsupported-operator] "Operator `//` is not supported between objects of type `def f() -> Unknown` and `def f() -> Unknown`"
# error: [unsupported-operator] "Operator `//` is not supported between two objects of type `def f() -> Unknown`"
reveal_type(f // f) # revealed: Unknown
```

## Classes from different modules with the same name

We use the fully qualified names in diagnostics if the two classes have the same unqualified name,
but are nonetheless different.

<!-- snapshot-diagnostics -->

`mod1.py`:

```py
class A: ...
```

`mod2.py`:

```py
import mod1

class A: ...

# error: [unsupported-operator] "Operator `+` is not supported between objects of type `mod2.A` and `mod1.A`"
A() + mod1.A()
```
@@ -412,7 +412,7 @@ class A:
    def __init__(self):
        self.__add__ = add_impl

# error: [unsupported-operator] "Operator `+` is not supported between objects of type `A` and `A`"
# error: [unsupported-operator] "Operator `+` is not supported between two objects of type `A`"
# revealed: Unknown
reveal_type(A() + A())
```

@@ -18,7 +18,7 @@ cannot be added, because that would require addition of `int` and `str` or vice
def f2(i: int, s: str, int_or_str: int | str):
    i + i
    s + s
    # error: [unsupported-operator] "Operator `+` is not supported between objects of type `int | str` and `int | str`"
    # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `int | str`"
    reveal_type(int_or_str + int_or_str) # revealed: Unknown
```
@@ -34,7 +34,8 @@ from inspect import getattr_static
reveal_type(getattr_static(C, "f")) # revealed: def f(self, x: int) -> str

reveal_type(getattr_static(C, "f").__get__) # revealed: <method-wrapper `__get__` of `f`>
# revealed: <method-wrapper '__get__' of function 'f'>
reveal_type(getattr_static(C, "f").__get__)

reveal_type(getattr_static(C, "f").__get__(None, C)) # revealed: def f(self, x: int) -> str
reveal_type(getattr_static(C, "f").__get__(C(), C)) # revealed: bound method C.f(x: int) -> str

@@ -258,7 +259,7 @@ class C:

method_wrapper = getattr_static(C, "f").__get__

reveal_type(method_wrapper) # revealed: <method-wrapper `__get__` of `f`>
reveal_type(method_wrapper) # revealed: <method-wrapper '__get__' of function 'f'>

# All of these are fine:
method_wrapper(C(), C)

@@ -414,7 +415,8 @@ class C:
    def f(cls): ...

reveal_type(getattr_static(C, "f")) # revealed: def f(cls) -> Unknown
reveal_type(getattr_static(C, "f").__get__) # revealed: <method-wrapper `__get__` of `f`>
# revealed: <method-wrapper '__get__' of function 'f'>
reveal_type(getattr_static(C, "f").__get__)
```

But we correctly model how the `classmethod` descriptor works:

@@ -632,7 +634,7 @@ class MyClass:

static_assert(is_assignable_to(types.FunctionType, Callable))

# revealed: <wrapper-descriptor `__get__` of `function` objects>
# revealed: <wrapper-descriptor '__get__' of 'function' objects>
reveal_type(types.FunctionType.__get__)
static_assert(is_assignable_to(TypeOf[types.FunctionType.__get__], Callable))

@@ -640,7 +642,7 @@ static_assert(is_assignable_to(TypeOf[types.FunctionType.__get__], Callable))
reveal_type(f)
static_assert(is_assignable_to(TypeOf[f], Callable))

# revealed: <method-wrapper `__get__` of `f`>
# revealed: <method-wrapper '__get__' of function 'f'>
reveal_type(f.__get__)
static_assert(is_assignable_to(TypeOf[f.__get__], Callable))

@@ -648,11 +650,11 @@ static_assert(is_assignable_to(TypeOf[f.__get__], Callable))
reveal_type(types.FunctionType.__call__)
static_assert(is_assignable_to(TypeOf[types.FunctionType.__call__], Callable))

# revealed: <method-wrapper `__call__` of `f`>
# revealed: <method-wrapper '__call__' of function 'f'>
reveal_type(f.__call__)
static_assert(is_assignable_to(TypeOf[f.__call__], Callable))

# revealed: <wrapper-descriptor `__get__` of `property` objects>
# revealed: <wrapper-descriptor '__get__' of 'property' objects>
reveal_type(property.__get__)
static_assert(is_assignable_to(TypeOf[property.__get__], Callable))

@@ -661,15 +663,15 @@ reveal_type(MyClass.my_property)
static_assert(is_assignable_to(TypeOf[property], Callable))
static_assert(not is_assignable_to(TypeOf[MyClass.my_property], Callable))

# revealed: <method-wrapper `__get__` of `property` object>
# revealed: <method-wrapper '__get__' of property 'my_property'>
reveal_type(MyClass.my_property.__get__)
static_assert(is_assignable_to(TypeOf[MyClass.my_property.__get__], Callable))

# revealed: <wrapper-descriptor `__set__` of `property` objects>
# revealed: <wrapper-descriptor '__set__' of 'property' objects>
reveal_type(property.__set__)
static_assert(is_assignable_to(TypeOf[property.__set__], Callable))

# revealed: <method-wrapper `__set__` of `property` object>
# revealed: <method-wrapper '__set__' of property 'my_property'>
reveal_type(MyClass.my_property.__set__)
static_assert(is_assignable_to(TypeOf[MyClass.my_property.__set__], Callable))

@@ -677,7 +679,7 @@ static_assert(is_assignable_to(TypeOf[MyClass.my_property.__set__], Callable))
reveal_type(str.startswith)
static_assert(is_assignable_to(TypeOf[str.startswith], Callable))

# revealed: <method-wrapper `startswith` of `str` object>
# revealed: <method-wrapper 'startswith' of string 'foo'>
reveal_type("foo".startswith)
static_assert(is_assignable_to(TypeOf["foo".startswith], Callable))
@@ -141,3 +141,18 @@ class C:
        # revealed: (*, kw_only=Unknown | ((*, kw_only=Unknown) -> Unknown)) -> Unknown
        reveal_type(self.d)
```

## Self-referential implicit attributes

```py
class Cyclic:
    def __init__(self, data: str | dict):
        self.data = data

    def update(self):
        if isinstance(self.data, str):
            self.data = {"url": self.data}

# revealed: Unknown | str | dict[Unknown, Unknown] | dict[Unknown | str, Unknown | str]
reveal_type(Cyclic("").data)
```
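The self-referential `Cyclic` test above has a simple runtime counterpart. This sketch (illustrative only, not part of this commit's test files) shows why the checker must track both the `str` and `dict` possibilities for `self.data`: `update()` wraps a string payload in a dict exactly once.

```python
# Runtime companion to the Cyclic mdtest: a str payload is wrapped in a
# dict on the first update(); subsequent calls see a dict and do nothing.
class Cyclic:
    def __init__(self, data):
        self.data = data

    def update(self):
        if isinstance(self.data, str):
            self.data = {"url": self.data}

c = Cyclic("example.com")
c.update()
assert c.data == {"url": "example.com"}
c.update()  # already a dict, so the second call is a no-op
assert c.data == {"url": "example.com"}
```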
@@ -596,14 +596,14 @@ def f(x: object) -> str:
    return "a"

reveal_type(f) # revealed: def f(x: object) -> str
reveal_type(f.__get__) # revealed: <method-wrapper `__get__` of `f`>
reveal_type(f.__get__) # revealed: <method-wrapper '__get__' of function 'f'>
static_assert(is_subtype_of(TypeOf[f.__get__], types.MethodWrapperType))
reveal_type(f.__get__(None, type(f))) # revealed: def f(x: object) -> str
reveal_type(f.__get__(None, type(f))(1)) # revealed: str

wrapper_descriptor = getattr_static(f, "__get__")

reveal_type(wrapper_descriptor) # revealed: <wrapper-descriptor `__get__` of `function` objects>
reveal_type(wrapper_descriptor) # revealed: <wrapper-descriptor '__get__' of 'function' objects>
reveal_type(wrapper_descriptor(f, None, type(f))) # revealed: def f(x: object) -> str
static_assert(is_subtype_of(TypeOf[wrapper_descriptor], types.WrapperDescriptorType))

@@ -277,7 +277,7 @@ T = TypeVar("T", int, str)

def same_constrained_types(t1: T, t2: T) -> T:
    # TODO: no error
    # error: [unsupported-operator] "Operator `+` is not supported between objects of type `T@same_constrained_types` and `T@same_constrained_types`"
    # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `T@same_constrained_types`"
    return t1 + t2
```

@@ -287,7 +287,7 @@ and an `int` and a `str` cannot be added together:

```py
def unions_are_different(t1: int | str, t2: int | str) -> int | str:
    # error: [unsupported-operator] "Operator `+` is not supported between objects of type `int | str` and `int | str`"
    # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `int | str`"
    return t1 + t2
```

@@ -283,7 +283,7 @@ reveal_type(OnlyParamSpec[...]().attr) # revealed: (...) -> None
def func(c: Callable[P2, None]):
    reveal_type(OnlyParamSpec[P2]().attr) # revealed: (**P2@func) -> None

# TODO: error: paramspec is unbound
# error: [invalid-type-arguments] "ParamSpec `P2` is unbound"
reveal_type(OnlyParamSpec[P2]().attr) # revealed: (...) -> None

# error: [invalid-type-arguments] "No type argument provided for required type variable `P1` of class `OnlyParamSpec`"

@@ -327,15 +327,14 @@ reveal_type(TypeVarAndParamSpec[int, [int, str]]().attr) # revealed: (int, str,
reveal_type(TypeVarAndParamSpec[int, [str]]().attr) # revealed: (str, /) -> int
reveal_type(TypeVarAndParamSpec[int, ...]().attr) # revealed: (...) -> int

# TODO: We could still specialize for `T1` as the type is valid which would reveal `(...) -> int`
# TODO: error: paramspec is unbound
reveal_type(TypeVarAndParamSpec[int, P2]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "ParamSpec `P2` is unbound"
reveal_type(TypeVarAndParamSpec[int, P2]().attr) # revealed: (...) -> int
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be either a list of types, `ParamSpec`, `Concatenate`, or `...`"
reveal_type(TypeVarAndParamSpec[int, int]().attr) # revealed: (...) -> int
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, int]().attr) # revealed: (...) -> Unknown
reveal_type(TypeVarAndParamSpec[int, ()]().attr) # revealed: (...) -> int
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, ()]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, (int, str)]().attr) # revealed: (...) -> Unknown
reveal_type(TypeVarAndParamSpec[int, (int, str)]().attr) # revealed: (...) -> int
```

Nor can they be omitted when there is more than one `ParamSpec`.
@@ -104,6 +104,34 @@ S = TypeVar("S", **{"bound": int})
reveal_type(S) # revealed: TypeVar
```

### No explicit specialization

A type variable itself cannot be explicitly specialized; the result of the specialization is
`Unknown`. However, generic PEP 613 type aliases that point to type variables can be explicitly
specialized.

```py
from typing import TypeVar, TypeAlias

T = TypeVar("T")
ImplicitPositive = T
Positive: TypeAlias = T

def _(
    # error: [invalid-type-form] "A type variable itself cannot be specialized"
    a: T[int],
    # error: [invalid-type-form] "A type variable itself cannot be specialized"
    b: T[T],
    # error: [invalid-type-form] "A type variable itself cannot be specialized"
    c: ImplicitPositive[int],
    d: Positive[int],
):
    reveal_type(a) # revealed: Unknown
    reveal_type(b) # revealed: Unknown
    reveal_type(c) # revealed: Unknown
    reveal_type(d) # revealed: int
```

### Type variables with a default

Note that the `__default__` property is only available in Python ≥3.13.
@@ -68,13 +68,91 @@ reveal_type(C[int, int]) # revealed: <type alias 'C[Unknown]'>
And non-generic types cannot be specialized:

```py
from typing import TypeVar, Protocol, TypedDict

type B = ...

# error: [non-subscriptable] "Cannot subscript non-generic type alias"
reveal_type(B[int]) # revealed: Unknown

# error: [non-subscriptable] "Cannot subscript non-generic type alias"
def _(b: B[int]): ...
def _(b: B[int]):
    reveal_type(b) # revealed: Unknown

type IntOrStr = int | str

# error: [non-subscriptable] "Cannot subscript non-generic type alias"
def _(c: IntOrStr[int]):
    reveal_type(c) # revealed: Unknown

type ListOfInts = list[int]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: `list[int]` is already specialized"
def _(l: ListOfInts[int]):
    reveal_type(l) # revealed: Unknown

type List[T] = list[T]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: Double specialization is not allowed"
def _(l: List[int][int]):
    reveal_type(l) # revealed: Unknown

# error: [non-subscriptable] "Cannot subscript non-generic type: `<class 'list[T@DoubleSpecialization]'>` is already specialized"
type DoubleSpecialization[T] = list[T][T]

def _(d: DoubleSpecialization[int]):
    reveal_type(d) # revealed: Unknown

type Tuple = tuple[int, str]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: `tuple[int, str]` is already specialized"
def _(doubly_specialized: Tuple[int]):
    reveal_type(doubly_specialized) # revealed: Unknown

T = TypeVar("T")

class LegacyProto(Protocol[T]):
    pass

type LegacyProtoInt = LegacyProto[int]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: `LegacyProto[int]` is already specialized"
def _(x: LegacyProtoInt[int]):
    reveal_type(x) # revealed: Unknown

class Proto[T](Protocol):
    pass

type ProtoInt = Proto[int]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: `Proto[int]` is already specialized"
def _(x: ProtoInt[int]):
    reveal_type(x) # revealed: Unknown

# TODO: TypedDict is just a function object at runtime, we should emit an error
class LegacyDict(TypedDict[T]):
    x: T

type LegacyDictInt = LegacyDict[int]

# error: [non-subscriptable] "Cannot subscript non-generic type alias"
def _(x: LegacyDictInt[int]):
    reveal_type(x) # revealed: Unknown

class Dict[T](TypedDict):
    x: T

type DictInt = Dict[int]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: `Dict` is already specialized"
def _(x: DictInt[int]):
    reveal_type(x) # revealed: Unknown

type Union = list[str] | list[int]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: `list[str] | list[int]` is already specialized"
def _(x: Union[int]):
    reveal_type(x) # revealed: Unknown
```

If the type variable has an upper bound, the specialized type must satisfy that bound:
@@ -98,6 +176,15 @@ reveal_type(BoundedByUnion[int]) # revealed: <type alias 'BoundedByUnion[int]'>
reveal_type(BoundedByUnion[IntSubclass]) # revealed: <type alias 'BoundedByUnion[IntSubclass]'>
reveal_type(BoundedByUnion[str]) # revealed: <type alias 'BoundedByUnion[str]'>
reveal_type(BoundedByUnion[int | str]) # revealed: <type alias 'BoundedByUnion[int | str]'>

type TupleOfIntAndStr[T: int, U: str] = tuple[T, U]

def _(x: TupleOfIntAndStr[int, str]):
    reveal_type(x) # revealed: tuple[int, str]

# error: [invalid-type-arguments] "Type `int` is not assignable to upper bound `str` of type variable `U@TupleOfIntAndStr`"
def _(x: TupleOfIntAndStr[int, int]):
    reveal_type(x) # revealed: tuple[int, Unknown]
```

If the type variable is constrained, the specialized type must satisfy those constraints:

@@ -119,6 +206,15 @@ reveal_type(Constrained[int | str]) # revealed: <type alias 'Constrained[int |

# error: [invalid-type-arguments] "Type `object` does not satisfy constraints `int`, `str` of type variable `T@Constrained`"
reveal_type(Constrained[object]) # revealed: <type alias 'Constrained[Unknown]'>

type TupleOfIntOrStr[T: (int, str), U: (int, str)] = tuple[T, U]

def _(x: TupleOfIntOrStr[int, str]):
    reveal_type(x) # revealed: tuple[int, str]

# error: [invalid-type-arguments] "Type `object` does not satisfy constraints `int`, `str` of type variable `U@TupleOfIntOrStr`"
def _(x: TupleOfIntOrStr[int, object]):
    reveal_type(x) # revealed: tuple[int, Unknown]
```

If the type variable has a default, it can be omitted:

@@ -246,7 +246,7 @@ methods that are compatible with the return type, so the `return` expression is
```py
def same_constrained_types[T: (int, str)](t1: T, t2: T) -> T:
    # TODO: no error
    # error: [unsupported-operator] "Operator `+` is not supported between objects of type `T@same_constrained_types` and `T@same_constrained_types`"
    # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `T@same_constrained_types`"
    return t1 + t2
```

@@ -256,7 +256,7 @@ and an `int` and a `str` cannot be added together:

```py
def unions_are_different(t1: int | str, t2: int | str) -> int | str:
    # error: [unsupported-operator] "Operator `+` is not supported between objects of type `int | str` and `int | str`"
    # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `int | str`"
    return t1 + t2
```
@@ -237,7 +237,7 @@ def func[**P2](c: Callable[P2, None]):

P2 = ParamSpec("P2")

# TODO: error: paramspec is unbound
# error: [invalid-type-arguments] "ParamSpec `P2` is unbound"
reveal_type(OnlyParamSpec[P2]().attr) # revealed: (...) -> None

# error: [invalid-type-arguments] "No type argument provided for required type variable `P1` of class `OnlyParamSpec`"

@@ -281,14 +281,14 @@ reveal_type(TypeVarAndParamSpec[int, [int, str]]().attr) # revealed: (int, str,
reveal_type(TypeVarAndParamSpec[int, [str]]().attr) # revealed: (str, /) -> int
reveal_type(TypeVarAndParamSpec[int, ...]().attr) # revealed: (...) -> int

# TODO: error: paramspec is unbound
reveal_type(TypeVarAndParamSpec[int, P2]().attr) # revealed: (...) -> Unknown
# error: [invalid-type-arguments] "ParamSpec `P2` is unbound"
reveal_type(TypeVarAndParamSpec[int, P2]().attr) # revealed: (...) -> int
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, int]().attr) # revealed: (...) -> Unknown
reveal_type(TypeVarAndParamSpec[int, int]().attr) # revealed: (...) -> int
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, ()]().attr) # revealed: (...) -> Unknown
reveal_type(TypeVarAndParamSpec[int, ()]().attr) # revealed: (...) -> int
# error: [invalid-type-arguments] "Type argument for `ParamSpec` must be"
reveal_type(TypeVarAndParamSpec[int, (int, str)]().attr) # revealed: (...) -> Unknown
reveal_type(TypeVarAndParamSpec[int, (int, str)]().attr) # revealed: (...) -> int
```

Nor can they be omitted when there is more than one `ParamSpec`.
|
|||
|
|
@@ -98,6 +98,26 @@ def f[T: (int,)]():
    pass
```

### No explicit specialization

A type variable itself cannot be explicitly specialized; the result of the specialization is
`Unknown`. However, generic type aliases that point to type variables can be explicitly specialized.

```py
type Positive[T] = T

def _[T](
    # error: [invalid-type-form] "A type variable itself cannot be specialized"
    a: T[int],
    # error: [invalid-type-form] "A type variable itself cannot be specialized"
    b: T[T],
    c: Positive[int],
):
    reveal_type(a)  # revealed: Unknown
    reveal_type(b)  # revealed: Unknown
    reveal_type(c)  # revealed: int
```

## Invalid uses

Note that many of the invalid uses of legacy typevars do not apply to PEP 695 typevars, since the
@@ -102,7 +102,7 @@ class C[T]:
        return "a"

reveal_type(getattr_static(C[int], "f"))  # revealed: def f(self, x: int) -> str
-reveal_type(getattr_static(C[int], "f").__get__)  # revealed: <method-wrapper `__get__` of `f`>
+reveal_type(getattr_static(C[int], "f").__get__)  # revealed: <method-wrapper '__get__' of function 'f'>
reveal_type(getattr_static(C[int], "f").__get__(None, C[int]))  # revealed: def f(self, x: int) -> str
# revealed: bound method C[int].f(x: int) -> str
reveal_type(getattr_static(C[int], "f").__get__(C[int](), C[int]))
@@ -16,7 +16,7 @@ An unbounded typevar can specialize to any type. We will specialize the typevar
bound of all of the types that satisfy the constraint set.

```py
-from typing import Never
+from typing import Any, Never
from ty_extensions import ConstraintSet, generic_context

# fmt: off
@@ -26,6 +26,8 @@ def unbounded[T]():
    reveal_type(generic_context(unbounded).specialize_constrained(ConstraintSet.always()))
    # revealed: ty_extensions.Specialization[T@unbounded = object]
    reveal_type(generic_context(unbounded).specialize_constrained(ConstraintSet.range(Never, T, object)))
    # revealed: ty_extensions.Specialization[T@unbounded = Any]
    reveal_type(generic_context(unbounded).specialize_constrained(ConstraintSet.range(Never, T, Any)))
    # revealed: None
    reveal_type(generic_context(unbounded).specialize_constrained(ConstraintSet.never()))
@@ -68,6 +70,10 @@ class Unrelated: ...

def bounded[T: Base]():
    # revealed: ty_extensions.Specialization[T@bounded = Base]
    reveal_type(generic_context(bounded).specialize_constrained(ConstraintSet.always()))
    # revealed: ty_extensions.Specialization[T@bounded = Base]
    reveal_type(generic_context(bounded).specialize_constrained(ConstraintSet.range(Never, T, object)))
    # revealed: ty_extensions.Specialization[T@bounded = Base & Any]
    reveal_type(generic_context(bounded).specialize_constrained(ConstraintSet.range(Never, T, Any)))
    # revealed: None
    reveal_type(generic_context(bounded).specialize_constrained(ConstraintSet.never()))
@@ -94,11 +100,17 @@ def bounded_by_gradual[T: Any]():
    # TODO: revealed: ty_extensions.Specialization[T@bounded_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual = object]
    reveal_type(generic_context(bounded_by_gradual).specialize_constrained(ConstraintSet.always()))
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual = object]
    reveal_type(generic_context(bounded_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, object)))
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual = Any]
    reveal_type(generic_context(bounded_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, Any)))
    # revealed: None
    reveal_type(generic_context(bounded_by_gradual).specialize_constrained(ConstraintSet.never()))

    # revealed: ty_extensions.Specialization[T@bounded_by_gradual = Base]
    reveal_type(generic_context(bounded_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, Base)))
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual = object]
    reveal_type(generic_context(bounded_by_gradual).specialize_constrained(ConstraintSet.range(Base, T, object)))

    # revealed: ty_extensions.Specialization[T@bounded_by_gradual = Unrelated]
    reveal_type(generic_context(bounded_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, Unrelated)))
@@ -106,14 +118,24 @@ def bounded_by_gradual[T: Any]():

def bounded_by_gradual_list[T: list[Any]]():
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = Top[list[Any]]]
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.always()))
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = list[object]]
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[object])))
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = list[Any]]
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[Any])))
    # revealed: None
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.never()))

    # revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = list[Base]]
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[Base])))
    # TODO: revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = list[Base]]
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = Top[list[Any]]]
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.range(list[Base], T, object)))

    # revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = list[Unrelated]]
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[Unrelated])))
    # TODO: revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = list[Unrelated]]
    # revealed: ty_extensions.Specialization[T@bounded_by_gradual_list = Top[list[Any]]]
    reveal_type(generic_context(bounded_by_gradual_list).specialize_constrained(ConstraintSet.range(list[Unrelated], T, object)))
```

## Constrained typevar
@@ -142,12 +164,21 @@ def constrained[T: (Base, Unrelated)]():
    # revealed: None
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.always()))
    # revealed: None
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.range(Never, T, object)))
    # revealed: None
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.range(Never, T, Any)))
    # revealed: None
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.never()))

    # revealed: ty_extensions.Specialization[T@constrained = Base]
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.range(Never, T, Base)))
    # revealed: ty_extensions.Specialization[T@constrained = Base]
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.range(Base, T, object)))

    # revealed: ty_extensions.Specialization[T@constrained = Unrelated]
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.range(Never, T, Unrelated)))
    # revealed: ty_extensions.Specialization[T@constrained = Unrelated]
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.range(Unrelated, T, object)))

    # revealed: ty_extensions.Specialization[T@constrained = Base]
    reveal_type(generic_context(constrained).specialize_constrained(ConstraintSet.range(Never, T, Super)))
@@ -178,15 +209,25 @@ def constrained_by_gradual[T: (Base, Any)]():
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual = Base]
    reveal_type(generic_context(constrained_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, object)))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual = Base & Any]
    reveal_type(generic_context(constrained_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, Any)))
    # revealed: None
    reveal_type(generic_context(constrained_by_gradual).specialize_constrained(ConstraintSet.never()))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual = Base]
    reveal_type(generic_context(constrained_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, Base)))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual = Base]
    reveal_type(generic_context(constrained_by_gradual).specialize_constrained(ConstraintSet.range(Base, T, object)))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual = Unrelated]
    reveal_type(generic_context(constrained_by_gradual).specialize_constrained(ConstraintSet.range(Never, T, Unrelated)))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual = object]
    reveal_type(generic_context(constrained_by_gradual).specialize_constrained(ConstraintSet.range(Unrelated, T, object)))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual = Base]
@@ -206,6 +247,11 @@ def constrained_by_two_gradual[T: (Any, Any)]():
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual = object]
    reveal_type(generic_context(constrained_by_two_gradual).specialize_constrained(ConstraintSet.always()))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual = Any]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual = object]
    reveal_type(generic_context(constrained_by_two_gradual).specialize_constrained(ConstraintSet.range(Never, T, object)))
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual = Any]
    reveal_type(generic_context(constrained_by_two_gradual).specialize_constrained(ConstraintSet.range(Never, T, Any)))
    # revealed: None
    reveal_type(generic_context(constrained_by_two_gradual).specialize_constrained(ConstraintSet.never()))
@@ -233,14 +279,24 @@ def constrained_by_two_gradual[T: (Any, Any)]():

def constrained_by_gradual_list[T: (list[Base], list[Any])]():
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Base]]
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.always()))
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[object]]
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[object])))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Base] & list[Any]]
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[Any])))
    # revealed: None
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.never()))

    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Base]]
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[Base])))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Base]]
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.range(list[Base], T, object)))

    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Unrelated]]
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.range(Never, T, list[Unrelated])))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Unrelated]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = Top[list[Any]]]
    reveal_type(generic_context(constrained_by_gradual_list).specialize_constrained(ConstraintSet.range(list[Unrelated], T, object)))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list = list[Super]]
@@ -257,14 +313,25 @@ def constrained_by_gradual_list[T: (list[Base], list[Any])]():

def constrained_by_gradual_list_reverse[T: (list[Any], list[Base])]():
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Base]]
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.always()))
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[object]]
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.range(Never, T, list[object])))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Base] & list[Any]]
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.range(Never, T, list[Any])))
    # revealed: None
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.never()))

    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Base]]
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.range(Never, T, list[Base])))
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Base]]
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.range(list[Base], T, object)))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Unrelated]]
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.range(Never, T, list[Unrelated])))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Unrelated]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = Top[list[Any]]]
    reveal_type(generic_context(constrained_by_gradual_list_reverse).specialize_constrained(ConstraintSet.range(list[Unrelated], T, object)))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_gradual_list_reverse = list[Super]]
@@ -280,15 +347,26 @@ def constrained_by_two_gradual_lists[T: (list[Any], list[Any])]():
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = Top[list[Any]]]
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.always()))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = Top[list[Any]]]
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.range(Never, T, object)))
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Any]]
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.range(Never, T, list[Any])))
    # revealed: None
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.never()))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Base]]
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.range(Never, T, list[Base])))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Base]]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = Top[list[Any]]]
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.range(list[Base], T, object)))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Unrelated]]
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.range(Never, T, list[Unrelated])))
    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Unrelated]]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = Top[list[Any]]]
    reveal_type(generic_context(constrained_by_two_gradual_lists).specialize_constrained(ConstraintSet.range(list[Unrelated], T, object)))

    # TODO: revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Any]]
    # revealed: ty_extensions.Specialization[T@constrained_by_two_gradual_lists = list[Super]]
@@ -214,7 +214,7 @@ def _(int_or_int: IntOrInt, list_of_int_or_list_of_int: ListOfIntOrListOfInt):
`NoneType` has no special or-operator behavior, so this is an error:

```py
-None | None  # error: [unsupported-operator] "Operator `|` is not supported between objects of type `None` and `None`"
+None | None  # error: [unsupported-operator] "Operator `|` is not supported between two objects of type `None`"
```

When constructing something nonsensical like `int | 1`, we emit a diagnostic for the expression
@@ -414,6 +414,7 @@ def _(
    list_or_tuple_legacy: ListOrTupleLegacy[int],
    my_callable: MyCallable[[str, bytes], int],
    annotated_int: AnnotatedType[int],
    # error: [invalid-type-form] "A type variable itself cannot be specialized"
    transparent_alias: TransparentAlias[int],
    optional_int: MyOptional[int],
):
@@ -427,7 +428,7 @@ def _(
    reveal_type(list_or_tuple_legacy)  # revealed: list[int] | tuple[int, ...]
    reveal_type(my_callable)  # revealed: (str, bytes, /) -> int
    reveal_type(annotated_int)  # revealed: int
-    reveal_type(transparent_alias)  # revealed: int
+    reveal_type(transparent_alias)  # revealed: Unknown
    reveal_type(optional_int)  # revealed: int | None
```
@@ -653,13 +654,92 @@ def g(obj: Y[bool, range]):

A generic alias that is already fully specialized cannot be specialized again:

```toml
[environment]
python-version = "3.12"
```

```py
from typing import Protocol, TypeVar, TypedDict

ListOfInts = list[int]

-# error: [invalid-type-arguments] "Too many type arguments: expected 0, got 1"
+# error: [non-subscriptable] "Cannot subscript non-generic type: `<class 'list[int]'>` is already specialized"
def _(doubly_specialized: ListOfInts[int]):
-    # TODO: This should ideally be `list[Unknown]` or `Unknown`
-    reveal_type(doubly_specialized)  # revealed: list[int]
+    reveal_type(doubly_specialized)  # revealed: Unknown

type ListOfInts2 = list[int]
# error: [non-subscriptable] "Cannot subscript non-generic type alias: `list[int]` is already specialized"
DoublySpecialized = ListOfInts2[int]

def _(doubly_specialized: DoublySpecialized):
    reveal_type(doubly_specialized)  # revealed: Unknown

# error: [non-subscriptable] "Cannot subscript non-generic type: `<class 'list[int]'>` is already specialized"
List = list[int][int]

def _(doubly_specialized: List):
    reveal_type(doubly_specialized)  # revealed: Unknown

Tuple = tuple[int, str]

# error: [non-subscriptable] "Cannot subscript non-generic type: `<class 'tuple[int, str]'>` is already specialized"
def _(doubly_specialized: Tuple[int]):
    reveal_type(doubly_specialized)  # revealed: Unknown

T = TypeVar("T")

class LegacyProto(Protocol[T]):
    pass

LegacyProtoInt = LegacyProto[int]

# error: [non-subscriptable] "Cannot subscript non-generic type: `<class 'LegacyProto[int]'>` is already specialized"
def _(doubly_specialized: LegacyProtoInt[int]):
    reveal_type(doubly_specialized)  # revealed: Unknown

class Proto[T](Protocol):
    pass

ProtoInt = Proto[int]

# error: [non-subscriptable] "Cannot subscript non-generic type: `<class 'Proto[int]'>` is already specialized"
def _(doubly_specialized: ProtoInt[int]):
    reveal_type(doubly_specialized)  # revealed: Unknown

# TODO: TypedDict is just a function object at runtime, we should emit an error
class LegacyDict(TypedDict[T]):
    x: T

# TODO: should be a `non-subscriptable` error
LegacyDictInt = LegacyDict[int]

# TODO: should be a `non-subscriptable` error
def _(doubly_specialized: LegacyDictInt[int]):
    # TODO: should be `Unknown`
    reveal_type(doubly_specialized)  # revealed: @Todo(Inference of subscript on special form)

class Dict[T](TypedDict):
    x: T

DictInt = Dict[int]

# error: [non-subscriptable] "Cannot subscript non-generic type: `<class 'Dict[int]'>` is already specialized"
def _(doubly_specialized: DictInt[int]):
    reveal_type(doubly_specialized)  # revealed: Unknown

Union = list[str] | list[int]

# error: [non-subscriptable] "Cannot subscript non-generic type: `<types.UnionType special form 'list[str] | list[int]'>` is already specialized"
def _(doubly_specialized: Union[int]):
    reveal_type(doubly_specialized)  # revealed: Unknown

type MyListAlias[T] = list[T]
MyListOfInts = MyListAlias[int]

# error: [non-subscriptable] "Cannot subscript non-generic type alias: Double specialization is not allowed"
def _(doubly_specialized: MyListOfInts[int]):
    reveal_type(doubly_specialized)  # revealed: Unknown
```

Specializing a generic implicit type alias with an incorrect number of type arguments also results
@@ -695,23 +775,21 @@ def this_does_not_work() -> TypeOf[IntOrStr]:
    raise NotImplementedError()

def _(
-    # TODO: Better error message (of kind `invalid-type-form`)?
-    # error: [invalid-type-arguments] "Too many type arguments: expected 0, got 1"
+    # error: [non-subscriptable] "Cannot subscript non-generic type"
    specialized: this_does_not_work()[int],
):
-    reveal_type(specialized)  # revealed: int | str
+    reveal_type(specialized)  # revealed: Unknown
```

Similarly, if you try to specialize a union type without a binding context, we emit an error:

```py
-# TODO: Better error message (of kind `invalid-type-form`)?
-# error: [invalid-type-arguments] "Too many type arguments: expected 0, got 1"
+# error: [non-subscriptable] "Cannot subscript non-generic type"
x: (list[T] | set[T])[int]

def _():
-    # TODO: `list[Unknown] | set[Unknown]` might be better
-    reveal_type(x)  # revealed: list[typing.TypeVar] | set[typing.TypeVar]
+    reveal_type(x)  # revealed: Unknown
```

### Multiple definitions
@@ -8,12 +8,87 @@ two projects in a monorepo have conflicting definitions (but we want to analyze

In practice these tests cover what we call "desperate module resolution" which, when an import
fails, results in us walking up the ancestor directories of the importing file and trying those as
-"desperate search-paths".
+"desperate search-paths" until one works.

Currently desperate search-paths are restricted to subdirectories of the first-party search-path
-(the directory you're running `ty` in). Currently we only consider one desperate search-path: the
-closest ancestor directory containing a `pyproject.toml`. In the future we may want to try every
-ancestor `pyproject.toml` or every ancestor directory.
+(typically, the directory you're running `ty` in).

There are two styles of desperate search-path we consider: "absolute" and "relative". Absolute
desperate search-paths are used for resolving absolute imports (`import a.b.c`) while relative
desperate search-paths are used for resolving relative imports (`from .c import x`).

Only the closest directory that contains either a `pyproject.toml` or `ty.toml` is a valid relative
desperate search-path.

All ancestor directories that *do not* contain an `__init__.py(i)` are valid absolute desperate
search-paths.

(Distracting detail: to ensure relative desperate search-paths are always valid absolute desperate
search-paths, a directory that contains an `__init__.py(i)` *and* either a `pyproject.toml` or
`ty.toml` is also a valid absolute search-path, but this shouldn't matter in practice, as you do not
typically have those two kinds of file in the same directory.)
## Relative Desperate Search-Paths

We do not directly resolve relative imports. Instead we have a two-phase process:

1. Convert the relative module name `.c` to an absolute one `a.b.c`
1. Resolve the absolute import `a.b.c`

(This allows us to transparently handle packaging semantics that mandate that separate directories
be "logically combined" into a single directory, like namespace packages and stub packages.)
Relative desperate search-paths only appear in step 1, where we compute the module name of the
importing file as the first step in resolving `.` to an absolute module name.

In practice, relative desperate search-paths are rarely needed because it usually doesn't matter if
we think `.` is `a.b` or `b` when resolving `.c`: the fact that we computed `a.b` using our
search-paths means `a.b.c` is what will resolve with those search-paths!

There are three caveats to this:

- If the module name we compute is *too short* then too many relative levels will fail to resolve
  (`..c` resolves in `a.b` but not `b`).
- If the module name is *too long* then we may encounter directories that aren't valid module names,
  and reject the import (`my-proj.a.b.c` is not a valid module name).
- Sloppiness will break relative imports in any kind of packaging situation where different
  directories are supposed to be "logically combined".
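The "too long" failure mode boils down to a path component that isn't a valid Python identifier. Roughly (a sketch that ignores keyword edge cases):

```python
def is_valid_module_name(name: str) -> bool:
    # Every dotted component must be a valid Python identifier.
    # (str.isidentifier also accepts keywords; real resolution is stricter.)
    return all(part.isidentifier() for part in name.split("."))
```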
|
||||
|
||||
The fact that we restrict desperate resolution to the first-party search-path ("the project you're
|
||||
working on") allows us to largely dismiss the last concern for the purposes of this discussion. The
|
||||
remaining two concerns encourage us to find "the longest possible module name without stumbling into
|
||||
random nonsense directories". When we need relative desperate search-paths we are usually running
|
||||
into the "too long" problem and "snap to the parent `pyproject.toml` (or `ty.toml`)" tends to
|
||||
resolve it well!
|
||||
|
||||
As a more aesthetic concern, this approach also ensures that all the files under a given
|
||||
`pyproject.toml` will, when faced with desperation, agree on eachother's relative module names. This
|
||||
may or may not be important, but it's definitely *reassuring* and *satisfying*!

## Absolute Desperate Search-Paths

Absolute desperate search-paths are much more load-bearing, because if we're handed the absolute
import `a.b.c` then there is only one possible search-path that will properly resolve this the way
the user wants, and if that search-path isn't configured we will fail.

Basic heuristics like checking for `<working-dir>/src/` and resolving editables in the local `.venv`
work well in most cases, but desperate resolution is needed in a couple key scenarios:

- Test or script directories have a tendency to assume extra search-paths that aren't structurally
  obvious ([notably pytest](https://docs.pytest.org/en/stable/explanation/pythonpath.html))
- If you open the root of a monorepo in an IDE, you will often have many separate projects but no
  configuration explaining this. Absolute imports within each project should resolve things in
  that project.

The latter case is often handled reasonably well by the `pyproject.toml` rule that relative
desperate search-paths have. However the more complex testing/scripting scenarios tend to fall over
here -- in the limit pytest will add literally every ancestor to the search-path, and so we simply
need to try every single one and hope *one* works for every absolute import (and it might be a
different one for different imports).

We exclude directories that contain an `__init__.py(i)` because there shouldn't be any reasonable
scenario where we need to "truncate" a regular package like that (and pytest's exciting behaviour
here is explicitly disabled by `__init__.py`).
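In code form, the heuristic is roughly the following ancestor walk. This is a sketch with hypothetical names, and it ignores the `pyproject.toml` carve-out that the real resolver also applies:

```python
from pathlib import Path


def absolute_desperate_search_paths(importing_file: Path, first_party_root: Path) -> list[Path]:
    """Every ancestor of the importing file (closest first, stopping at the
    first-party root) that is not a regular package, i.e. has no __init__.py(i).
    These are the directories pytest might have put on sys.path at runtime."""
    candidates = []
    for ancestor in importing_file.parents:
        if not ancestor.is_relative_to(first_party_root):
            break
        is_regular_package = (ancestor / "__init__.py").is_file() or (
            ancestor / "__init__.pyi"
        ).is_file()
        if not is_regular_package:
            candidates.append(ancestor)
    return candidates
```

Every one of these candidates gets tried in turn when resolving an absolute import desperately; a directory containing `__init__.py` is skipped, but the walk continues past it.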

## Invalid Names

@@ -134,13 +209,11 @@ from .mod1 import x

# error: [unresolved-import]
from . import mod2

# error: [unresolved-import]
import mod3

reveal_type(x) # revealed: Unknown
reveal_type(mod2.y) # revealed: Unknown
reveal_type(mod3.z) # revealed: Unknown
reveal_type(mod3.z) # revealed: int
```

`my-proj/tests/mod1.py`:
@@ -338,21 +411,6 @@ create, and we are now very sensitive to precise search-path ordering.**

Here the use of editables means that `a/` has higher priority than `a/src/a/`.

Somehow this results in `a/tests/test1.py` being able to resolve `.setup` but not `.`.

My best guess is that in this state we can resolve regular modules in `a/tests/` but not namespace
packages, because we have some extra validation for namespace packages that conflict with regular
packages, but that validation isn't applied when we successfully resolve a submodule of the
namespace package.

In this case, we find that `a/tests/test1.py` matches on the first-party path as `a.tests.test1`
and is syntactically valid. We then resolve `a.tests.test1`, and because the namespace package
(`/a/`) comes first we succeed. We then syntactically compute `.` to be `a.tests`.

When we go to look up `a.tests.setup`, whatever grace allowed `a.tests.test1` to resolve still
works, so it resolves too. However when we try to resolve `a.tests` on its own, some additional
validation rejects the namespace package conflicting with the regular package.

```toml
[environment]
# Set up a venv with editables for a/src/ and b/src/
@@ -385,17 +443,13 @@ b/src/
`a/tests/test1.py`:

```py
# TODO: there should be no errors in this file.

from .setup import x

# error: [unresolved-import]
from . import setup
from a import y
import a

reveal_type(x) # revealed: int
reveal_type(setup.x) # revealed: Unknown
reveal_type(setup.x) # revealed: int
reveal_type(y) # revealed: int
reveal_type(a.y) # revealed: int
```
@@ -422,17 +476,13 @@ y: int = 10
`b/tests/test1.py`:

```py
# TODO: there should be no errors in this file

from .setup import x

# error: [unresolved-import]
from . import setup
from b import y
import b

reveal_type(x) # revealed: str
reveal_type(setup.x) # revealed: Unknown
reveal_type(setup.x) # revealed: str
reveal_type(y) # revealed: str
reveal_type(b.y) # revealed: str
```
@@ -271,8 +271,8 @@ method, which means that it is a *data* descriptor (if there is no setter, `__se
available but yields an `AttributeError` at runtime).

```py
reveal_type(type(attr_property).__get__) # revealed: <wrapper-descriptor `__get__` of `property` objects>
reveal_type(type(attr_property).__set__) # revealed: <wrapper-descriptor `__set__` of `property` objects>
reveal_type(type(attr_property).__get__) # revealed: <wrapper-descriptor '__get__' of 'property' objects>
reveal_type(type(attr_property).__set__) # revealed: <wrapper-descriptor '__set__' of 'property' objects>
```
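These wrapper descriptors are what route instance attribute access through `property`. As a rough illustration in plain Python (not ty itself): accessing a data descriptor on an instance is equivalent to calling its `__get__` explicitly with the instance and owner class.

```python
class C:
    @property
    def attr(self) -> int:
        return 1


c = C()

# The property object lives in the class dict; `c.attr` is resolved by
# invoking `property.__get__` on it with the instance and owner class.
attr_property = C.__dict__["attr"]
assert c.attr == attr_property.__get__(c, C) == 1
```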

When we access `c.attr`, the `__get__` method of the `property` class is called, passing the

@@ -19,12 +19,15 @@ mdtest path: crates/ty_python_semantic/resources/mdtest/assignment/annotations.md
# Diagnostics

```
error[unsupported-operator]: Operator `|` is not supported between objects of type `<class 'int'>` and `<class 'str'>`
error[unsupported-operator]: Unsupported `|` operation
--> src/mdtest_snippet.py:2:12
|
1 | # error: [unsupported-operator]
2 | IntOrStr = int | str
| ^^^^^^^^^
| ---^^^---
| | |
| | Has type `<class 'str'>`
| Has type `<class 'int'>`
|
info: Note that `X | Y` PEP 604 union syntax is only available in Python 3.10 and later
info: Python 3.9 was assumed when resolving types because it was specified on the command line
@@ -0,0 +1,44 @@
---
source: crates/ty_test/src/lib.rs
expression: snapshot
---
---
mdtest name: augmented.md - Augmented assignment - Unsupported types
mdtest path: crates/ty_python_semantic/resources/mdtest/assignment/augmented.md
---

# Python source files

## mdtest_snippet.py

```
1 | class C:
2 |     def __isub__(self, other: str) -> int:
3 |         return 42
4 |
5 | x = C()
6 | # error: [unsupported-operator] "Operator `-=` is not supported between objects of type `C` and `Literal[1]`"
7 | x -= 1
8 |
9 | reveal_type(x) # revealed: int
```

# Diagnostics

```
error[unsupported-operator]: Unsupported `-=` operation
--> src/mdtest_snippet.py:7:1
|
5 | x = C()
6 | # error: [unsupported-operator] "Operator `-=` is not supported between objects of type `C` and `Literal[1]`"
7 | x -= 1
| -^^^^-
| | |
| | Has type `Literal[1]`
| Has type `C`
8 |
9 | reveal_type(x) # revealed: int
|
info: rule `unsupported-operator` is enabled by default

```
@@ -0,0 +1,80 @@
---
source: crates/ty_test/src/lib.rs
expression: snapshot
---
---
mdtest name: custom.md - Custom binary operations - Classes
mdtest path: crates/ty_python_semantic/resources/mdtest/binary/custom.md
---

# Python source files

## mdtest_snippet.py

```
1 | from typing import Literal
2 |
3 | class Yes:
4 |     def __add__(self, other) -> Literal["+"]:
5 |         return "+"
6 |
7 | class Sub(Yes): ...
8 | class No: ...
9 |
10 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'Yes'>`"
11 | reveal_type(Yes + Yes) # revealed: Unknown
12 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'Sub'>`"
13 | reveal_type(Sub + Sub) # revealed: Unknown
14 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'No'>`"
15 | reveal_type(No + No) # revealed: Unknown
```

# Diagnostics

```
error[unsupported-operator]: Unsupported `+` operation
--> src/mdtest_snippet.py:11:13
|
10 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'Yes'>`"
11 | reveal_type(Yes + Yes) # revealed: Unknown
| ---^^^---
| |
| Both operands have type `<class 'Yes'>`
12 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'Sub'>`"
13 | reveal_type(Sub + Sub) # revealed: Unknown
|
info: rule `unsupported-operator` is enabled by default

```

```
error[unsupported-operator]: Unsupported `+` operation
--> src/mdtest_snippet.py:13:13
|
11 | reveal_type(Yes + Yes) # revealed: Unknown
12 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'Sub'>`"
13 | reveal_type(Sub + Sub) # revealed: Unknown
| ---^^^---
| |
| Both operands have type `<class 'Sub'>`
14 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'No'>`"
15 | reveal_type(No + No) # revealed: Unknown
|
info: rule `unsupported-operator` is enabled by default

```

```
error[unsupported-operator]: Unsupported `+` operation
--> src/mdtest_snippet.py:15:13
|
13 | reveal_type(Sub + Sub) # revealed: Unknown
14 | # error: [unsupported-operator] "Operator `+` is not supported between two objects of type `<class 'No'>`"
15 | reveal_type(No + No) # revealed: Unknown
| --^^^--
| |
| Both operands have type `<class 'No'>`
|
info: rule `unsupported-operator` is enabled by default

```
@@ -0,0 +1,44 @@
---
source: crates/ty_test/src/lib.rs
expression: snapshot
---
---
mdtest name: custom.md - Custom binary operations - Classes from different modules with the same name
mdtest path: crates/ty_python_semantic/resources/mdtest/binary/custom.md
---

# Python source files

## mod1.py

```
1 | class A: ...
```

## mod2.py

```
1 | import mod1
2 |
3 | class A: ...
4 |
5 | # error: [unsupported-operator] "Operator `+` is not supported between objects of type `mod2.A` and `mod1.A`"
6 | A() + mod1.A()
```

# Diagnostics

```
error[unsupported-operator]: Unsupported `+` operation
--> src/mod2.py:6:1
|
5 | # error: [unsupported-operator] "Operator `+` is not supported between objects of type `mod2.A` and `mod1.A`"
6 | A() + mod1.A()
| ---^^^--------
| | |
| | Has type `mod1.A`
| Has type `mod2.A`
|
info: rule `unsupported-operator` is enabled by default

```
@@ -24,11 +24,11 @@ reveal_type(+Sub()) # revealed: bool
reveal_type(-Sub()) # revealed: str
reveal_type(~Sub()) # revealed: int

# error: [unsupported-operator] "Unary operator `+` is not supported for type `No`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `No`"
reveal_type(+No()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `No`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `No`"
reveal_type(-No()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `No`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `No`"
reveal_type(~No()) # revealed: Unknown
```

@@ -52,25 +52,25 @@ class Yes:
class Sub(Yes): ...
class No: ...

# error: [unsupported-operator] "Unary operator `+` is not supported for type `<class 'Yes'>`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `<class 'Yes'>`"
reveal_type(+Yes) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `<class 'Yes'>`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `<class 'Yes'>`"
reveal_type(-Yes) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `<class 'Yes'>`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `<class 'Yes'>`"
reveal_type(~Yes) # revealed: Unknown

# error: [unsupported-operator] "Unary operator `+` is not supported for type `<class 'Sub'>`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `<class 'Sub'>`"
reveal_type(+Sub) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `<class 'Sub'>`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `<class 'Sub'>`"
reveal_type(-Sub) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `<class 'Sub'>`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `<class 'Sub'>`"
reveal_type(~Sub) # revealed: Unknown

# error: [unsupported-operator] "Unary operator `+` is not supported for type `<class 'No'>`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `<class 'No'>`"
reveal_type(+No) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `<class 'No'>`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `<class 'No'>`"
reveal_type(-No) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `<class 'No'>`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `<class 'No'>`"
reveal_type(~No) # revealed: Unknown
```

@@ -80,11 +80,11 @@ reveal_type(~No) # revealed: Unknown
def f():
    pass

# error: [unsupported-operator] "Unary operator `+` is not supported for type `def f() -> Unknown`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `def f() -> Unknown`"
reveal_type(+f) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `def f() -> Unknown`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `def f() -> Unknown`"
reveal_type(-f) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `def f() -> Unknown`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `def f() -> Unknown`"
reveal_type(~f) # revealed: Unknown
```

@@ -113,25 +113,25 @@ def sub() -> type[Sub]:
def no() -> type[No]:
    return No

# error: [unsupported-operator] "Unary operator `+` is not supported for type `type[Yes]`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `type[Yes]`"
reveal_type(+yes()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `type[Yes]`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `type[Yes]`"
reveal_type(-yes()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `type[Yes]`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `type[Yes]`"
reveal_type(~yes()) # revealed: Unknown

# error: [unsupported-operator] "Unary operator `+` is not supported for type `type[Sub]`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `type[Sub]`"
reveal_type(+sub()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `type[Sub]`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `type[Sub]`"
reveal_type(-sub()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `type[Sub]`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `type[Sub]`"
reveal_type(~sub()) # revealed: Unknown

# error: [unsupported-operator] "Unary operator `+` is not supported for type `type[No]`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `type[No]`"
reveal_type(+no()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `type[No]`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `type[No]`"
reveal_type(-no()) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `type[No]`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `type[No]`"
reveal_type(~no()) # revealed: Unknown
```

@@ -160,10 +160,10 @@ reveal_type(+Sub) # revealed: bool
reveal_type(-Sub) # revealed: str
reveal_type(~Sub) # revealed: int

# error: [unsupported-operator] "Unary operator `+` is not supported for type `<class 'No'>`"
# error: [unsupported-operator] "Unary operator `+` is not supported for object of type `<class 'No'>`"
reveal_type(+No) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `-` is not supported for type `<class 'No'>`"
# error: [unsupported-operator] "Unary operator `-` is not supported for object of type `<class 'No'>`"
reveal_type(-No) # revealed: Unknown
# error: [unsupported-operator] "Unary operator `~` is not supported for type `<class 'No'>`"
# error: [unsupported-operator] "Unary operator `~` is not supported for object of type `<class 'No'>`"
reveal_type(~No) # revealed: Unknown
```

@@ -27,7 +27,7 @@ reveal_type(~a) # revealed: Literal[True]
class NoDunder: ...

b = NoDunder()
+b # error: [unsupported-operator] "Unary operator `+` is not supported for type `NoDunder`"
-b # error: [unsupported-operator] "Unary operator `-` is not supported for type `NoDunder`"
~b # error: [unsupported-operator] "Unary operator `~` is not supported for type `NoDunder`"
+b # error: [unsupported-operator] "Unary operator `+` is not supported for object of type `NoDunder`"
-b # error: [unsupported-operator] "Unary operator `-` is not supported for object of type `NoDunder`"
~b # error: [unsupported-operator] "Unary operator `~` is not supported for object of type `NoDunder`"
```
@@ -336,7 +336,14 @@ pub(crate) fn file_to_module(db: &dyn Db, file: File) -> Option<Module<'_>> {
        path,
        search_paths(db, ModuleResolveMode::StubsAllowed),
    )
    .or_else(|| file_to_module_impl(db, file, path, desperate_search_paths(db, file).iter()))
    .or_else(|| {
        file_to_module_impl(
            db,
            file,
            path,
            relative_desperate_search_paths(db, file).iter(),
        )
    })
}

fn file_to_module_impl<'db, 'a>(

@@ -388,11 +395,81 @@ pub(crate) fn search_paths(db: &dyn Db, resolve_mode: ModuleResolveMode) -> Sear
    Program::get(db).search_paths(db).iter(db, resolve_mode)
}

/// Get the search-paths that should be used for desperate resolution of imports in this file
/// Get the search-paths for desperate resolution of absolute imports in this file.
///
/// Currently this is "the closest ancestor dir that contains a pyproject.toml", which is
/// a completely arbitrary decision. We could potentially change this to return an iterator
/// of every ancestor with a pyproject.toml or every ancestor.
/// Currently this is "all ancestor directories that don't contain an `__init__.py(i)`"
/// (from closest-to-importing-file to farthest).
///
/// (For paranoia purposes, all relative desperate search-paths are also valid
/// absolute desperate search-paths, but don't worry about that.)
///
/// We exclude `__init__.py(i)` dirs to avoid truncating packages.
#[salsa::tracked(heap_size=ruff_memory_usage::heap_size)]
fn absolute_desperate_search_paths(db: &dyn Db, importing_file: File) -> Option<Vec<SearchPath>> {
    let system = db.system();
    let importing_path = importing_file.path(db).as_system_path()?;

    // Only allow this if the importing_file is under the first-party search path
    let (base_path, rel_path) =
        search_paths(db, ModuleResolveMode::StubsAllowed).find_map(|search_path| {
            if !search_path.is_first_party() {
                return None;
            }
            Some((
                search_path.as_system_path()?,
                search_path.relativize_system_path_only(importing_path)?,
            ))
        })?;

    // Read the revision on the corresponding file root to
    // register an explicit dependency on this directory. When
    // the revision gets bumped, the cache that Salsa creates
    // for this routine will be invalidated.
    //
    // (This is conditional because ruff uses this code too and doesn't set roots)
    if let Some(root) = db.files().root(db, base_path) {
        let _ = root.revision(db);
    }

    // Only allow searching up to the first-party path's root
    let mut search_paths = Vec::new();
    for rel_dir in rel_path.ancestors() {
        let candidate_path = base_path.join(rel_dir);
        if !system.is_directory(&candidate_path) {
            continue;
        }
        // Any dir that isn't a proper package is plausibly some test/script dir that could be
        // added as a search-path at runtime. Notably this reflects pytest's default mode where
        // it adds every dir with a .py to the search-paths (making all test files root modules),
        // unless they see an `__init__.py`, in which case they assume you don't want that.
        let isnt_regular_package = !system.is_file(&candidate_path.join("__init__.py"))
            && !system.is_file(&candidate_path.join("__init__.pyi"));
        // Any dir with a pyproject.toml or ty.toml is a valid relative desperate search-path and
        // we want all of those to also be valid absolute desperate search-paths. It doesn't
        // make any sense for a folder to have `pyproject.toml` and `__init__.py`, but let's
        // not let something cursed and spooky happen, ok?
        if isnt_regular_package
            || system.is_file(&candidate_path.join("pyproject.toml"))
            || system.is_file(&candidate_path.join("ty.toml"))
        {
            let search_path = SearchPath::first_party(system, candidate_path).ok()?;
            search_paths.push(search_path);
        }
    }

    if search_paths.is_empty() {
        None
    } else {
        Some(search_paths)
    }
}

/// Get the search-paths for desperate resolution of relative imports in this file.
///
/// Currently this is "the closest ancestor dir that contains a pyproject.toml (or ty.toml)",
/// which is a completely arbitrary decision. However it's fairly important that relative
/// desperate search-paths pick a single "best" answer because every one is *valid* but one
/// that's too long or too short may cause problems.
///
/// For now this works well in common cases where we have some larger workspace that contains
/// one or more python projects in sub-directories, and those python projects assume that
@@ -402,7 +479,7 @@ pub(crate) fn search_paths(db: &dyn Db, resolve_mode: ModuleResolveMode) -> Sear
/// chaotic things. In particular, all files under a given pyproject.toml will currently
/// agree on this being their desperate search-path, which is really nice.
#[salsa::tracked(heap_size=ruff_memory_usage::heap_size)]
fn desperate_search_paths(db: &dyn Db, importing_file: File) -> Option<SearchPath> {
fn relative_desperate_search_paths(db: &dyn Db, importing_file: File) -> Option<SearchPath> {
    let system = db.system();
    let importing_path = importing_file.path(db).as_system_path()?;

@@ -431,13 +508,15 @@ fn desperate_search_paths(db: &dyn Db, importing_file: File) -> Option<SearchPat
    // Only allow searching up to the first-party path's root
    for rel_dir in rel_path.ancestors() {
        let candidate_path = base_path.join(rel_dir);
        if system.path_exists(&candidate_path.join("pyproject.toml"))
            || system.path_exists(&candidate_path.join("ty.toml"))
        // Any dir with a pyproject.toml or ty.toml might be a project root
        if system.is_file(&candidate_path.join("pyproject.toml"))
            || system.is_file(&candidate_path.join("ty.toml"))
        {
            let search_path = SearchPath::first_party(system, candidate_path).ok()?;
            return Some(search_path);
        }
    }

    None
}
#[derive(Clone, Debug, PartialEq, Eq, get_size2::GetSize)]

@@ -960,8 +1039,8 @@ fn desperately_resolve_name(
    name: &ModuleName,
    mode: ModuleResolveMode,
) -> Option<ResolvedName> {
    let search_paths = desperate_search_paths(db, importing_file);
    resolve_name_impl(db, name, mode, search_paths.iter())
    let search_paths = absolute_desperate_search_paths(db, importing_file);
    resolve_name_impl(db, name, mode, search_paths.iter().flatten())
}

fn resolve_name_impl<'a>(
@@ -935,7 +935,30 @@ impl<'db> Type<'db> {
            // type for each overload of each function definition.
            (Type::FunctionLiteral(_), Type::FunctionLiteral(_)) => self,

            _ => UnionType::from_elements_cycle_recovery(db, [self, previous]),
            _ => {
                // Also avoid unioning in a previous type which contains a Divergent from the
                // current cycle, if the most-recent type does not. This cannot cause an
                // oscillation, since Divergent is only introduced at the start of fixpoint
                // iteration.
                let has_divergent_type_in_cycle = |ty| {
                    any_over_type(
                        db,
                        ty,
                        &|nested_ty| {
                            matches!(
                                nested_ty,
                                Type::Dynamic(DynamicType::Divergent(DivergentType { id }))
                                    if cycle.head_ids().contains(&id))
                        },
                        false,
                    )
                };
                if has_divergent_type_in_cycle(previous) && !has_divergent_type_in_cycle(self) {
                    self
                } else {
                    UnionType::from_elements_cycle_recovery(db, [self, previous])
                }
            }
        }
        .recursive_type_normalized(db, cycle)
    }
@@ -1002,6 +1025,41 @@ impl<'db> Type<'db> {
        matches!(self, Type::GenericAlias(_))
    }

    /// Returns whether the definition of this type is generic (this is different from whether
    /// this type *is* a generic type; a type that is already fully specialized is not a
    /// generic type).
    pub(crate) fn is_definition_generic(self, db: &'db dyn Db) -> bool {
        match self {
            Type::Union(union) => union
                .elements(db)
                .iter()
                .any(|ty| ty.is_definition_generic(db)),
            Type::Intersection(intersection) => {
                intersection
                    .positive(db)
                    .iter()
                    .any(|ty| ty.is_definition_generic(db))
                    || intersection
                        .negative(db)
                        .iter()
                        .any(|ty| ty.is_definition_generic(db))
            }
            Type::NominalInstance(instance_type) => instance_type.is_definition_generic(),
            Type::ProtocolInstance(protocol) => {
                matches!(protocol.inner, Protocol::FromClass(class) if class.is_generic())
            }
            Type::TypedDict(typed_dict) => typed_dict
                .defining_class()
                .is_some_and(ClassType::is_generic),
            Type::Dynamic(dynamic) => {
                matches!(dynamic, DynamicType::UnknownGeneric(_))
            }
            // Due to inheritance rules, enums cannot be generic.
            Type::EnumLiteral(_) => false,
            // Once generic NewType is officially specified, handle it.
            _ => false,
        }
    }

    const fn is_dynamic(&self) -> bool {
        matches!(self, Type::Dynamic(_))
    }
@@ -1798,7 +1856,7 @@ impl<'db> Type<'db> {
            Type::GenericAlias(alias) => Some(ClassType::Generic(alias).into_callable(db)),

            Type::NewTypeInstance(newtype) => {
                Type::instance(db, newtype.base_class_type(db)).try_upcast_to_callable(db)
                newtype.concrete_base_type(db).try_upcast_to_callable(db)
            }

            // TODO: This is unsound so in future we can consider an opt-in option to disable it.

@@ -2018,21 +2076,9 @@ impl<'db> Type<'db> {
        // only has to hold when the typevar has a valid specialization (i.e., one that
        // satisfies the upper bound/constraints).
        if let Type::TypeVar(bound_typevar) = self {
            return ConstraintSet::constrain_typevar(
                db,
                bound_typevar,
                Type::Never,
                target,
                relation,
            );
            return ConstraintSet::constrain_typevar(db, bound_typevar, Type::Never, target);
        } else if let Type::TypeVar(bound_typevar) = target {
            return ConstraintSet::constrain_typevar(
                db,
                bound_typevar,
                self,
                Type::object(),
                relation,
            );
            return ConstraintSet::constrain_typevar(db, bound_typevar, self, Type::object());
        }
    }

@@ -2975,17 +3021,16 @@ impl<'db> Type<'db> {
                self_newtype.has_relation_to_impl(db, target_newtype)
            }

            (
                Type::NewTypeInstance(self_newtype),
                Type::NominalInstance(target_nominal_instance),
            ) => self_newtype.base_class_type(db).has_relation_to_impl(
                db,
                target_nominal_instance.class(db),
                inferable,
                relation,
                relation_visitor,
                disjointness_visitor,
            ),
            (Type::NewTypeInstance(self_newtype), _) => {
                self_newtype.concrete_base_type(db).has_relation_to_impl(
                    db,
                    target,
                    inferable,
                    relation,
                    relation_visitor,
                    disjointness_visitor,
                )
            }

            (Type::PropertyInstance(_), _) => {
                KnownClass::Property.to_instance(db).has_relation_to_impl(
@@ -3006,10 +3051,9 @@ impl<'db> Type<'db> {
                disjointness_visitor,
            ),

            // Other than the special cases enumerated above, nominal-instance types, and
            // newtype-instance types are never subtypes of any other variants
            // Other than the special cases enumerated above, nominal-instance types are never
            // subtypes of any other variants
            (Type::NominalInstance(_), _) => ConstraintSet::from(false),
            (Type::NewTypeInstance(_), _) => ConstraintSet::from(false),
        }
    }

@@ -3938,7 +3982,7 @@ impl<'db> Type<'db> {
                left.is_disjoint_from_impl(db, right)
            }
            (Type::NewTypeInstance(newtype), other) | (other, Type::NewTypeInstance(newtype)) => {
                Type::instance(db, newtype.base_class_type(db)).is_disjoint_from_impl(
                newtype.concrete_base_type(db).is_disjoint_from_impl(
                    db,
                    other,
                    inferable,

@@ -4169,9 +4213,7 @@ impl<'db> Type<'db> {
            Type::TypeIs(type_is) => type_is.is_bound(db),
            Type::TypedDict(_) => false,
            Type::TypeAlias(alias) => alias.value_type(db).is_singleton(db),
            Type::NewTypeInstance(newtype) => {
                Type::instance(db, newtype.base_class_type(db)).is_singleton(db)
            }
            Type::NewTypeInstance(newtype) => newtype.concrete_base_type(db).is_singleton(db),
        }
    }
@@ -4222,9 +4264,7 @@ impl<'db> Type<'db> {
            }

            Type::NominalInstance(instance) => instance.is_single_valued(db),
            Type::NewTypeInstance(newtype) => {
                Type::instance(db, newtype.base_class_type(db)).is_single_valued(db)
            }
            Type::NewTypeInstance(newtype) => newtype.concrete_base_type(db).is_single_valued(db),

            Type::BoundSuper(_) => {
                // At runtime two super instances never compare equal, even if their arguments are identical.

@@ -4476,7 +4516,9 @@ impl<'db> Type<'db> {
            Type::Dynamic(_) | Type::Never => Place::bound(self).into(),

            Type::NominalInstance(instance) => instance.class(db).instance_member(db, name),
            Type::NewTypeInstance(newtype) => newtype.base_class_type(db).instance_member(db, name),
            Type::NewTypeInstance(newtype) => {
                newtype.concrete_base_type(db).instance_member(db, name)
            }

            Type::ProtocolInstance(protocol) => protocol.instance_member(db, name),

@@ -5592,8 +5634,11 @@ impl<'db> Type<'db> {
                .value_type(db)
                .try_bool_impl(db, allow_short_circuit, visitor)
            })?,
            Type::NewTypeInstance(newtype) => Type::instance(db, newtype.base_class_type(db))
                .try_bool_impl(db, allow_short_circuit, visitor)?,
            Type::NewTypeInstance(newtype) => {
                newtype
                    .concrete_base_type(db)
                    .try_bool_impl(db, allow_short_circuit, visitor)?
            }
        };

        Ok(truthiness)

@@ -6561,7 +6606,7 @@ impl<'db> Type<'db> {

        match ty {
            Type::NominalInstance(nominal) => nominal.tuple_spec(db),
            Type::NewTypeInstance(newtype) => non_async_special_case(db, Type::instance(db, newtype.base_class_type(db))),
            Type::NewTypeInstance(newtype) => non_async_special_case(db, newtype.concrete_base_type(db)),
            Type::GenericAlias(alias) if alias.origin(db).is_tuple(db) => {
                Some(Cow::Owned(TupleSpec::homogeneous(todo_type!(
                    "*tuple[] annotations"

@@ -7293,29 +7338,12 @@ impl<'db> Type<'db> {
            // https://typing.python.org/en/latest/spec/special-types.html#special-cases-for-float-and-complex
            Type::ClassLiteral(class) => {
                let ty = match class.known(db) {
                    Some(KnownClass::Complex) => UnionType::from_elements(
                        db,
                        [
                            KnownClass::Int.to_instance(db),
|
||||
KnownClass::Float.to_instance(db),
|
||||
KnownClass::Complex.to_instance(db),
|
||||
],
|
||||
),
|
||||
Some(KnownClass::Float) => UnionType::from_elements(
|
||||
db,
|
||||
[
|
||||
KnownClass::Int.to_instance(db),
|
||||
KnownClass::Float.to_instance(db),
|
||||
],
|
||||
),
|
||||
_ if class.is_typed_dict(db) => {
|
||||
Type::typed_dict(class.default_specialization(db))
|
||||
}
|
||||
Some(KnownClass::Complex) => KnownUnion::Complex.to_type(db),
|
||||
Some(KnownClass::Float) => KnownUnion::Float.to_type(db),
|
||||
_ => Type::instance(db, class.default_specialization(db)),
|
||||
};
|
||||
Ok(ty)
|
||||
}
|
||||
Type::GenericAlias(alias) if alias.is_typed_dict(db) => Ok(Type::typed_dict(*alias)),
|
||||
Type::GenericAlias(alias) => Ok(Type::instance(db, ClassType::from(*alias))),
|
||||
|
||||
Type::SubclassOf(_)
|
||||
|
|
@ -7674,7 +7702,7 @@ impl<'db> Type<'db> {
|
|||
),
|
||||
},
|
||||
Type::TypeAlias(alias) => alias.value_type(db).to_meta_type(db),
|
||||
Type::NewTypeInstance(newtype) => Type::from(newtype.base_class_type(db)),
|
||||
Type::NewTypeInstance(newtype) => newtype.concrete_base_type(db).to_meta_type(db),
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -8023,7 +8051,7 @@ impl<'db> Type<'db> {
|
|||
) {
|
||||
let matching_typevar = |bound_typevar: &BoundTypeVarInstance<'db>| {
|
||||
match bound_typevar.typevar(db).kind(db) {
|
||||
TypeVarKind::Legacy | TypeVarKind::TypingSelf
|
||||
TypeVarKind::Legacy | TypeVarKind::Pep613Alias | TypeVarKind::TypingSelf
|
||||
if binding_context.is_none_or(|binding_context| {
|
||||
bound_typevar.binding_context(db)
|
||||
== BindingContext::Definition(binding_context)
|
||||
|
|
@ -8343,6 +8371,7 @@ impl<'db> Type<'db> {
|
|||
KnownInstanceType::TypeAliasType(type_alias) => {
|
||||
type_alias.definition(db).map(TypeDefinition::TypeAlias)
|
||||
}
|
||||
KnownInstanceType::NewType(newtype) => Some(TypeDefinition::NewType(newtype.definition(db))),
|
||||
_ => None,
|
||||
},
|
||||
|
||||
|
|
@ -8880,9 +8909,7 @@ fn walk_known_instance_type<'db, V: visitor::TypeVisitor<'db> + ?Sized>(
|
|||
visitor.visit_callable_type(db, callable);
|
||||
}
|
||||
KnownInstanceType::NewType(newtype) => {
|
||||
if let ClassType::Generic(generic_alias) = newtype.base_class_type(db) {
|
||||
visitor.visit_generic_alias_type(db, generic_alias);
|
||||
}
|
||||
visitor.visit_type(db, newtype.concrete_base_type(db));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
@ -9502,6 +9529,8 @@ pub enum TypeVarKind {
|
|||
ParamSpec,
|
||||
/// `def foo[**P]() -> None: ...`
|
||||
Pep695ParamSpec,
|
||||
/// `Alias: typing.TypeAlias = T`
|
||||
Pep613Alias,
|
||||
}
|
||||
|
||||
impl TypeVarKind {
|
||||
|
|
@ -14032,6 +14061,61 @@ impl<'db> UnionType<'db> {
|
|||
|
||||
ConstraintSet::from(sorted_self == other.normalized(db))
|
||||
}
|
||||
|
||||
/// Identify some specific unions of known classes, currently the ones that `float` and
|
||||
/// `complex` expand into in type position.
|
||||
pub(crate) fn known(self, db: &'db dyn Db) -> Option<KnownUnion> {
|
||||
let mut has_int = false;
|
||||
let mut has_float = false;
|
||||
let mut has_complex = false;
|
||||
for element in self.elements(db) {
|
||||
if let Type::NominalInstance(nominal) = element
|
||||
&& let Some(known) = nominal.known_class(db)
|
||||
{
|
||||
match known {
|
||||
KnownClass::Int => has_int = true,
|
||||
KnownClass::Float => has_float = true,
|
||||
KnownClass::Complex => has_complex = true,
|
||||
_ => return None,
|
||||
}
|
||||
} else {
|
||||
return None;
|
||||
}
|
||||
}
|
||||
match (has_int, has_float, has_complex) {
|
||||
(true, true, false) => Some(KnownUnion::Float),
|
||||
(true, true, true) => Some(KnownUnion::Complex),
|
||||
_ => None,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
|
||||
pub(crate) enum KnownUnion {
|
||||
Float, // `int | float`
|
||||
Complex, // `int | float | complex`
|
||||
}
|
||||
|
||||
impl KnownUnion {
|
||||
pub(crate) fn to_type(self, db: &dyn Db) -> Type<'_> {
|
||||
match self {
|
||||
KnownUnion::Float => UnionType::from_elements(
|
||||
db,
|
||||
[
|
||||
KnownClass::Int.to_instance(db),
|
||||
KnownClass::Float.to_instance(db),
|
||||
],
|
||||
),
|
||||
KnownUnion::Complex => UnionType::from_elements(
|
||||
db,
|
||||
[
|
||||
KnownClass::Int.to_instance(db),
|
||||
KnownClass::Float.to_instance(db),
|
||||
KnownClass::Complex.to_instance(db),
|
||||
],
|
||||
),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[salsa::interned(debug, heap_size=IntersectionType::heap_size)]
|
||||
|
|
|
|||
|
|
@@ -429,7 +429,7 @@ impl<'db> BoundSuperType<'db> {
);
}
Type::NewTypeInstance(newtype) => {
return delegate_to(Type::instance(db, newtype.base_class_type(db)));
return delegate_to(newtype.concrete_base_type(db));
}
Type::Callable(callable) if callable.is_function_like(db) => {
return delegate_to(KnownClass::FunctionType.to_instance(db));
@@ -668,6 +668,7 @@ pub(crate) struct IntersectionBuilder<'db> {
// but if a union is added to the intersection, we'll distribute ourselves over that union and
// create a union of intersections.
intersections: Vec<InnerIntersectionBuilder<'db>>,
order_elements: bool,
db: &'db dyn Db,
}

@@ -675,6 +676,7 @@ impl<'db> IntersectionBuilder<'db> {
pub(crate) fn new(db: &'db dyn Db) -> Self {
Self {
db,
order_elements: false,
intersections: vec![InnerIntersectionBuilder::default()],
}
}
@@ -682,14 +684,25 @@ impl<'db> IntersectionBuilder<'db> {
fn empty(db: &'db dyn Db) -> Self {
Self {
db,
order_elements: false,
intersections: vec![],
}
}

pub(crate) fn order_elements(mut self, val: bool) -> Self {
self.order_elements = val;
self
}

pub(crate) fn add_positive(self, ty: Type<'db>) -> Self {
self.add_positive_impl(ty, &mut vec![])
}

pub(crate) fn add_positive_in_place(&mut self, ty: Type<'db>) {
let updated = std::mem::replace(self, Self::empty(self.db)).add_positive(ty);
*self = updated;
}

pub(crate) fn add_positive_impl(
mut self,
ty: Type<'db>,
@@ -898,13 +911,16 @@ impl<'db> IntersectionBuilder<'db> {
pub(crate) fn build(mut self) -> Type<'db> {
// Avoid allocating the UnionBuilder unnecessarily if we have just one intersection:
if self.intersections.len() == 1 {
self.intersections.pop().unwrap().build(self.db)
self.intersections
.pop()
.unwrap()
.build(self.db, self.order_elements)
} else {
UnionType::from_elements(
self.db,
self.intersections
.into_iter()
.map(|inner| inner.build(self.db)),
.map(|inner| inner.build(self.db, self.order_elements)),
)
}
}
@@ -1238,7 +1254,7 @@ impl<'db> InnerIntersectionBuilder<'db> {
}
}

fn build(mut self, db: &'db dyn Db) -> Type<'db> {
fn build(mut self, db: &'db dyn Db, order_elements: bool) -> Type<'db> {
self.simplify_constrained_typevars(db);

// If any typevars are in `self.positive`, speculatively solve all bounded type variables
@@ -1279,6 +1295,12 @@ impl<'db> InnerIntersectionBuilder<'db> {
_ => {
self.positive.shrink_to_fit();
self.negative.shrink_to_fit();
if order_elements {
self.positive
.sort_unstable_by(|l, r| union_or_intersection_elements_ordering(db, l, r));
self.negative
.sort_unstable_by(|l, r| union_or_intersection_elements_ordering(db, l, r));
}
Type::Intersection(IntersectionType::new(db, self.positive, self.negative))
}
}
@@ -152,11 +152,9 @@ impl<'db> ClassBase<'db> {

Type::TypeAlias(alias) => Self::try_from_type(db, alias.value_type(db), subclass),

Type::NewTypeInstance(newtype) => ClassBase::try_from_type(
db,
Type::instance(db, newtype.base_class_type(db)),
subclass,
),
Type::NewTypeInstance(newtype) => {
ClassBase::try_from_type(db, newtype.concrete_base_type(db), subclass)
}

Type::PropertyInstance(_)
| Type::BooleanLiteral(_)
@@ -80,8 +80,8 @@ use crate::types::visitor::{
TypeCollector, TypeVisitor, any_over_type, walk_type_with_recursion_guard,
};
use crate::types::{
BoundTypeVarIdentity, BoundTypeVarInstance, IntersectionType, Type, TypeRelation,
TypeVarBoundOrConstraints, UnionType, walk_bound_type_var_type,
BoundTypeVarIdentity, BoundTypeVarInstance, IntersectionBuilder, IntersectionType, Type,
TypeVarBoundOrConstraints, UnionBuilder, UnionType, walk_bound_type_var_type,
};
use crate::{Db, FxOrderSet};

@@ -200,21 +200,7 @@ impl<'db> ConstraintSet<'db> {
typevar: BoundTypeVarInstance<'db>,
lower: Type<'db>,
upper: Type<'db>,
relation: TypeRelation<'db>,
) -> Self {
let (lower, upper) = match relation {
TypeRelation::Subtyping
| TypeRelation::Redundancy
| TypeRelation::SubtypingAssuming(_) => (
lower.top_materialization(db),
upper.bottom_materialization(db),
),
TypeRelation::Assignability | TypeRelation::ConstraintSetAssignability => (
lower.bottom_materialization(db),
upper.top_materialization(db),
),
};

Self {
node: ConstrainedTypeVar::new_node(db, typevar, lower, upper),
}
@@ -446,7 +432,7 @@ impl<'db> ConstraintSet<'db> {
typevar: BoundTypeVarInstance<'db>,
upper: Type<'db>,
) -> Self {
Self::constrain_typevar(db, typevar, lower, upper, TypeRelation::Assignability)
Self::constrain_typevar(db, typevar, lower, upper)
}

#[expect(dead_code)] // Keep this around for debugging purposes
@@ -517,9 +503,6 @@ impl<'db> ConstrainedTypeVar<'db> {
mut lower: Type<'db>,
mut upper: Type<'db>,
) -> Node<'db> {
debug_assert_eq!(lower, lower.bottom_materialization(db));
debug_assert_eq!(upper, upper.top_materialization(db));

// It's not useful for an upper bound to be an intersection type, or for a lower bound to
// be a union type. Because the following equivalences hold, we can break these bounds
// apart and create an equivalent BDD with more nodes but simpler constraints. (Fewer,
@@ -3508,8 +3491,8 @@ impl<'db> GenericContext<'db> {
// do with that, so instead we just report the ambiguity as a specialization failure.
let mut satisfied = false;
let mut unconstrained = false;
let mut greatest_lower_bound = Type::Never;
let mut least_upper_bound = Type::object();
let mut greatest_lower_bound = UnionBuilder::new(db).order_elements(true);
let mut least_upper_bound = IntersectionBuilder::new(db).order_elements(true);
let identity = bound_typevar.identity(db);
tracing::trace!(
target: "ty_python_semantic::types::constraints::specialize_constrained",
@@ -3528,10 +3511,8 @@ impl<'db> GenericContext<'db> {
upper_bound = %upper_bound.display(db),
"found representative type",
);
greatest_lower_bound =
UnionType::from_elements(db, [greatest_lower_bound, lower_bound]);
least_upper_bound =
IntersectionType::from_elements(db, [least_upper_bound, upper_bound]);
greatest_lower_bound.add_in_place(lower_bound);
least_upper_bound.add_positive_in_place(upper_bound);
}
None => {
unconstrained = true;
@@ -3565,6 +3546,8 @@ impl<'db> GenericContext<'db> {

// If `lower ≰ upper`, then there is no type that satisfies all of the paths in the
// BDD. That's an ambiguous specialization, as described above.
let greatest_lower_bound = greatest_lower_bound.build();
let least_upper_bound = least_upper_bound.build();
if !greatest_lower_bound.is_constraint_set_assignable_to(db, least_upper_bound) {
tracing::debug!(
target: "ty_python_semantic::types::constraints::specialize_constrained",
@@ -40,7 +40,7 @@ use ruff_db::{
use ruff_diagnostics::{Edit, Fix};
use ruff_python_ast::name::Name;
use ruff_python_ast::token::parentheses_iterator;
use ruff_python_ast::{self as ast, AnyNodeRef, StringFlags};
use ruff_python_ast::{self as ast, AnyNodeRef, PythonVersion, StringFlags};
use ruff_text_size::{Ranged, TextRange};
use rustc_hash::FxHashSet;
use std::fmt::{self, Formatter};
@@ -4155,6 +4155,120 @@ pub(super) fn report_unsupported_comparison<'db>(
}
}

pub(super) fn report_unsupported_augmented_assignment<'db>(
context: &InferContext<'db, '_>,
stmt: &ast::StmtAugAssign,
left_ty: Type<'db>,
right_ty: Type<'db>,
) {
report_unsupported_binary_operation_impl(
context,
stmt.range(),
&stmt.target,
&stmt.value,
left_ty,
right_ty,
OperatorDisplay {
operator: stmt.op,
is_augmented_assignment: true,
},
);
}

pub(super) fn report_unsupported_binary_operation<'db>(
context: &InferContext<'db, '_>,
binary_expression: &ast::ExprBinOp,
left_ty: Type<'db>,
right_ty: Type<'db>,
operator: ast::Operator,
) {
let Some(mut diagnostic) = report_unsupported_binary_operation_impl(
context,
binary_expression.range(),
&binary_expression.left,
&binary_expression.right,
left_ty,
right_ty,
OperatorDisplay {
operator,
is_augmented_assignment: false,
},
) else {
return;
};
let db = context.db();
if operator == ast::Operator::BitOr
&& (left_ty.is_subtype_of(db, KnownClass::Type.to_instance(db))
|| right_ty.is_subtype_of(db, KnownClass::Type.to_instance(db)))
&& Program::get(db).python_version(db) < PythonVersion::PY310
{
diagnostic.info(
"Note that `X | Y` PEP 604 union syntax is only available in Python 3.10 and later",
);
add_inferred_python_version_hint_to_diagnostic(db, &mut diagnostic, "resolving types");
}
}

#[derive(Debug, Copy, Clone)]
struct OperatorDisplay {
operator: ast::Operator,
is_augmented_assignment: bool,
}

impl std::fmt::Display for OperatorDisplay {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
if self.is_augmented_assignment {
write!(f, "{}=", self.operator)
} else {
write!(f, "{}", self.operator)
}
}
}

fn report_unsupported_binary_operation_impl<'a>(
context: &'a InferContext<'a, 'a>,
range: TextRange,
left: &ast::Expr,
right: &ast::Expr,
left_ty: Type<'a>,
right_ty: Type<'a>,
operator: OperatorDisplay,
) -> Option<LintDiagnosticGuard<'a, 'a>> {
let db = context.db();
let diagnostic_builder = context.report_lint(&UNSUPPORTED_OPERATOR, range)?;
let display_settings = DisplaySettings::from_possibly_ambiguous_types(db, [left_ty, right_ty]);

let mut diagnostic =
diagnostic_builder.into_diagnostic(format_args!("Unsupported `{operator}` operation"));

if left_ty == right_ty {
diagnostic.set_primary_message(format_args!(
"Both operands have type `{}`",
left_ty.display_with(db, display_settings.clone())
));
diagnostic.annotate(context.secondary(left));
diagnostic.annotate(context.secondary(right));
diagnostic.set_concise_message(format_args!(
"Operator `{operator}` is not supported between two objects of type `{}`",
left_ty.display_with(db, display_settings.clone())
));
} else {
for (ty, expr) in [(left_ty, left), (right_ty, right)] {
diagnostic.annotate(context.secondary(expr).message(format_args!(
"Has type `{}`",
ty.display_with(db, display_settings.clone())
)));
}
diagnostic.set_concise_message(format_args!(
"Operator `{operator}` is not supported between objects of type `{}` and `{}`",
left_ty.display_with(db, display_settings.clone()),
right_ty.display_with(db, display_settings.clone())
));
}

Some(diagnostic)
}

/// This function receives an unresolved `from foo import bar` import,
/// where `foo` can be resolved to a module but that module does not
/// have a `bar` member or submodule.
@@ -14,6 +14,7 @@ use ruff_text_size::{TextLen, TextRange, TextSize};
use rustc_hash::{FxHashMap, FxHashSet};

use crate::Db;
use crate::place::Place;
use crate::types::class::{ClassLiteral, ClassType, GenericAlias};
use crate::types::function::{FunctionType, OverloadLiteral};
use crate::types::generics::{GenericContext, Specialization};
@@ -642,11 +643,13 @@ impl<'db> FmtDetailed<'db> for DisplayRepresentation<'db> {
Type::PropertyInstance(_) => f.with_type(self.ty).write_str("property"),
Type::ModuleLiteral(module) => {
f.set_invalid_syntax();
write!(
f.with_type(self.ty),
"<module '{}'>",
module.module(self.db).name(self.db)
)
f.write_char('<')?;
f.with_type(KnownClass::ModuleType.to_class_literal(self.db))
.write_str("module")?;
f.write_str(" '")?;
f.with_type(self.ty)
.write_str(module.module(self.db).name(self.db))?;
f.write_str("'>")
}
Type::ClassLiteral(class) => {
f.set_invalid_syntax();
@@ -692,11 +695,17 @@ impl<'db> FmtDetailed<'db> for DisplayRepresentation<'db> {
write!(f.with_type(Type::Dynamic(dynamic)), "{dynamic}")?;
f.write_char(']')
}
SubclassOfInner::TypeVar(bound_typevar) => write!(
f,
"type[{}]",
bound_typevar.identity(self.db).display(self.db)
),
SubclassOfInner::TypeVar(bound_typevar) => {
f.with_type(KnownClass::Type.to_class_literal(self.db))
.write_str("type")?;
f.write_char('[')?;
write!(
f.with_type(Type::TypeVar(bound_typevar)),
"{}",
bound_typevar.identity(self.db).display(self.db)
)?;
f.write_char(']')
}
},
Type::SpecialForm(special_form) => {
f.set_invalid_syntax();
@@ -763,61 +772,115 @@ impl<'db> FmtDetailed<'db> for DisplayRepresentation<'db> {
}
Type::KnownBoundMethod(method_type) => {
f.set_invalid_syntax();
match method_type {
KnownBoundMethodType::FunctionTypeDunderGet(function) => {
write!(
f,
"<method-wrapper `__get__` of `{function}`>",
function = function.name(self.db),
)
}
KnownBoundMethodType::FunctionTypeDunderCall(function) => {
write!(
f,
"<method-wrapper `__call__` of `{function}`>",
function = function.name(self.db),
)
}
KnownBoundMethodType::PropertyDunderGet(_) => {
f.write_str("<method-wrapper `__get__` of `property` object>")
}
KnownBoundMethodType::PropertyDunderSet(_) => {
f.write_str("<method-wrapper `__set__` of `property` object>")
}
KnownBoundMethodType::StrStartswith(_) => {
f.write_str("<method-wrapper `startswith` of `str` object>")
}
let (cls, member_name, cls_name, ty, ty_name) = match method_type {
KnownBoundMethodType::FunctionTypeDunderGet(function) => (
KnownClass::FunctionType,
"__get__",
"function",
Type::FunctionLiteral(function),
Some(&**function.name(self.db)),
),
KnownBoundMethodType::FunctionTypeDunderCall(function) => (
KnownClass::FunctionType,
"__call__",
"function",
Type::FunctionLiteral(function),
Some(&**function.name(self.db)),
),
KnownBoundMethodType::PropertyDunderGet(property) => (
KnownClass::Property,
"__get__",
"property",
Type::PropertyInstance(property),
property
.getter(self.db)
.and_then(Type::as_function_literal)
.map(|getter| &**getter.name(self.db)),
),
KnownBoundMethodType::PropertyDunderSet(property) => (
KnownClass::Property,
"__set__",
"property",
Type::PropertyInstance(property),
property
.getter(self.db)
.and_then(Type::as_function_literal)
.map(|getter| &**getter.name(self.db)),
),
KnownBoundMethodType::StrStartswith(literal) => (
KnownClass::Property,
"startswith",
"string",
Type::StringLiteral(literal),
Some(literal.value(self.db)),
),
KnownBoundMethodType::ConstraintSetRange => {
f.write_str("bound method `ConstraintSet.range`")
return f.write_str("bound method `ConstraintSet.range`");
}
KnownBoundMethodType::ConstraintSetAlways => {
f.write_str("bound method `ConstraintSet.always`")
return f.write_str("bound method `ConstraintSet.always`");
}
KnownBoundMethodType::ConstraintSetNever => {
f.write_str("bound method `ConstraintSet.never`")
return f.write_str("bound method `ConstraintSet.never`");
}
KnownBoundMethodType::ConstraintSetImpliesSubtypeOf(_) => {
f.write_str("bound method `ConstraintSet.implies_subtype_of`")
return f.write_str("bound method `ConstraintSet.implies_subtype_of`");
}
KnownBoundMethodType::ConstraintSetSatisfies(_) => {
f.write_str("bound method `ConstraintSet.satisfies`")
return f.write_str("bound method `ConstraintSet.satisfies`");
}
KnownBoundMethodType::ConstraintSetSatisfiedByAllTypeVars(_) => {
f.write_str("bound method `ConstraintSet.satisfied_by_all_typevars`")
return f
.write_str("bound method `ConstraintSet.satisfied_by_all_typevars`");
}
KnownBoundMethodType::GenericContextSpecializeConstrained(_) => {
f.write_str("bound method `GenericContext.specialize_constrained`")
return f.write_str("bound method `GenericContext.specialize_constrained`");
}
};

let class_ty = cls.to_class_literal(self.db);
f.write_char('<')?;
f.with_type(KnownClass::MethodWrapperType.to_class_literal(self.db))
.write_str("method-wrapper")?;
f.write_str(" '")?;
if let Place::Defined(member_ty, _, _) = class_ty.member(self.db, member_name).place
{
f.with_type(member_ty).write_str(member_name)?;
} else {
f.write_str(member_name)?;
}
f.write_str("' of ")?;
f.with_type(class_ty).write_str(cls_name)?;
if let Some(name) = ty_name {
f.write_str(" '")?;
f.with_type(ty).write_str(name)?;
f.write_str("'>")
} else {
f.write_str("' object>")
}
}
Type::WrapperDescriptor(kind) => {
f.set_invalid_syntax();
let (method, object) = match kind {
WrapperDescriptorKind::FunctionTypeDunderGet => ("__get__", "function"),
WrapperDescriptorKind::PropertyDunderGet => ("__get__", "property"),
WrapperDescriptorKind::PropertyDunderSet => ("__set__", "property"),
let (method, object, cls) = match kind {
WrapperDescriptorKind::FunctionTypeDunderGet => {
("__get__", "function", KnownClass::FunctionType)
}
WrapperDescriptorKind::PropertyDunderGet => {
("__get__", "property", KnownClass::Property)
}
WrapperDescriptorKind::PropertyDunderSet => {
("__set__", "property", KnownClass::Property)
}
};
write!(f, "<wrapper-descriptor `{method}` of `{object}` objects>")
f.write_char('<')?;
f.with_type(KnownClass::WrapperDescriptorType.to_class_literal(self.db))
.write_str("wrapper-descriptor")?;
f.write_str(" '")?;
f.write_str(method)?;
f.write_str("' of '")?;
f.with_type(cls.to_class_literal(self.db))
.write_str(object)?;
f.write_str("' objects>")
}
Type::DataclassDecorator(_) => {
f.set_invalid_syntax();
@@ -907,7 +970,10 @@ impl<'db> FmtDetailed<'db> for DisplayRepresentation<'db> {
.fmt_detailed(f),
Type::TypedDict(TypedDictType::Synthesized(synthesized)) => {
f.set_invalid_syntax();
f.write_str("<TypedDict with items ")?;
f.write_char('<')?;
f.with_type(Type::SpecialForm(SpecialFormType::TypedDict))
.write_str("TypedDict")?;
f.write_str(" with items ")?;
let items = synthesized.items(self.db);
for (i, name) in items.keys().enumerate() {
let is_last = i == items.len() - 1;
@@ -1318,10 +1384,13 @@ impl<'db> DisplayGenericContext<'_, 'db> {
f.set_invalid_syntax();
let typevar = bound_typevar.typevar(self.db);
if typevar.is_paramspec(self.db) {
write!(f, "**{}", typevar.name(self.db))?;
} else {
f.write_str(typevar.name(self.db))?;
f.write_str("**")?;
}
write!(
f.with_type(Type::TypeVar(*bound_typevar)),
"{}",
typevar.name(self.db)
)?;
}
f.write_char(']')
}
@@ -1334,7 +1403,11 @@ impl<'db> DisplayGenericContext<'_, 'db> {
f.write_str(", ")?;
}
f.set_invalid_syntax();
write!(f, "{}", bound_typevar.identity(self.db).display(self.db))?;
write!(
f.with_type(Type::TypeVar(bound_typevar)),
"{}",
bound_typevar.identity(self.db).display(self.db)
)?;
}
f.write_char(']')
}
@@ -2262,15 +2335,17 @@ impl<'db> FmtDetailed<'db> for DisplayKnownInstanceRepr<'db> {
KnownInstanceType::SubscriptedProtocol(generic_context) => {
f.set_invalid_syntax();
f.write_str("<special form '")?;
f.with_type(ty).write_str("typing.Protocol")?;
f.write_str(&generic_context.display(self.db).to_string())?;
f.with_type(Type::SpecialForm(SpecialFormType::Protocol))
.write_str("typing.Protocol")?;
generic_context.display(self.db).fmt_detailed(f)?;
f.write_str("'>")
}
KnownInstanceType::SubscriptedGeneric(generic_context) => {
f.set_invalid_syntax();
f.write_str("<special form '")?;
f.with_type(ty).write_str("typing.Generic")?;
f.write_str(&generic_context.display(self.db).to_string())?;
f.with_type(Type::SpecialForm(SpecialFormType::Generic))
.write_str("typing.Generic")?;
generic_context.display(self.db).fmt_detailed(f)?;
f.write_str("'>")
}
KnownInstanceType::TypeAliasType(alias) => {
@@ -2278,15 +2353,9 @@ impl<'db> FmtDetailed<'db> for DisplayKnownInstanceRepr<'db> {
f.set_invalid_syntax();
f.write_str("<type alias '")?;
f.with_type(ty).write_str(alias.name(self.db))?;
f.write_str(
&specialization
.display_short(
self.db,
TupleSpecialization::No,
DisplaySettings::default(),
)
.to_string(),
)?;
specialization
.display_short(self.db, TupleSpecialization::No, DisplaySettings::default())
.fmt_detailed(f)?;
f.write_str("'>")
} else {
f.with_type(ty).write_str("typing.TypeAliasType")
@@ -2306,7 +2375,9 @@ impl<'db> FmtDetailed<'db> for DisplayKnownInstanceRepr<'db> {
KnownInstanceType::Field(field) => {
f.with_type(ty).write_str("dataclasses.Field")?;
if let Some(default_ty) = field.default_type(self.db) {
write!(f, "[{}]", default_ty.display(self.db))?;
f.write_char('[')?;
write!(f.with_type(default_ty), "{}", default_ty.display(self.db))?;
f.write_char(']')?;
}
Ok(())
}
@@ -2325,51 +2396,58 @@ impl<'db> FmtDetailed<'db> for DisplayKnownInstanceRepr<'db> {
KnownInstanceType::UnionType(union) => {
f.set_invalid_syntax();
f.write_char('<')?;
f.with_type(ty).write_str("types.UnionType")?;
f.with_type(KnownClass::UnionType.to_class_literal(self.db))
.write_str("types.UnionType")?;
f.write_str(" special form")?;
if let Ok(ty) = union.union_type(self.db) {
write!(f, " '{}'", ty.display(self.db))?;
f.write_str(" '")?;
ty.display(self.db).fmt_detailed(f)?;
f.write_char('\'')?;
}
f.write_char('>')
}
KnownInstanceType::Literal(inner) => {
f.set_invalid_syntax();
write!(
f,
"<special form '{}'>",
inner.inner(self.db).display(self.db)
)
f.write_str("<special form '")?;
inner.inner(self.db).display(self.db).fmt_detailed(f)?;
f.write_str("'>")
}
KnownInstanceType::Annotated(inner) => {
f.set_invalid_syntax();
f.write_str("<special form '")?;
f.with_type(ty).write_str("typing.Annotated")?;
write!(
f,
"[{}, <metadata>]'>",
inner.inner(self.db).display(self.db)
)
f.with_type(Type::SpecialForm(SpecialFormType::Annotated))
.write_str("typing.Annotated")?;
f.write_char('[')?;
inner.inner(self.db).display(self.db).fmt_detailed(f)?;
f.write_str(", <metadata>]'>")
}
KnownInstanceType::Callable(callable) => {
f.set_invalid_syntax();
f.write_char('<')?;
f.with_type(ty).write_str("typing.Callable")?;
write!(f, " special form '{}'>", callable.display(self.db))
f.with_type(Type::SpecialForm(SpecialFormType::Callable))
.write_str("typing.Callable")?;
f.write_str(" special form '")?;
callable.display(self.db).fmt_detailed(f)?;
f.write_str("'>")
}
KnownInstanceType::TypeGenericAlias(inner) => {
f.set_invalid_syntax();
f.write_str("<special form '")?;
write!(
f.with_type(ty),
"type[{}]",
inner.inner(self.db).display(self.db)
)?;
f.write_str("'>")
f.with_type(KnownClass::Type.to_class_literal(self.db))
.write_str("type")?;
f.write_char('[')?;
inner.inner(self.db).display(self.db).fmt_detailed(f)?;
f.write_str("]'>")
}
KnownInstanceType::LiteralStringAlias(_) => f.write_str("str"),
KnownInstanceType::LiteralStringAlias(_) => f
.with_type(KnownClass::Str.to_class_literal(self.db))
.write_str("str"),
KnownInstanceType::NewType(declaration) => {
f.set_invalid_syntax();
f.write_str("<NewType pseudo-class '")?;
f.write_char('<')?;
f.with_type(KnownClass::NewType.to_class_literal(self.db))
.write_str("NewType")?;
f.write_str(" pseudo-class '")?;
f.with_type(ty).write_str(declaration.name(self.db))?;
f.write_str("'>")
}
@@ -1247,10 +1247,9 @@ fn is_instance_truthiness<'db>(

Type::NominalInstance(..) => always_true_if(is_instance(&ty)),

Type::NewTypeInstance(newtype) => always_true_if(is_instance(&Type::instance(
    db,
    newtype.base_class_type(db),
))),
Type::NewTypeInstance(newtype) => {
    always_true_if(is_instance(&newtype.concrete_base_type(db)))
}

Type::BooleanLiteral(..)
| Type::BytesLiteral(..)
@@ -7,8 +7,9 @@ use crate::semantic_index::definition::DefinitionKind;
use crate::semantic_index::{attribute_scopes, global_scope, semantic_index, use_def_map};
use crate::types::call::{CallArguments, MatchedArgument};
use crate::types::signatures::{ParameterKind, Signature};
use crate::types::{CallDunderError, UnionType};
use crate::types::{CallableTypes, ClassBase, KnownClass, Type, TypeContext};
use crate::types::{
    CallDunderError, CallableTypes, ClassBase, KnownUnion, Type, TypeContext, UnionType,
};
use crate::{Db, DisplaySettings, HasType, SemanticModel};
use ruff_db::files::FileRange;
use ruff_db::parsed::parsed_module;

@@ -193,21 +194,8 @@ pub fn definitions_for_name<'db>(

fn is_float_or_complex_annotation(db: &dyn Db, ty: UnionType, name: &str) -> bool {
    let float_or_complex_ty = match name {
        "float" => UnionType::from_elements(
            db,
            [
                KnownClass::Int.to_instance(db),
                KnownClass::Float.to_instance(db),
            ],
        ),
        "complex" => UnionType::from_elements(
            db,
            [
                KnownClass::Int.to_instance(db),
                KnownClass::Float.to_instance(db),
                KnownClass::Complex.to_instance(db),
            ],
        ),
        "float" => KnownUnion::Float.to_type(db),
        "complex" => KnownUnion::Complex.to_type(db),
        _ => return false,
    }
    .expect_union();
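The replaced helper built the `int | float` and `int | float | complex` unions by hand; the new `KnownUnion` variants encode the same PEP 484 rule, under which `float` in type position also accepts `int`, and `complex` also accepts `int` and `float`. A minimal Python sketch of that rule (the names `FLOAT_EXPANSION` and `mean` are illustrative, not from the codebase):

```python
from typing import Union

# PEP 484's numeric tower: type checkers treat these annotations as unions.
FLOAT_EXPANSION = Union[int, float]
COMPLEX_EXPANSION = Union[int, float, complex]

def mean(values: list[float]) -> float:
    # A list of ints is accepted under the `float` annotation.
    return sum(values) / len(values)

print(mean([1, 2]))  # 1.5
```

`Union` is order-insensitive, so `Union[int, float]` compares equal to `Union[float, int]`, which is why an explicitly written union can be recognized as the `float` expansion.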
@@ -80,7 +80,8 @@ use crate::types::diagnostic::{
report_invalid_type_checking_constant, report_named_tuple_field_with_leading_underscore,
report_namedtuple_field_without_default_after_field_with_default, report_non_subscriptable,
report_possibly_missing_attribute, report_possibly_unresolved_reference,
report_rebound_typevar, report_slice_step_size_zero, report_unsupported_comparison,
report_rebound_typevar, report_slice_step_size_zero, report_unsupported_augmented_assignment,
report_unsupported_binary_operation, report_unsupported_comparison,
};
use crate::types::function::{
    FunctionDecorators, FunctionLiteral, FunctionType, KnownFunction, OverloadLiteral,

@@ -104,13 +105,13 @@ use crate::types::visitor::any_over_type;
use crate::types::{
    BoundTypeVarInstance, CallDunderError, CallableBinding, CallableType, CallableTypeKind,
    ClassLiteral, ClassType, DataclassParams, DynamicType, InternedType, IntersectionBuilder,
    IntersectionType, KnownClass, KnownInstanceType, LintDiagnosticGuard, MemberLookupPolicy,
    MetaclassCandidate, PEP695TypeAliasType, ParamSpecAttrKind, Parameter, ParameterForm,
    Parameters, Signature, SpecialFormType, SubclassOfType, TrackedConstraintSet, Truthiness, Type,
    TypeAliasType, TypeAndQualifiers, TypeContext, TypeQualifiers, TypeVarBoundOrConstraints,
    TypeVarBoundOrConstraintsEvaluation, TypeVarDefaultEvaluation, TypeVarIdentity,
    TypeVarInstance, TypeVarKind, TypeVarVariance, TypedDictType, UnionBuilder, UnionType,
    UnionTypeInstance, binding_type, infer_scope_types, todo_type,
    IntersectionType, KnownClass, KnownInstanceType, KnownUnion, LintDiagnosticGuard,
    MemberLookupPolicy, MetaclassCandidate, PEP695TypeAliasType, ParamSpecAttrKind, Parameter,
    ParameterForm, Parameters, Signature, SpecialFormType, SubclassOfType, TrackedConstraintSet,
    Truthiness, Type, TypeAliasType, TypeAndQualifiers, TypeContext, TypeQualifiers,
    TypeVarBoundOrConstraints, TypeVarBoundOrConstraintsEvaluation, TypeVarDefaultEvaluation,
    TypeVarIdentity, TypeVarInstance, TypeVarKind, TypeVarVariance, TypedDictType, UnionBuilder,
    UnionType, UnionTypeInstance, binding_type, infer_scope_types, todo_type,
};
use crate::types::{CallableTypes, overrides};
use crate::types::{ClassBase, add_inferred_python_version_hint_to_diagnostic};

@@ -3541,10 +3542,17 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    return Ok(param_type);
}

Type::KnownInstance(known_instance)
Type::KnownInstance(known_instance @ KnownInstanceType::TypeVar(typevar))
    if known_instance.class(self.db()) == KnownClass::ParamSpec =>
{
    // TODO: Emit diagnostic: "ParamSpec "P" is unbound"
    if let Some(diagnostic_builder) =
        self.context.report_lint(&INVALID_TYPE_ARGUMENTS, expr)
    {
        diagnostic_builder.into_diagnostic(format_args!(
            "ParamSpec `{}` is unbound",
            typevar.name(self.db())
        ));
    }
    return Err(());
}

@@ -5629,28 +5637,35 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {

// Infer the deferred base type of a NewType.
fn infer_newtype_assignment_deferred(&mut self, arguments: &ast::Arguments) {
    match self.infer_type_expression(&arguments.args[1]) {
        Type::NominalInstance(_) | Type::NewTypeInstance(_) => {}
    let inferred = self.infer_type_expression(&arguments.args[1]);
    match inferred {
        Type::NominalInstance(_) | Type::NewTypeInstance(_) => return,
        // There are exactly two union types allowed as bases for NewType: `int | float` and
        // `int | float | complex`. These are allowed because that's what `float` and `complex`
        // expand into in type position. We don't currently ask whether the union was implicit
        // or explicit, so the explicit version is also allowed.
        Type::Union(union_ty) => {
            if let Some(KnownUnion::Float | KnownUnion::Complex) = union_ty.known(self.db()) {
                return;
            }
        }
        // `Unknown` is likely to be the result of an unresolved import or a typo, which will
        // already get a diagnostic, so don't pile on an extra diagnostic here.
        Type::Dynamic(DynamicType::Unknown) => {}
        other_type => {
            if let Some(builder) = self
                .context
                .report_lint(&INVALID_NEWTYPE, &arguments.args[1])
            {
                let mut diag = builder.into_diagnostic("invalid base for `typing.NewType`");
                diag.set_primary_message(format!("type `{}`", other_type.display(self.db())));
                if matches!(other_type, Type::ProtocolInstance(_)) {
                    diag.info("The base of a `NewType` is not allowed to be a protocol class.");
                } else if matches!(other_type, Type::TypedDict(_)) {
                    diag.info("The base of a `NewType` is not allowed to be a `TypedDict`.");
                } else {
                    diag.info(
                        "The base of a `NewType` must be a class type or another `NewType`.",
                    );
                }
            }
        }
        Type::Dynamic(DynamicType::Unknown) => return,
        _ => {}
    }
    if let Some(builder) = self
        .context
        .report_lint(&INVALID_NEWTYPE, &arguments.args[1])
    {
        let mut diag = builder.into_diagnostic("invalid base for `typing.NewType`");
        diag.set_primary_message(format!("type `{}`", inferred.display(self.db())));
        if matches!(inferred, Type::ProtocolInstance(_)) {
            diag.info("The base of a `NewType` is not allowed to be a protocol class.");
        } else if matches!(inferred, Type::TypedDict(_)) {
            diag.info("The base of a `NewType` is not allowed to be a `TypedDict`.");
        } else {
            diag.info("The base of a `NewType` must be a class type or another `NewType`.");
        }
    }
}

@@ -5851,6 +5866,24 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
};

if is_pep_613_type_alias {
    let inferred_ty =
        if let Type::KnownInstance(KnownInstanceType::TypeVar(typevar)) = inferred_ty {
            let identity = TypeVarIdentity::new(
                self.db(),
                typevar.identity(self.db()).name(self.db()),
                typevar.identity(self.db()).definition(self.db()),
                TypeVarKind::Pep613Alias,
            );
            Type::KnownInstance(KnownInstanceType::TypeVar(TypeVarInstance::new(
                self.db(),
                identity,
                typevar._bound_or_constraints(self.db()),
                typevar.explicit_variance(self.db()),
                typevar._default(self.db()),
            )))
        } else {
            inferred_ty
        };
    self.add_declaration_with_binding(
        target.into(),
        definition,

@@ -5911,22 +5944,16 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
let op = assignment.op;
let db = self.db();

let report_unsupported_augmented_op = |ctx: &mut InferContext| {
    let Some(builder) = ctx.report_lint(&UNSUPPORTED_OPERATOR, assignment) else {
        return;
    };
    builder.into_diagnostic(format_args!(
        "Operator `{op}=` is not supported between objects of type `{}` and `{}`",
        target_type.display(db),
        value_type.display(db)
    ));
};

// Fall back to non-augmented binary operator inference.
let mut binary_return_ty = || {
    self.infer_binary_expression_type(assignment.into(), false, target_type, value_type, op)
        .unwrap_or_else(|| {
            report_unsupported_augmented_op(&mut self.context);
            report_unsupported_augmented_assignment(
                &self.context,
                assignment,
                target_type,
                value_type,
            );
            Type::unknown()
        })
};

@@ -5950,7 +5977,12 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    UnionType::from_elements(db, [outcome.return_type(db), binary_return_ty()])
}
Err(CallDunderError::CallError(_, bindings)) => {
    report_unsupported_augmented_op(&mut self.context);
    report_unsupported_augmented_assignment(
        &self.context,
        assignment,
        target_type,
        value_type,
    );
    bindings.return_type(db)
}
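The augmented-assignment inference above first tries the in-place dunder and only falls back to plain binary-operator inference when that call is unavailable, mirroring Python's runtime behavior for `+=`. A small sketch of the runtime rule (the `Counter` class is illustrative):

```python
class Counter:
    """Defines __add__ but no __iadd__, so `+=` falls back to __add__."""

    def __init__(self, n: int) -> None:
        self.n = n

    def __add__(self, other: int) -> "Counter":
        return Counter(self.n + other)

c = Counter(1)
c += 2  # no __iadd__: Python rebinds c = c.__add__(2)
print(c.n)  # 3
```

When neither dunder succeeds, the checker now reports the failure via the shared `report_unsupported_augmented_assignment` helper instead of an ad-hoc closure.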
@@ -8150,10 +8182,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    value,
} = named;

self.infer_expression(target, TypeContext::default());

self.add_binding(named.target.as_ref().into(), definition, |builder, tcx| {
    builder.infer_expression(value, tcx)
    let ty = builder.infer_expression(value, tcx);
    builder.store_expression_type(target, ty);
    ty
})
}
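The hunk above makes the inferred type of a named expression's value also get stored on the assignment target, so the target and the whole expression agree. For reference, the runtime behavior the types should mirror:

```python
# A named (walrus) expression assigns to its target and evaluates to the
# same value, so both deserve the same inferred type.
y = (x := 2 * 21)
print(x, y)  # 42 42
```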
@@ -9685,7 +9717,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    self.context.report_lint(&UNSUPPORTED_OPERATOR, unary)
{
    builder.into_diagnostic(format_args!(
        "Unary operator `{op}` is not supported for type `{}`",
        "Unary operator `{op}` is not supported for object of type `{}`",
        operand_type.display(self.db()),
    ));
}

@@ -9718,26 +9750,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {

self.infer_binary_expression_type(binary.into(), false, left_ty, right_ty, *op)
    .unwrap_or_else(|| {
        let db = self.db();

        if let Some(builder) = self.context.report_lint(&UNSUPPORTED_OPERATOR, binary) {
            let mut diag = builder.into_diagnostic(format_args!(
                "Operator `{op}` is not supported between objects of type `{}` and `{}`",
                left_ty.display(db),
                right_ty.display(db)
            ));

            if op == &ast::Operator::BitOr
                && (left_ty.is_subtype_of(db, KnownClass::Type.to_instance(db))
                    || right_ty.is_subtype_of(db, KnownClass::Type.to_instance(db)))
                && Program::get(db).python_version(db) < PythonVersion::PY310
            {
                diag.info(
                    "Note that `X | Y` PEP 604 union syntax is only available in Python 3.10 and later",
                );
                add_inferred_python_version_hint_to_diagnostic(db, &mut diag, "resolving types");
            }
        }
        report_unsupported_binary_operation(&self.context, binary, left_ty, right_ty, *op);
        Type::unknown()
    })
}
@@ -11629,6 +11642,17 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
generic_context: GenericContext<'db>,
specialize: impl FnOnce(&[Option<Type<'db>>]) -> Type<'db>,
) -> Type<'db> {
    enum ExplicitSpecializationError {
        InvalidParamSpec,
        UnsatisfiedBound,
        UnsatisfiedConstraints,
        /// These two errors override the errors above, causing all specializations to be `Unknown`.
        MissingTypeVars,
        TooManyArguments,
        /// This error overrides the errors above, causing the type itself to be `Unknown`.
        NonGeneric,
    }

    fn add_typevar_definition<'db>(
        db: &'db dyn Db,
        diagnostic: &mut Diagnostic,

@@ -11681,7 +11705,7 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    }
};

let mut has_error = false;
let mut error: Option<ExplicitSpecializationError> = None;

for (index, item) in typevars.zip_longest(type_arguments.iter()).enumerate() {
    match item {

@@ -11697,8 +11721,8 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
) {
    Ok(paramspec_value) => paramspec_value,
    Err(()) => {
        has_error = true;
        Type::unknown()
        error = Some(ExplicitSpecializationError::InvalidParamSpec);
        Type::paramspec_value_callable(db, Parameters::unknown())
    }
}
} else {

@@ -11730,8 +11754,10 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
        ));
        add_typevar_definition(db, &mut diagnostic, typevar);
    }
    has_error = true;
    continue;
    error = Some(ExplicitSpecializationError::UnsatisfiedBound);
    specialization_types.push(Some(Type::unknown()));
} else {
    specialization_types.push(Some(provided_type));
}
}
Some(TypeVarBoundOrConstraints::Constraints(constraints)) => {

@@ -11764,14 +11790,16 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
        ));
        add_typevar_definition(db, &mut diagnostic, typevar);
    }
    has_error = true;
    continue;
    error = Some(ExplicitSpecializationError::UnsatisfiedConstraints);
    specialization_types.push(Some(Type::unknown()));
} else {
    specialization_types.push(Some(provided_type));
}
}
None => {}
None => {
    specialization_types.push(Some(provided_type));
}
}

specialization_types.push(Some(provided_type));
}
EitherOrBoth::Left(typevar) => {
    if typevar.default_type(db).is_none() {

@@ -11806,33 +11834,57 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
        }
    ));
}
has_error = true;
error = Some(ExplicitSpecializationError::MissingTypeVars);
}

if let Some(first_excess_type_argument_index) = first_excess_type_argument_index {
    let node = get_node(first_excess_type_argument_index);
    if let Some(builder) = self.context.report_lint(&INVALID_TYPE_ARGUMENTS, node) {
        let description = CallableDescription::new(db, value_ty);
        builder.into_diagnostic(format_args!(
            "Too many type arguments{}: expected {}, got {}",
            if let Some(CallableDescription { kind, name }) = description {
                format!(" to {kind} `{name}`")
            } else {
                String::new()
            },
            if typevar_with_defaults == 0 {
                format!("{typevars_len}")
            } else {
                format!(
                    "between {} and {}",
                    typevars_len - typevar_with_defaults,
                    typevars_len
                )
            },
            type_arguments.len(),
        ));
    if typevars_len == 0 {
        // Type parameter list cannot be empty, so if we reach here, `value_ty` is not a generic type.
        if let Some(builder) = self
            .context
            .report_lint(&NON_SUBSCRIPTABLE, &*subscript.value)
        {
            let mut diagnostic =
                builder.into_diagnostic("Cannot subscript non-generic type");
            if match value_ty {
                Type::GenericAlias(_) => true,
                Type::KnownInstance(KnownInstanceType::UnionType(union)) => union
                    .value_expression_types(db)
                    .is_ok_and(|mut tys| tys.any(|ty| ty.is_generic_alias())),
                _ => false,
            } {
                diagnostic.set_primary_message(format_args!(
                    "`{}` is already specialized",
                    value_ty.display(db)
                ));
            }
        }
        error = Some(ExplicitSpecializationError::NonGeneric);
    } else {
        let node = get_node(first_excess_type_argument_index);
        if let Some(builder) = self.context.report_lint(&INVALID_TYPE_ARGUMENTS, node) {
            let description = CallableDescription::new(db, value_ty);
            builder.into_diagnostic(format_args!(
                "Too many type arguments{}: expected {}, got {}",
                if let Some(CallableDescription { kind, name }) = description {
                    format!(" to {kind} `{name}`")
                } else {
                    String::new()
                },
                if typevar_with_defaults == 0 {
                    format!("{typevars_len}")
                } else {
                    format!(
                        "between {} and {}",
                        typevars_len - typevar_with_defaults,
                        typevars_len
                    )
                },
                type_arguments.len(),
            ));
        }
        error = Some(ExplicitSpecializationError::TooManyArguments);
    }
    has_error = true;
}

if store_inferred_type_arguments {

@@ -11842,21 +11894,31 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
    );
}

if has_error {
    let unknowns = generic_context
        .variables(self.db())
        .map(|typevar| {
            Some(if typevar.is_paramspec(db) {
                Type::paramspec_value_callable(db, Parameters::unknown())
            } else {
                Type::unknown()
match error {
    Some(ExplicitSpecializationError::NonGeneric) => Type::unknown(),
    Some(
        ExplicitSpecializationError::MissingTypeVars
        | ExplicitSpecializationError::TooManyArguments,
    ) => {
        let unknowns = generic_context
            .variables(self.db())
            .map(|typevar| {
                Some(if typevar.is_paramspec(db) {
                    Type::paramspec_value_callable(db, Parameters::unknown())
                } else {
                    Type::unknown()
                })
            })
        })
        .collect::<Vec<_>>();
    return specialize(&unknowns);
            .collect::<Vec<_>>();
        specialize(&unknowns)
    }
    Some(
        ExplicitSpecializationError::UnsatisfiedBound
        | ExplicitSpecializationError::UnsatisfiedConstraints
        | ExplicitSpecializationError::InvalidParamSpec,
    )
    | None => specialize(&specialization_types),
}

specialize(&specialization_types)
}

fn infer_subscript_expression_types(

@@ -12037,9 +12099,17 @@ impl<'db, 'ast> TypeInferenceBuilder<'db, 'ast> {
Type::KnownInstance(KnownInstanceType::TypeAliasType(TypeAliasType::PEP695(alias))),
_,
) if alias.generic_context(db).is_none() => {
    debug_assert!(alias.specialization(db).is_none());
    if let Some(builder) = self.context.report_lint(&NON_SUBSCRIPTABLE, subscript) {
        builder
            .into_diagnostic(format_args!("Cannot subscript non-generic type alias"));
        let value_type = alias.raw_value_type(db);
        let mut diagnostic =
            builder.into_diagnostic("Cannot subscript non-generic type alias");
        if value_type.is_definition_generic(db) {
            diagnostic.set_primary_message(format_args!(
                "`{}` is already specialized",
                value_type.display(db)
            ));
        }
    }

    Some(Type::unknown())
@@ -16,8 +16,8 @@ use crate::types::tuple::{TupleSpecBuilder, TupleType};
use crate::types::{
    BindingContext, CallableType, DynamicType, GenericContext, IntersectionBuilder, KnownClass,
    KnownInstanceType, LintDiagnosticGuard, Parameter, Parameters, SpecialFormType, SubclassOfType,
    Type, TypeAliasType, TypeContext, TypeIsType, TypeMapping, UnionBuilder, UnionType,
    any_over_type, todo_type,
    Type, TypeAliasType, TypeContext, TypeIsType, TypeMapping, TypeVarKind, UnionBuilder,
    UnionType, any_over_type, todo_type,
};

/// Type expressions

@@ -919,6 +919,16 @@ impl<'db> TypeInferenceBuilder<'db, '_> {
    Type::unknown()
}
KnownInstanceType::TypeAliasType(type_alias @ TypeAliasType::PEP695(_)) => {
    if type_alias.specialization(self.db()).is_some() {
        if let Some(builder) =
            self.context.report_lint(&NON_SUBSCRIPTABLE, subscript)
        {
            let mut diagnostic =
                builder.into_diagnostic("Cannot subscript non-generic type alias");
            diagnostic.set_primary_message("Double specialization is not allowed");
        }
        return Type::unknown();
    }
    match type_alias.generic_context(self.db()) {
        Some(generic_context) => {
            let specialized_type_alias = self

@@ -943,9 +953,15 @@ impl<'db> TypeInferenceBuilder<'db, '_> {
    if let Some(builder) =
        self.context.report_lint(&NON_SUBSCRIPTABLE, subscript)
    {
        builder.into_diagnostic(format_args!(
            "Cannot subscript non-generic type alias"
        ));
        let value_type = type_alias.raw_value_type(self.db());
        let mut diagnostic = builder
            .into_diagnostic("Cannot subscript non-generic type alias");
        if value_type.is_definition_generic(self.db()) {
            diagnostic.set_primary_message(format_args!(
                "`{}` is already specialized",
                value_type.display(self.db()),
            ));
        }
    }

    Type::unknown()

@@ -979,8 +995,26 @@ impl<'db> TypeInferenceBuilder<'db, '_> {
    }
    Type::unknown()
}
KnownInstanceType::TypeVar(_) => {
    self.infer_explicit_type_alias_specialization(subscript, value_ty, false)
KnownInstanceType::TypeVar(typevar) => {
    // The type variable designated as a generic type alias by `typing.TypeAlias` can be explicitly specialized.
    // ```py
    // from typing import TypeVar, TypeAlias
    // T = TypeVar('T')
    // Annotated: TypeAlias = T
    // _: Annotated[int] = 1 # valid
    // ```
    if typevar.identity(self.db()).kind(self.db()) == TypeVarKind::Pep613Alias {
        self.infer_explicit_type_alias_specialization(subscript, value_ty, false)
    } else {
        if let Some(builder) =
            self.context.report_lint(&INVALID_TYPE_FORM, subscript)
        {
            builder.into_diagnostic(format_args!(
                "A type variable itself cannot be specialized",
            ));
        }
        Type::unknown()
    }
}

KnownInstanceType::UnionType(_)
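The new "a type variable itself cannot be specialized" diagnostic matches the runtime: subscripting a plain `TypeVar` raises `TypeError`, whereas a PEP 613 alias whose value is a TypeVar is treated as a generic alias and may take type arguments. A quick check of the runtime half of that rule:

```python
from typing import TypeVar

T = TypeVar("T")

# Subscripting the type variable itself is rejected at runtime too:
# TypeVar defines no __getitem__.
try:
    T[int]
    subscriptable = True
except TypeError:
    subscriptable = False

print(subscriptable)  # False
```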
@@ -304,6 +304,14 @@ impl<'db> NominalInstanceType<'db> {
    matches!(self.0, NominalInstanceInner::Object)
}

pub(super) fn is_definition_generic(self) -> bool {
    match self.0 {
        NominalInstanceInner::NonTuple(class) => class.is_generic(),
        NominalInstanceInner::ExactTuple(_) => true,
        NominalInstanceInner::Object => false,
    }
}

/// If this type is an *exact* tuple type (*not* a subclass of `tuple`), returns the
/// tuple spec.
///
@@ -187,7 +187,7 @@ impl<'db> AllMembers<'db> {
}

Type::NewTypeInstance(newtype) => {
    self.extend_with_type(db, Type::instance(db, newtype.base_class_type(db)));
    self.extend_with_type(db, newtype.concrete_base_type(db));
}

Type::ClassLiteral(class_literal) if class_literal.is_typed_dict(db) => {
@@ -3,7 +3,7 @@ use std::collections::BTreeSet;
use crate::Db;
use crate::semantic_index::definition::{Definition, DefinitionKind};
use crate::types::constraints::ConstraintSet;
use crate::types::{ClassType, Type, definition_expression_type, visitor};
use crate::types::{ClassType, KnownUnion, Type, definition_expression_type, visitor};
use ruff_db::parsed::parsed_module;
use ruff_python_ast as ast;
@@ -80,8 +80,15 @@ impl<'db> NewType<'db> {
    NewTypeBase::ClassType(nominal_instance_type.class(db))
}
Type::NewTypeInstance(newtype) => NewTypeBase::NewType(newtype),
// This branch includes bases that are other typing constructs besides classes and
// other newtypes, for example unions. `NewType("Foo", int | str)` is not allowed.
// There are exactly two union types allowed as bases for NewType: `int | float` and
// `int | float | complex`. These are allowed because that's what `float` and `complex`
// expand into in type position. We don't currently ask whether the union was implicit
// or explicit, so the explicit version is also allowed.
Type::Union(union_type) => match union_type.known(db) {
    Some(KnownUnion::Float) => NewTypeBase::Float,
    Some(KnownUnion::Complex) => NewTypeBase::Complex,
    _ => object_fallback,
},
_ => object_fallback,
}
}
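As the comment above explains, `float` and `complex` are accepted as `NewType` bases because of their special expansion in type position, while other unions are rejected. At runtime `NewType` just returns an identity callable, so this is purely a static-checking rule. A sketch using standard `typing.NewType` (the `Meters` name is illustrative):

```python
from typing import NewType

# `float` in type position means `int | float`, so a checker that applies
# the rule above accepts it (and `complex`) as a NewType base, even though
# arbitrary unions such as `int | str` are rejected.
Meters = NewType("Meters", float)

distance = Meters(1.5)  # identity at runtime
print(distance, Meters.__supertype__)
```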
@@ -94,15 +101,16 @@ impl<'db> NewType<'db> {
    }
}

// Walk the `NewTypeBase` chain to find the underlying `ClassType`. There might not be a
// `ClassType` if this `NewType` is cyclical, and we fall back to `object` in that case.
pub fn base_class_type(self, db: &'db dyn Db) -> ClassType<'db> {
// Walk the `NewTypeBase` chain to find the underlying non-newtype `Type`. There might not be
// one if this `NewType` is cyclical, and we fall back to `object` in that case.
pub fn concrete_base_type(self, db: &'db dyn Db) -> Type<'db> {
    for base in self.iter_bases(db) {
        if let NewTypeBase::ClassType(class_type) = base {
            return class_type;
        match base {
            NewTypeBase::NewType(_) => continue,
            concrete => return concrete.instance_type(db),
        }
    }
    ClassType::object(db)
    Type::object()
}

pub(crate) fn is_equivalent_to_impl(self, db: &'db dyn Db, other: Self) -> bool {

@@ -179,10 +187,14 @@ impl<'db> NewType<'db> {
        Some(mapped_base),
    ));
}
// Mapping base class types is used for normalization and applying type mappings,
// neither of which have any effect on `float` or `complex` (which are already
// fully normalized and non-generic), so we don't need to bother calling `f`.
NewTypeBase::Float | NewTypeBase::Complex => {}
}
}
// If we get here, there is no `ClassType` (because this newtype is cyclic), and we don't
// call `f` at all.
// If we get here, there is no `ClassType` (because this newtype is either float/complex or
// cyclic), and we don't call `f` at all.
Some(self)
}

@@ -209,6 +221,12 @@ pub(crate) fn walk_newtype_instance_type<'db, V: visitor::TypeVisitor<'db> + ?Si
pub enum NewTypeBase<'db> {
    ClassType(ClassType<'db>),
    NewType(NewType<'db>),
    // `float` and `complex` are special-cased in type position, where they refer to `int | float`
    // and `int | float | complex` respectively. As an extension of that special case, we allow
    // them in `NewType` bases, even though unions and other typing constructs normally aren't
    // allowed.
    Float,
    Complex,
}

impl<'db> NewTypeBase<'db> {

@@ -216,6 +234,8 @@ impl<'db> NewTypeBase<'db> {
match self {
    NewTypeBase::ClassType(class_type) => Type::instance(db, class_type),
    NewTypeBase::NewType(newtype) => Type::NewTypeInstance(newtype),
    NewTypeBase::Float => KnownUnion::Float.to_type(db),
    NewTypeBase::Complex => KnownUnion::Complex.to_type(db),
}
}
}

@@ -246,10 +266,6 @@ impl<'db> Iterator for NewTypeBaseIter<'db> {
fn next(&mut self) -> Option<Self::Item> {
    let current = self.current?;
    match current.base(self.db) {
        NewTypeBase::ClassType(base_class_type) => {
            self.current = None;
            Some(NewTypeBase::ClassType(base_class_type))
        }
        NewTypeBase::NewType(base_newtype) => {
            // Doing the insertion only in this branch avoids allocating in the common case.
            self.seen_before.insert(current);

@@ -262,6 +278,10 @@ impl<'db> Iterator for NewTypeBaseIter<'db> {
                Some(NewTypeBase::NewType(base_newtype))
            }
        }
        concrete_base => {
            self.current = None;
            Some(concrete_base)
        }
    }
}
}
@@ -396,7 +396,6 @@ impl<'db> CallableSignature<'db> {
    self_bound_typevar,
    Type::TypeVar(other_bound_typevar),
    Type::TypeVar(other_bound_typevar),
    relation,
);
let return_types_match = self_return_type.zip(other_return_type).when_some_and(
    |(self_return_type, other_return_type)| {

@@ -427,7 +426,6 @@ impl<'db> CallableSignature<'db> {
    self_bound_typevar,
    Type::Never,
    upper,
    relation,
);
let return_types_match = self_return_type.when_some_and(|self_return_type| {
    other_signatures

@@ -461,7 +459,6 @@ impl<'db> CallableSignature<'db> {
    other_bound_typevar,
    lower,
    Type::object(),
    relation,
);
let return_types_match = other_return_type.when_some_and(|other_return_type| {
    self_signatures
@@ -119,7 +119,7 @@ impl<'db> SubclassOfType<'db> {
    subclass_of.is_type_var()
}

pub(crate) const fn into_type_var(self) -> Option<BoundTypeVarInstance<'db>> {
pub const fn into_type_var(self) -> Option<BoundTypeVarInstance<'db>> {
    self.subclass_of.into_type_var()
}
@@ -34,9 +34,8 @@ bitflags::bitflags! {
const RELATIVE_FILE_WATCHER_SUPPORT = 1 << 13;
const DIAGNOSTIC_DYNAMIC_REGISTRATION = 1 << 14;
const WORKSPACE_CONFIGURATION = 1 << 15;
const RENAME_DYNAMIC_REGISTRATION = 1 << 16;
const COMPLETION_ITEM_LABEL_DETAILS_SUPPORT = 1 << 17;
const DIAGNOSTIC_RELATED_INFORMATION = 1 << 18;
const COMPLETION_ITEM_LABEL_DETAILS_SUPPORT = 1 << 16;
const DIAGNOSTIC_RELATED_INFORMATION = 1 << 17;
}
}
@ -169,11 +168,6 @@ impl ResolvedClientCapabilities {
|
|||
self.contains(Self::DIAGNOSTIC_RELATED_INFORMATION)
|
||||
}
|
||||
|
||||
/// Returns `true` if the client supports dynamic registration for rename capabilities.
|
||||
pub(crate) const fn supports_rename_dynamic_registration(self) -> bool {
|
||||
self.contains(Self::RENAME_DYNAMIC_REGISTRATION)
|
||||
}
|
||||
|
||||
/// Returns `true` if the client supports "label details" in completion items.
|
||||
pub(crate) const fn supports_completion_item_label_details(self) -> bool {
|
||||
self.contains(Self::COMPLETION_ITEM_LABEL_DETAILS_SUPPORT)
|
||||
|
|
@ -326,13 +320,6 @@ impl ResolvedClientCapabilities {
|
|||
flags |= Self::HIERARCHICAL_DOCUMENT_SYMBOL_SUPPORT;
|
||||
}
|
||||
|
||||
if text_document
|
||||
.and_then(|text_document| text_document.rename.as_ref()?.dynamic_registration)
|
||||
.unwrap_or_default()
|
||||
{
|
||||
flags |= Self::RENAME_DYNAMIC_REGISTRATION;
|
||||
}
|
||||
|
||||
if client_capabilities
|
||||
.window
|
||||
.as_ref()
|
||||
|
|
@ -373,16 +360,6 @@ pub(crate) fn server_capabilities(
|
|||
))
|
||||
};
|
||||
|
||||
let rename_provider = if resolved_client_capabilities.supports_rename_dynamic_registration() {
|
||||
// If the client supports dynamic registration, we will register the rename capabilities
|
||||
// dynamically based on the `ty.experimental.rename` setting.
|
||||
None
|
||||
} else {
|
||||
// Otherwise, we always register the rename provider and bail out in `prepareRename` if
|
||||
// the feature is disabled.
|
||||
Some(OneOf::Right(server_rename_options()))
|
||||
};
|
||||
|
||||
ServerCapabilities {
|
||||
position_encoding: Some(position_encoding.into()),
|
||||
code_action_provider: Some(types::CodeActionProviderCapability::Options(
|
||||
|
|
@ -413,7 +390,7 @@ pub(crate) fn server_capabilities(
|
|||
definition_provider: Some(OneOf::Left(true)),
|
||||
declaration_provider: Some(DeclarationCapability::Simple(true)),
|
||||
references_provider: Some(OneOf::Left(true)),
|
||||
rename_provider,
|
||||
rename_provider: Some(OneOf::Right(server_rename_options())),
|
||||
document_highlight_provider: Some(OneOf::Left(true)),
|
||||
hover_provider: Some(HoverProviderCapability::Simple(true)),
|
||||
signature_help_provider: Some(SignatureHelpOptions {
|
||||
|
|
|
|||
|
|
@@ -299,7 +299,9 @@ where
        }

        if let Err(error) = ruff_db::panic::catch_unwind(|| {
            R::handle_request(&id, &db, document, client, params);
            salsa::attach(&db, || {
                R::handle_request(&id, &db, document, client, params);
            });
        }) {
            panic_response::<R>(&id, client, &error, retry);
        }

@@ -26,7 +26,7 @@ impl SyncNotificationHandler for DidChangeTextDocumentHandler {
            content_changes,
        } = params;

        let document = session
        let mut document = session
            .document_handle(&uri)
            .with_failure_code(ErrorCode::InternalError)?;
@@ -24,7 +24,7 @@ impl SyncNotificationHandler for DidChangeNotebookHandler {
            change: types::NotebookDocumentChangeEvent { cells, metadata },
        }: types::DidChangeNotebookDocumentParams,
    ) -> Result<()> {
        let document = session
        let mut document = session
            .document_handle(&uri)
            .with_failure_code(ErrorCode::InternalError)?;

@@ -32,7 +32,6 @@ impl BackgroundDocumentRequestHandler for PrepareRenameRequestHandler {
        if snapshot
            .workspace_settings()
            .is_language_services_disabled()
            || !snapshot.global_settings().is_rename_enabled()
        {
            return Ok(None);
        }
@@ -9,7 +9,7 @@ use anyhow::{Context, anyhow};
use lsp_server::{Message, RequestId};
use lsp_types::notification::{DidChangeWatchedFiles, Exit, Notification};
use lsp_types::request::{
    DocumentDiagnosticRequest, RegisterCapability, Rename, Request, Shutdown, UnregisterCapability,
    DocumentDiagnosticRequest, RegisterCapability, Request, Shutdown, UnregisterCapability,
    WorkspaceDiagnosticRequest,
};
use lsp_types::{

@@ -32,9 +32,7 @@ use options::GlobalOptions;
pub(crate) use self::options::InitializationOptions;
pub use self::options::{ClientOptions, DiagnosticMode};
pub(crate) use self::settings::{GlobalSettings, WorkspaceSettings};
use crate::capabilities::{
    ResolvedClientCapabilities, server_diagnostic_options, server_rename_options,
};
use crate::capabilities::{ResolvedClientCapabilities, server_diagnostic_options};
use crate::document::{DocumentKey, DocumentVersion, NotebookDocument};
use crate::server::{Action, publish_settings_diagnostics};
use crate::session::client::Client;

@@ -583,7 +581,6 @@ impl Session {
    /// `ty.experimental.rename` global setting.
    fn register_capabilities(&mut self, client: &Client) {
        static DIAGNOSTIC_REGISTRATION_ID: &str = "ty/textDocument/diagnostic";
        static RENAME_REGISTRATION_ID: &str = "ty/textDocument/rename";
        static FILE_WATCHER_REGISTRATION_ID: &str = "ty/workspace/didChangeWatchedFiles";

        let mut registrations = vec![];

@@ -625,31 +622,6 @@ impl Session {
            });
        }

        if self
            .resolved_client_capabilities
            .supports_rename_dynamic_registration()
        {
            let is_rename_enabled = self.global_settings.is_rename_enabled();

            if !is_rename_enabled {
                tracing::debug!("Rename capability is disabled in the resolved global settings");
                if self.registrations.contains(Rename::METHOD) {
                    unregistrations.push(Unregistration {
                        id: RENAME_REGISTRATION_ID.into(),
                        method: Rename::METHOD.into(),
                    });
                }
            }

            if is_rename_enabled {
                registrations.push(Registration {
                    id: RENAME_REGISTRATION_ID.into(),
                    method: Rename::METHOD.into(),
                    register_options: Some(serde_json::to_value(server_rename_options()).unwrap()),
                });
            }
        }

        if let Some(register_options) = self.file_watcher_registration_options() {
            if self.registrations.contains(DidChangeWatchedFiles::METHOD) {
                unregistrations.push(Unregistration {

@@ -1020,6 +992,7 @@ impl DocumentSnapshot {
    }

    /// Returns the client settings for all workspaces.
    #[expect(unused)]
    pub(crate) fn global_settings(&self) -> &GlobalSettings {
        &self.global_settings
    }

@@ -1451,7 +1424,7 @@ impl DocumentHandle {
    }

    pub(crate) fn update_text_document(
        &self,
        &mut self,
        session: &mut Session,
        content_changes: Vec<TextDocumentContentChangeEvent>,
        new_version: DocumentVersion,

@@ -1471,6 +1444,8 @@ impl DocumentHandle {
            } else {
                document.apply_changes(content_changes, new_version, position_encoding);
            }

            self.set_version(document.version());
        }

        self.update_in_db(session);

@@ -1479,7 +1454,7 @@ impl DocumentHandle {
    }

    pub(crate) fn update_notebook_document(
        &self,
        &mut self,
        session: &mut Session,
        cells: Option<lsp_types::NotebookDocumentCellChange>,
        metadata: Option<lsp_types::LSPObject>,

@@ -1496,6 +1471,8 @@ impl DocumentHandle {
                new_version,
                position_encoding,
            )?;

            self.set_version(new_version);
        }

        self.update_in_db(session);

@@ -1516,6 +1493,16 @@ impl DocumentHandle {
        session.apply_changes(path, changes);
    }

    fn set_version(&mut self, version: DocumentVersion) {
        let self_version = match self {
            DocumentHandle::Text { version, .. }
            | DocumentHandle::Notebook { version, .. }
            | DocumentHandle::Cell { version, .. } => version,
        };

        *self_version = version;
    }

    /// De-registers a document, specified by its key.
    /// Calling this multiple times for the same document is a logic error.
    ///
@@ -116,12 +116,6 @@ impl ClientOptions {
        self
    }

    #[must_use]
    pub fn with_experimental_rename(mut self, enabled: bool) -> Self {
        self.global.experimental.get_or_insert_default().rename = Some(enabled);
        self
    }

    #[must_use]
    pub fn with_auto_import(mut self, enabled: bool) -> Self {
        self.workspace

@@ -156,9 +150,7 @@ impl GlobalOptions {
    pub(crate) fn into_settings(self) -> GlobalSettings {
        let experimental = self
            .experimental
            .map(|experimental| ExperimentalSettings {
                rename: experimental.rename.unwrap_or(false),
            })
            .map(Experimental::into_settings)
            .unwrap_or_default();

        GlobalSettings {

@@ -326,9 +318,13 @@ impl Combine for DiagnosticMode {

#[derive(Clone, Combine, Debug, Serialize, Deserialize, Default)]
#[serde(rename_all = "camelCase")]
pub(crate) struct Experimental {
    /// Whether to enable the experimental symbol rename feature.
    pub(crate) rename: Option<bool>,
pub(crate) struct Experimental;

impl Experimental {
    #[expect(clippy::unused_self)]
    fn into_settings(self) -> ExperimentalSettings {
        ExperimentalSettings {}
    }
}

#[derive(Clone, Debug, Serialize, Deserialize, Default)]

@@ -10,12 +10,6 @@ pub(crate) struct GlobalSettings {
    pub(super) experimental: ExperimentalSettings,
}

impl GlobalSettings {
    pub(crate) fn is_rename_enabled(&self) -> bool {
        self.experimental.rename
    }
}

impl GlobalSettings {
    pub(crate) fn diagnostic_mode(&self) -> DiagnosticMode {
        self.diagnostic_mode

@@ -23,9 +17,7 @@ impl GlobalSettings {
    }

#[derive(Clone, Default, Debug, PartialEq)]
pub(crate) struct ExperimentalSettings {
    pub(super) rename: bool,
}
pub(crate) struct ExperimentalSettings;

/// Resolved client settings for a specific workspace.
///
@@ -434,96 +434,21 @@ fn unknown_options_in_workspace_configuration() -> Result<()> {
    Ok(())
}

/// Tests that the server sends a registration request for the rename capability if the client
/// setting is set to true and dynamic registration is enabled.
#[test]
fn register_rename_capability_when_enabled() -> Result<()> {
    let workspace_root = SystemPath::new("foo");
    let mut server = TestServerBuilder::new()?
        .with_workspace(workspace_root, None)?
        .with_initialization_options(ClientOptions::default().with_experimental_rename(true))
        .enable_rename_dynamic_registration(true)
        .build()
        .wait_until_workspaces_are_initialized();

    let (_, params) = server.await_request::<RegisterCapability>();
    let [registration] = params.registrations.as_slice() else {
        panic!(
            "Expected a single registration, got: {:#?}",
            params.registrations
        );
    };

    insta::assert_json_snapshot!(registration, @r#"
    {
      "id": "ty/textDocument/rename",
      "method": "textDocument/rename",
      "registerOptions": {
        "prepareProvider": true
      }
    }
    "#);

    Ok(())
}

/// Tests that rename capability is statically registered during initialization if the client
/// doesn't support dynamic registration, but the server is configured to support it.
#[test]
fn rename_available_without_dynamic_registration() -> Result<()> {
    let workspace_root = SystemPath::new("foo");

    let server = TestServerBuilder::new()?
        .with_workspace(workspace_root, None)?
        .with_initialization_options(ClientOptions::default().with_experimental_rename(true))
        .enable_rename_dynamic_registration(false)
        .build()
        .wait_until_workspaces_are_initialized();

    let initialization_result = server.initialization_result().unwrap();
    insta::assert_json_snapshot!(initialization_result.capabilities.rename_provider, @r#"
    {
      "prepareProvider": true
    }
    "#);

    Ok(())
}

/// Tests that the server does not send a registration request for the rename capability if the
/// client setting is set to false and dynamic registration is enabled.
#[test]
fn not_register_rename_capability_when_disabled() -> Result<()> {
    let workspace_root = SystemPath::new("foo");

    TestServerBuilder::new()?
        .with_workspace(workspace_root, None)?
        .with_initialization_options(ClientOptions::default().with_experimental_rename(false))
        .enable_rename_dynamic_registration(true)
        .build()
        .wait_until_workspaces_are_initialized();

    // The `Drop` implementation will make sure that the client did not receive any registration
    // request.

    Ok(())
}

/// Tests that the server can register multiple capabilities at once.
///
/// This test would need to be updated when the server supports additional capabilities in the
/// future.
///
/// TODO: This test currently only verifies a single capability. It should be
/// updated with more dynamic capabilities when the server supports it.
#[test]
fn register_multiple_capabilities() -> Result<()> {
    let workspace_root = SystemPath::new("foo");
    let mut server = TestServerBuilder::new()?
        .with_workspace(workspace_root, None)?
        .with_initialization_options(
            ClientOptions::default()
                .with_experimental_rename(true)
                .with_diagnostic_mode(DiagnosticMode::Workspace),
            ClientOptions::default().with_diagnostic_mode(DiagnosticMode::Workspace),
        )
        .enable_rename_dynamic_registration(true)
        .enable_diagnostic_dynamic_registration(true)
        .build()
        .wait_until_workspaces_are_initialized();

@@ -531,8 +456,6 @@ fn register_multiple_capabilities() -> Result<()> {
    let (_, params) = server.await_request::<RegisterCapability>();
    let registrations = params.registrations;

    assert_eq!(registrations.len(), 2);

    insta::assert_json_snapshot!(registrations, @r#"
    [
      {

@@ -545,13 +468,6 @@ fn register_multiple_capabilities() -> Result<()> {
          "workDoneProgress": true,
          "workspaceDiagnostics": true
        }
      },
      {
        "id": "ty/textDocument/rename",
        "method": "textDocument/rename",
        "registerOptions": {
          "prepareProvider": true
        }
      }
    ]
    "#);
@@ -35,6 +35,7 @@ mod inlay_hints;
mod notebook;
mod publish_diagnostics;
mod pull_diagnostics;
mod rename;

use std::collections::{BTreeMap, HashMap, VecDeque};
use std::num::NonZeroUsize;

@@ -52,8 +53,8 @@ use lsp_types::notification::{
    Initialized, Notification,
};
use lsp_types::request::{
    Completion, DocumentDiagnosticRequest, HoverRequest, Initialize, InlayHintRequest, Request,
    Shutdown, WorkspaceConfiguration, WorkspaceDiagnosticRequest,
    Completion, DocumentDiagnosticRequest, HoverRequest, Initialize, InlayHintRequest,
    PrepareRenameRequest, Request, Shutdown, WorkspaceConfiguration, WorkspaceDiagnosticRequest,
};
use lsp_types::{
    ClientCapabilities, CompletionItem, CompletionParams, CompletionResponse,

@@ -67,12 +68,11 @@ use lsp_types::{
    TextDocumentContentChangeEvent, TextDocumentIdentifier, TextDocumentItem,
    TextDocumentPositionParams, Url, VersionedTextDocumentIdentifier, WorkDoneProgressParams,
    WorkspaceClientCapabilities, WorkspaceDiagnosticParams, WorkspaceDiagnosticReportResult,
    WorkspaceFolder,
    WorkspaceEdit, WorkspaceFolder,
};
use ruff_db::system::{OsSystem, SystemPath, SystemPathBuf, TestSystem};
use rustc_hash::FxHashMap;
use tempfile::TempDir;

use ty_server::{ClientOptions, LogLevel, Server, init_logging};

/// Number of times to retry receiving a message before giving up

@@ -420,6 +420,16 @@ impl TestServer {
            .unwrap_or_else(|err| panic!("Failed to receive response for request {id}: {err}"))
    }

    #[track_caller]
    pub(crate) fn send_request_await<R>(&mut self, params: R::Params) -> R::Result
    where
        R: Request,
    {
        let id = self.send_request::<R>(params);
        self.try_await_response::<R>(&id, None)
            .unwrap_or_else(|err| panic!("Failed to receive response for request {id}: {err}"))
    }

    /// Wait for a server response corresponding to the given request ID.
    ///
    /// This should only be called if a request was already sent to the server via [`send_request`]

@@ -802,6 +812,38 @@ impl TestServer {
        self.send_notification::<DidChangeWatchedFiles>(params);
    }

    pub(crate) fn rename(
        &mut self,
        document: &Url,
        position: lsp_types::Position,
        new_name: &str,
    ) -> Result<Option<WorkspaceEdit>, ()> {
        if self
            .send_request_await::<PrepareRenameRequest>(lsp_types::TextDocumentPositionParams {
                text_document: TextDocumentIdentifier {
                    uri: document.clone(),
                },
                position,
            })
            .is_none()
        {
            return Err(());
        }

        Ok(
            self.send_request_await::<lsp_types::request::Rename>(lsp_types::RenameParams {
                text_document_position: TextDocumentPositionParams {
                    text_document: TextDocumentIdentifier {
                        uri: document.clone(),
                    },
                    position,
                },
                new_name: new_name.to_string(),
                work_done_progress_params: WorkDoneProgressParams::default(),
            }),
        )
    }

    /// Send a `textDocument/diagnostic` request for the document at the given path.
    pub(crate) fn document_diagnostic_request(
        &mut self,

@@ -1082,17 +1124,6 @@ impl TestServerBuilder {
        self
    }

    /// Enable or disable dynamic registration of rename capability
    pub(crate) fn enable_rename_dynamic_registration(mut self, enabled: bool) -> Self {
        self.client_capabilities
            .text_document
            .get_or_insert_default()
            .rename
            .get_or_insert_default()
            .dynamic_registration = Some(enabled);
        self
    }

    /// Enable or disable workspace configuration capability
    pub(crate) fn enable_workspace_configuration(mut self, enabled: bool) -> Self {
        self.client_capabilities
@@ -33,6 +33,42 @@ def foo() -> str:
    Ok(())
}

#[test]
fn on_did_change() -> Result<()> {
    let workspace_root = SystemPath::new("src");
    let foo = SystemPath::new("src/foo.py");
    let foo_content = "\
def foo() -> str:
    return 42
";

    let mut server = TestServerBuilder::new()?
        .with_workspace(workspace_root, None)?
        .with_file(foo, foo_content)?
        .enable_pull_diagnostics(false)
        .build()
        .wait_until_workspaces_are_initialized();

    server.open_text_document(foo, foo_content, 1);
    let _ = server.await_notification::<PublishDiagnostics>();

    let changes = vec![lsp_types::TextDocumentContentChangeEvent {
        range: None,
        range_length: None,
        text: "def foo() -> int: return 42".to_string(),
    }];

    server.change_text_document(foo, changes, 2);

    let diagnostics = server.await_notification::<PublishDiagnostics>();

    assert_eq!(diagnostics.version, Some(2));

    insta::assert_debug_snapshot!(diagnostics);

    Ok(())
}

#[test]
fn message_without_related_information_support() -> Result<()> {
    let workspace_root = SystemPath::new("src");
@@ -0,0 +1,84 @@
use crate::TestServerBuilder;
use crate::notebook::NotebookBuilder;
use insta::assert_json_snapshot;

#[test]
fn text_document() -> anyhow::Result<()> {
    let mut server = TestServerBuilder::new()?
        .with_file("foo.py", "")?
        .enable_pull_diagnostics(true)
        .build()
        .wait_until_workspaces_are_initialized();

    server.open_text_document(
        "foo.py",
        r#"def test(): ...

test()
"#,
        1,
    );

    let edits = server
        .rename(
            &server.file_uri("foo.py"),
            lsp_types::Position {
                line: 0,
                character: 5,
            },
            "new_name",
        )
        .expect("Can rename `test` function");

    assert_json_snapshot!(edits);

    Ok(())
}

#[test]
fn notebook() -> anyhow::Result<()> {
    let mut server = TestServerBuilder::new()?
        .with_file("test.ipynb", "")?
        .enable_pull_diagnostics(true)
        .build()
        .wait_until_workspaces_are_initialized();

    let mut builder = NotebookBuilder::virtual_file("test.ipynb");
    builder.add_python_cell(
        r#"from typing import Literal

type Style = Literal["italic", "bold", "underline"]"#,
    );

    let cell2 = builder.add_python_cell(
        r#"def with_style(line: str, word, style: Style) -> str:
    if style == "italic":
        return line.replace(word, f"*{word}*")
    elif style == "bold":
        return line.replace(word, f"__{word}__")

    position = line.find(word)
    output = line + "\n"
    output += " " * position
    output += "-" * len(word)
"#,
    );

    builder.open(&mut server);

    let edits = server
        .rename(
            &cell2,
            lsp_types::Position {
                line: 0,
                character: 16,
            },
            "text",
        )
        .expect("Can rename `line` parameter");

    assert_json_snapshot!(edits);

    server.collect_publish_diagnostic_notifications(2);
    Ok(())
}
@@ -8,9 +8,7 @@ Client capabilities: ResolvedClientCapabilities(
Position encoding: UTF16
Global settings: GlobalSettings {
    diagnostic_mode: OpenFilesOnly,
    experimental: ExperimentalSettings {
        rename: false,
    },
    experimental: ExperimentalSettings,
}
Open text documents: 0
@@ -0,0 +1,21 @@
---
source: crates/ty_server/tests/e2e/publish_diagnostics.rs
expression: diagnostics
---
PublishDiagnosticsParams {
    uri: Url {
        scheme: "file",
        cannot_be_a_base: false,
        username: "",
        password: None,
        host: None,
        port: None,
        path: "<temp_dir>/src/foo.py",
        query: None,
        fragment: None,
    },
    diagnostics: [],
    version: Some(
        2,
    ),
}
@@ -0,0 +1,75 @@
---
source: crates/ty_server/tests/e2e/rename.rs
expression: edits
---
{
  "changes": {
    "vscode-notebook-cell://test.ipynb#1": [
      {
        "range": {
          "start": {
            "line": 0,
            "character": 15
          },
          "end": {
            "line": 0,
            "character": 19
          }
        },
        "newText": "text"
      },
      {
        "range": {
          "start": {
            "line": 2,
            "character": 15
          },
          "end": {
            "line": 2,
            "character": 19
          }
        },
        "newText": "text"
      },
      {
        "range": {
          "start": {
            "line": 4,
            "character": 15
          },
          "end": {
            "line": 4,
            "character": 19
          }
        },
        "newText": "text"
      },
      {
        "range": {
          "start": {
            "line": 6,
            "character": 15
          },
          "end": {
            "line": 6,
            "character": 19
          }
        },
        "newText": "text"
      },
      {
        "range": {
          "start": {
            "line": 7,
            "character": 13
          },
          "end": {
            "line": 7,
            "character": 17
          }
        },
        "newText": "text"
      }
    ]
  }
}
@@ -0,0 +1,36 @@
---
source: crates/ty_server/tests/e2e/rename.rs
expression: edits
---
{
  "changes": {
    "file://<temp_dir>/foo.py": [
      {
        "range": {
          "start": {
            "line": 0,
            "character": 4
          },
          "end": {
            "line": 0,
            "character": 8
          }
        },
        "newText": "new_name"
      },
      {
        "range": {
          "start": {
            "line": 2,
            "character": 0
          },
          "end": {
            "line": 2,
            "character": 4
          }
        },
        "newText": "new_name"
      }
    ]
  }
}
@@ -722,7 +722,7 @@ with tempfile.TemporaryDirectory() as d1:

### Preserving parentheses around single-element lists

Ruff preserves at least one parentheses around list elements, even if the list only contains a single element. The Black 2025 or newer, on the other hand, removes the parentheses
Ruff preserves at least one set of parentheses around list elements, even if the list only contains a single element. The Black 2025 style or newer, on the other hand, removes the parentheses
for single-element lists if they aren't multiline and doing so does not change semantics:

```python
@@ -742,3 +742,98 @@ items = [(True)]
items = {(123)}

```

### Long lambda expressions

In [preview](../preview.md), Ruff will keep lambda parameters on a single line,
just like Black:

```python
# Input
def a():
    return b(
        c,
        d,
        e,
        f=lambda self, *args, **kwargs: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(
            *args, **kwargs
        ),
    )

# Ruff Stable
def a():
    return b(
        c,
        d,
        e,
        f=lambda self,
        *args,
        **kwargs: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs),
    )

# Black and Ruff Preview
def a():
    return b(
        c,
        d,
        e,
        f=lambda self, *args, **kwargs: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(
            *args, **kwargs
        ),
    )
```

However, if the body expression exceeds the configured line length, Ruff will
additionally add parentheses around the lambda body and break it over multiple
lines:

```python
# Input
def a():
    return b(
        c,
        d,
        e,
        # Additional `b` character pushes this over the line length
        f=lambda self, *args, **kwargs: baaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(
            *args, **kwargs
        ),
        # More complex expressions also trigger wrapping
        g=lambda self, *args, **kwargs: baaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(
            *args, **kwargs
        ) + 1,
    )

# Black
def a():
    return b(
        c,
        d,
        e,
        # Additional `b` character pushes this over the line length
        f=lambda self, *args, **kwargs: baaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(
            *args, **kwargs
        ),
        # More complex expressions also trigger wrapping
        g=lambda self, *args, **kwargs: baaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(
            *args, **kwargs
        )
        + 1,
    )

# Ruff Preview
def a():
    return b(
        c,
        d,
        e,
        # Additional `b` character pushes this over the line length
        f=lambda self, *args, **kwargs: (
            baaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs)
        ),
        # More complex expressions also trigger wrapping
        g=lambda self, *args, **kwargs: (
            baaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa(*args, **kwargs) + 1
        ),
    )
```
@@ -80,7 +80,7 @@ You can add the following configuration to `.gitlab-ci.yml` to run a `ruff forma
  stage: build
  interruptible: true
  image:
    name: ghcr.io/astral-sh/ruff:0.14.8-alpine
    name: ghcr.io/astral-sh/ruff:0.14.9-alpine
  before_script:
    - cd $CI_PROJECT_DIR
    - ruff --version

@@ -106,7 +106,7 @@ Ruff can be used as a [pre-commit](https://pre-commit.com) hook via [`ruff-pre-c
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.14.8
  rev: v0.14.9
  hooks:
    # Run the linter.
    - id: ruff-check

@@ -119,7 +119,7 @@ To enable lint fixes, add the `--fix` argument to the lint hook:
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.14.8
  rev: v0.14.9
  hooks:
    # Run the linter.
    - id: ruff-check

@@ -133,7 +133,7 @@ To avoid running on Jupyter Notebooks, remove `jupyter` from the list of allowed
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.14.8
  rev: v0.14.9
  hooks:
    # Run the linter.
    - id: ruff-check

@@ -369,7 +369,7 @@ This tutorial has focused on Ruff's command-line interface, but Ruff can also be
```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  # Ruff version.
  rev: v0.14.8
  rev: v0.14.9
  hooks:
    # Run the linter.
    - id: ruff-check
@@ -4,7 +4,7 @@ build-backend = "maturin"

[project]
name = "ruff"
version = "0.14.8"
version = "0.14.9"
description = "An extremely fast Python linter and code formatter, written in Rust."
authors = [{ name = "Astral Software Inc.", email = "hey@astral.sh" }]
readme = "README.md"

@@ -1,6 +1,6 @@
[project]
name = "scripts"
version = "0.14.8"
version = "0.14.9"
description = ""
authors = ["Charles Marsh <charlie.r.marsh@gmail.com>"]
@ -29,7 +29,6 @@ ERROR src/black/trans.py:544:10-23: `type[Err[CannotTransform] | Ok[TypeVar[T]]]
|
|||
ERROR src/black/trans.py:752:55-68: `type[Err[CannotTransform] | Ok[TypeVar[T]]]` is not subscriptable [unsupported-operation]
|
||||
ERROR src/black/trans.py:985:19-32: `type[Err[CannotTransform] | Ok[TypeVar[T]]]` is not subscriptable [unsupported-operation]
|
||||
ERROR src/black/trans.py:1111:57-70: `type[Err[CannotTransform] | Ok[TypeVar[T]]]` is not subscriptable [unsupported-operation]
|
||||
ERROR src/black/trans.py:1349:17-1350:27: `int` is not assignable to `int` (caused by inconsistent types when breaking cycles) [bad-assignment]
|
||||
ERROR src/black/trans.py:1480:19-32: `type[Err[CannotTransform] | Ok[TypeVar[T]]]` is not subscriptable [unsupported-operation]
|
||||
ERROR src/black/trans.py:1630:25-31: `csplit` may be uninitialized [unbound-name]
|
||||
ERROR src/black/trans.py:2162:19-32: `type[Err[CannotTransform] | Ok[TypeVar[T]]]` is not subscriptable [unsupported-operation]
|
||||
|
|
@@ -43,4 +42,4 @@ ERROR src/blib2to3/pytree.py:670:24-34: `newcontent` may be uninitialized [unbou
 ERROR src/blib2to3/pytree.py:756:24-39: `wrapped_content` may be uninitialized [unbound-name]
 ERROR src/blib2to3/pytree.py:847:34-45: `save_stderr` may be uninitialized [unbound-name]
 INFO Checking project configured at `<CWD>/pyrefly.toml`
-INFO 44 errors (2 suppressed)
+INFO 43 errors (2 suppressed)
@@ -12,7 +12,7 @@ ERROR discord/activity.py:286:9-16: Class member `Activity.to_dict` overrides pa
 ERROR discord/activity.py:455:9-16: Class member `Game.to_dict` overrides parent class `BaseActivity` in an inconsistent manner [bad-override]
 ERROR discord/activity.py:566:9-16: Class member `Streaming.to_dict` overrides parent class `BaseActivity` in an inconsistent manner [bad-override]
 ERROR discord/activity.py:822:9-16: Class member `CustomActivity.to_dict` overrides parent class `BaseActivity` in an inconsistent manner [bad-override]
-ERROR discord/activity.py:836:26-46: Cannot set item in `dict[str, int | str | None]` [unsupported-operation]
+ERROR discord/activity.py:836:26-46: `Emoji` is not assignable to TypedDict key `emoji` with type `int | str | None` [bad-typed-dict-key]
 ERROR discord/app_commands/checks.py:390:64-71: Argument `((Interaction[Any]) -> Cooldown | None) | ((Interaction[Any]) -> Coroutine[Any, Any, Cooldown | None])` is not assignable to parameter with type `(Interaction[Any]) -> MaybeAwaitable` in function `discord.utils.maybe_coroutine` [bad-argument-type]
 ERROR discord/app_commands/commands.py:235:9-38: Object of class `FunctionType` has no attribute `pass_command_binding` [missing-attribute]
 ERROR discord/app_commands/commands.py:393:29-76: Object of class `FunctionType` has no attribute `__discord_app_commands_param_description__` [missing-attribute]
@@ -34,7 +34,7 @@ ERROR discord/app_commands/models.py:370:57-89: Cannot set item in `dict[str, st
 ERROR discord/app_commands/models.py:372:57-61: Cannot set item in `dict[str, str]` [unsupported-operation]
 ERROR discord/app_commands/models.py:375:40-53: Cannot set item in `dict[str, str]` [unsupported-operation]
 ERROR discord/app_commands/models.py:378:34-74: Cannot set item in `dict[str, str]` [unsupported-operation]
-ERROR discord/app_commands/models.py:539:42-97: Cannot set item in `dict[str, str | ChoiceT]` [unsupported-operation]
+ERROR discord/app_commands/models.py:539:42-97: `dict[str, str]` is not assignable to TypedDict key `name_localizations` with type `str | ChoiceT` [bad-typed-dict-key]
 ERROR discord/app_commands/models.py:926:16-67: Returned type `Guild | Member | None` is not assignable to declared return type `Member | None` [bad-return]
 ERROR discord/app_commands/models.py:1057:31-58: `bool | object` is not assignable to attribute `required` with type `bool` [bad-assignment]
 ERROR discord/app_commands/models.py:1058:55-76: `float | int | object | None` is not assignable to attribute `min_value` with type `float | int | None` [bad-assignment]
@@ -50,10 +50,12 @@ ERROR discord/app_commands/models.py:1168:16-1175:10: Returned type `dict[str, d
 ERROR discord/app_commands/models.py:1235:21-36: `int` is not assignable to TypedDict key `type` with type `Literal[1, 2, 3]` [bad-typed-dict-key]
 ERROR discord/app_commands/transformers.py:110:67-73: Argument `locale_str | str` is not assignable to parameter `string` with type `locale_str` in function `discord.app_commands.translator.Translator._checked_translate` [bad-argument-type]
 ERROR discord/app_commands/transformers.py:115:67-78: Argument `locale_str | str` is not assignable to parameter `string` with type `locale_str` in function `discord.app_commands.translator.Translator._checked_translate` [bad-argument-type]
-ERROR discord/app_commands/transformers.py:139:31-76: Cannot set item in `dict[str, bool | int | str]` [unsupported-operation]
-ERROR discord/app_commands/transformers.py:141:37-74: Cannot set item in `dict[str, bool | int | str]` [unsupported-operation]
-ERROR discord/app_commands/transformers.py:149:29-43: Cannot set item in `dict[str, bool | int | str]` [unsupported-operation]
-ERROR discord/app_commands/transformers.py:151:29-43: Cannot set item in `dict[str, bool | int | str]` [unsupported-operation]
+ERROR discord/app_commands/transformers.py:139:31-76: `list[dict[str, Any]]` is not assignable to TypedDict key `choices` with type `bool | int | str` [bad-typed-dict-key]
+ERROR discord/app_commands/transformers.py:141:37-74: `list[int]` is not assignable to TypedDict key `channel_types` with type `bool | int | str` [bad-typed-dict-key]
+ERROR discord/app_commands/transformers.py:149:29-43: `float | int` is not assignable to TypedDict key `min_length` with type `bool | int | str` [bad-typed-dict-key]
+ERROR discord/app_commands/transformers.py:149:29-43: `float | int` is not assignable to TypedDict key `min_value` with type `bool | int | str` [bad-typed-dict-key]
+ERROR discord/app_commands/transformers.py:151:29-43: `float | int` is not assignable to TypedDict key `max_length` with type `bool | int | str` [bad-typed-dict-key]
+ERROR discord/app_commands/transformers.py:151:29-43: `float | int` is not assignable to TypedDict key `max_value` with type `bool | int | str` [bad-typed-dict-key]
 ERROR discord/app_commands/transformers.py:238:22-26: Expected a type form, got instance of `Self@Transformer` [not-a-type]
 ERROR discord/app_commands/translator.py:119:61-85: Expected a type form, got instance of `Literal['Command[Any, ..., Any]']` [not-a-type]
 ERROR discord/app_commands/translator.py:119:87-100: Expected a type form, got instance of `Literal['ContextMenu']` [not-a-type]
@@ -127,16 +129,10 @@ ERROR discord/channel.py:2082:9-13: Class member `CategoryChannel.type` override
 ERROR discord/channel.py:2478:9-16: Class member `ForumChannel._update` overrides parent class `GuildChannel` in an inconsistent manner [bad-override]
 ERROR discord/channel.py:2512:9-13: Class member `ForumChannel.type` overrides parent class `GuildChannel` in an inconsistent manner [bad-override]
 ERROR discord/channel.py:2619:15-20: Class member `ForumChannel.clone` overrides parent class `GuildChannel` in an inconsistent manner [bad-override]
-ERROR discord/channel.py:2637:46-97: Cannot set item in `dict[str, bool | int | list[dict[str, Any]] | str | None]` [unsupported-operation]
-ERROR discord/channel.py:3036:47-84: Cannot set item in `dict[str, int | str | None]` [unsupported-operation]
-ERROR discord/client.py:723:59-70: Unpacked keyword argument `bool | int | None` is not assignable to parameter `initial` with type `bool` in function `discord.gateway.DiscordWebSocket.from_client` [bad-argument-type]
-ERROR discord/client.py:723:59-70: Unpacked keyword argument `bool | int | None` is not assignable to parameter `gateway` with type `URL | None` in function `discord.gateway.DiscordWebSocket.from_client` [bad-argument-type]
-ERROR discord/client.py:723:59-70: Unpacked keyword argument `bool | int | None` is not assignable to parameter `session` with type `str | None` in function `discord.gateway.DiscordWebSocket.from_client` [bad-argument-type]
-ERROR discord/client.py:723:59-70: Unpacked keyword argument `bool | int | None` is not assignable to parameter `resume` with type `bool` in function `discord.gateway.DiscordWebSocket.from_client` [bad-argument-type]
-ERROR discord/client.py:723:59-70: Unpacked keyword argument `bool | int | None` is not assignable to parameter `encoding` with type `str` in function `discord.gateway.DiscordWebSocket.from_client` [bad-argument-type]
-ERROR discord/client.py:723:59-70: Unpacked keyword argument `bool | int | None` is not assignable to parameter `compress` with type `bool` in function `discord.gateway.DiscordWebSocket.from_client` [bad-argument-type]
+ERROR discord/channel.py:2637:46-97: `dict[str, Any]` is not assignable to TypedDict key `default_reaction_emoji` with type `bool | int | list[dict[str, Any]] | str | None` [bad-typed-dict-key]
+ERROR discord/channel.py:3036:47-84: `list[str]` is not assignable to TypedDict key `applied_tags` with type `int | str | None` [bad-typed-dict-key]
 ERROR discord/client.py:731:33-105: No matching overload found for function `typing.MutableMapping.update` called with arguments: (sequence=int | None, resume=bool, session=str | None) [no-matching-overload]
-ERROR discord/client.py:733:44-59: Cannot set item in `dict[str, bool | int | None]` [unsupported-operation]
+ERROR discord/client.py:733:44-59: `URL` is not assignable to TypedDict key `gateway` with type `bool | int | None` [bad-typed-dict-key]
 ERROR discord/client.py:756:37-762:22: No matching overload found for function `typing.MutableMapping.update` called with arguments: (sequence=int | None, gateway=URL, initial=Literal[False], resume=Literal[True], session=str | None) [no-matching-overload]
 ERROR discord/client.py:782:33-787:18: No matching overload found for function `typing.MutableMapping.update` called with arguments: (sequence=int | None, gateway=URL, resume=Literal[True], session=str | None) [no-matching-overload]
 ERROR discord/client.py:2975:83-99: Argument `int` is not assignable to parameter `owner_type` with type `Literal[1, 2]` in function `discord.http.HTTPClient.create_entitlement` [bad-argument-type]
@@ -158,10 +154,7 @@ ERROR discord/components.py:1324:21-36: `int` is not assignable to TypedDict key
ERROR discord/components.py:1326:27-63: `list[Component]` is not assignable to TypedDict key `components` with type `list[ContainerChildComponent]` [bad-typed-dict-key]
ERROR discord/components.py:1380:21-36: `int` is not assignable to TypedDict key `type` with type `Literal[18]` [bad-typed-dict-key]
ERROR discord/components.py:1444:21-36: `int` is not assignable to TypedDict key `type` with type `Literal[19]` [bad-typed-dict-key]
ERROR discord/embeds.py:246:28-55: `Colour` is not assignable to attribute `_colour` with type `None` [bad-assignment]
ERROR discord/embeds.py:308:9-15: Class member `Embed.__eq__` overrides parent class `object` in an inconsistent manner [bad-override]
ERROR discord/embeds.py:343:28-33: `Colour` is not assignable to attribute `_colour` with type `None` [bad-assignment]
ERROR discord/embeds.py:345:28-47: `Colour` is not assignable to attribute `_colour` with type `None` [bad-assignment]
ERROR discord/embeds.py:362:31-35: `None` is not assignable to attribute `_timestamp` with type `datetime` [bad-assignment]
ERROR discord/emoji.py:119:14-20: Class member `Emoji._state` overrides parent class `AssetMixin` in an inconsistent manner [bad-override]
ERROR discord/emoji.py:166:9-12: Class member `Emoji.url` overrides parent class `AssetMixin` in an inconsistent manner [bad-override]
@@ -283,38 +276,20 @@ ERROR discord/ext/commands/parameters.py:115:9-16: Class member `Parameter.repla
ERROR discord/ext/commands/parameters.py:324:5-15: Class member `Signature.parameters` overrides parent class `Signature` in an inconsistent manner [bad-override]
ERROR discord/ext/commands/view.py:151:53-64: Argument `str | None` is not assignable to parameter `close_quote` with type `str` in function `discord.ext.commands.errors.ExpectedClosingQuoteError.__init__` [bad-argument-type]
ERROR discord/ext/commands/view.py:162:57-68: Argument `str | None` is not assignable to parameter `close_quote` with type `str` in function `discord.ext.commands.errors.ExpectedClosingQuoteError.__init__` [bad-argument-type]
ERROR discord/ext/tasks/__init__.py:212:36-63: `datetime` is not assignable to attribute `_next_iteration` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:214:36-80: `datetime` is not assignable to attribute `_next_iteration` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:222:49-69: Argument `None` is not assignable to parameter `dt` with type `datetime` in function `Loop._try_sleep_until` [bad-argument-type]
ERROR discord/ext/tasks/__init__.py:224:44-64: `None` is not assignable to attribute `_last_iteration` with type `datetime` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:225:44-71: `datetime` is not assignable to attribute `_next_iteration` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:231:56-100: `<=` is not supported between `None` and `datetime` [unsupported-operation]
ERROR discord/ext/tasks/__init__.py:242:53-73: Argument `None` is not assignable to parameter `dt` with type `datetime` in function `Loop._try_sleep_until` [bad-argument-type]
ERROR discord/ext/tasks/__init__.py:243:48-75: `datetime` is not assignable to attribute `_next_iteration` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:266:53-73: Argument `None` is not assignable to parameter `dt` with type `datetime` in function `Loop._try_sleep_until` [bad-argument-type]
ERROR discord/ext/tasks/__init__.py:301:26-29: `T` is not assignable to attribute `_injected` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:500:33-70: `tuple[type[OSError], type[GatewayNotFound], type[ConnectionClosed], type[ClientError], type[TimeoutError], *tuple[type[BaseException], ...]]` is not assignable to attribute `_valid_exception` with type `tuple[type[OSError], type[GatewayNotFound], type[ConnectionClosed], type[ClientError], type[TimeoutError]]` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:509:33-35: `tuple[()]` is not assignable to attribute `_valid_exception` with type `tuple[type[OSError], type[GatewayNotFound], type[ConnectionClosed], type[ClientError], type[TimeoutError]]` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:525:33-95: `tuple[type[ClientError] | type[ConnectionClosed] | type[GatewayNotFound] | type[OSError] | type[TimeoutError], ...]` is not assignable to attribute `_valid_exception` with type `tuple[type[OSError], type[GatewayNotFound], type[ConnectionClosed], type[ClientError], type[TimeoutError]]` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:579:29-33: `FT` is not assignable to attribute `_before_loop` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:607:28-32: `FT` is not assignable to attribute `_after_loop` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:774:36-63: `datetime` is not assignable to attribute `_next_iteration` with type `None` [bad-assignment]
ERROR discord/ext/tasks/__init__.py:777:42-62: Argument `None` is not assignable to parameter `dt` with type `datetime` in function `SleepHandle.recalculate` [bad-argument-type]
ERROR discord/file.py:93:42-44: `(IOBase & PathLike[Any]) | (IOBase & bytes) | (IOBase & str) | BufferedIOBase` is not assignable to attribute `fp` with type `BufferedIOBase` [bad-assignment]
WARN discord/flags.py:234:24-29: Identity comparison `False is False` is always True [unnecessary-comparison]
WARN discord/flags.py:324:24-29: Identity comparison `False is False` is always True [unnecessary-comparison]
ERROR discord/flags.py:1784:9-20: Class member `ArrayFlags._from_value` overrides parent class `BaseFlags` in an inconsistent manner [bad-override]
ERROR discord/flags.py:1881:9-17: Class member `AutoModPresets.to_array` overrides parent class `ArrayFlags` in an inconsistent manner [bad-override]
ERROR discord/gateway.py:137:48-57: Multiple values for argument `name` in function `threading.Thread.__init__` [bad-keyword-argument]
ERROR discord/gateway.py:218:9-11: Class member `VoiceKeepAliveHandler.ws` overrides parent class `KeepAliveHandler` in an inconsistent manner [bad-override]
ERROR discord/gateway.py:466:13-34: Cannot set item in `int` [unsupported-operation]
ERROR discord/gateway.py:466:37-70: Cannot set item in `dict[str, bool | dict[str, str] | int | str | None]` [unsupported-operation]
ERROR discord/gateway.py:470:13-37: Cannot set item in `int` [unsupported-operation]
ERROR discord/gateway.py:470:40-475:14: Cannot set item in `dict[str, bool | dict[str, str] | int | str | None]` [unsupported-operation]
ERROR discord/gateway.py:478:13-36: Cannot set item in `int` [unsupported-operation]
ERROR discord/gateway.py:735:13-34: Cannot set item in `int` [unsupported-operation]
ERROR discord/gateway.py:735:37-42: Cannot set item in `dict[str, bool | int]` [unsupported-operation]
ERROR discord/gateway.py:738:13-37: Cannot set item in `int` [unsupported-operation]
ERROR discord/gateway.py:738:40-48: Cannot set item in `dict[str, bool | int]` [unsupported-operation]
ERROR discord/gateway.py:741:13-34: Cannot set item in `int` [unsupported-operation]
ERROR discord/gateway.py:741:37-42: Cannot set item in `dict[str, bool | int]` [unsupported-operation]
ERROR discord/guild.py:617:53-95: `object | None` is not assignable to attribute `max_stage_video_users` with type `int | None` [bad-assignment]
ERROR discord/guild.py:631:58-97: `object | None` is not assignable to attribute `approximate_presence_count` with type `int | None` [bad-assignment]
@@ -367,24 +342,6 @@ ERROR discord/http.py:404:34-45: Object of class `reify` has no attribute `get`
 ERROR discord/http.py:407:23-34: Object of class `reify` has no attribute `get` [missing-attribute]
-ERROR discord/http.py:411:59-87: Cannot index into `reify` [bad-index]
+ERROR discord/http.py:411:59-87: Cannot index into `reify[CIMultiDictProxy[str]]` [bad-index]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `method` with type `str` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `protocols` with type `Iterable[str]` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `timeout` with type `float` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `receive_timeout` with type `float | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `autoclose` with type `bool` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `autoping` with type `bool` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `heartbeat` with type `float | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `origin` with type `str | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `params` with type `Mapping[str, QueryVariable] | Sequence[tuple[str, QueryVariable]] | str | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `headers` with type `CIMultiDict[str] | CIMultiDictProxy[str] | Iterable[tuple[istr | str, str]] | Mapping[istr, str] | Mapping[str, str] | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `proxy` with type `URL | str | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `ssl` with type `Fingerprint | SSLContext | bool` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `verify_ssl` with type `bool | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `fingerprint` with type `bytes | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `ssl_context` with type `SSLContext | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `proxy_headers` with type `CIMultiDict[str] | CIMultiDictProxy[str] | Iterable[tuple[istr | str, str]] | Mapping[istr, str] | Mapping[str, str] | None` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `compress` with type `int` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
-ERROR discord/http.py:566:53-61: Unpacked keyword argument `BasicAuth | bool | dict[str, str] | float | int | str | None` is not assignable to parameter `max_msg_size` with type `int` in function `aiohttp.client.ClientSession.ws_connect` [bad-argument-type]
 ERROR discord/http.py:640:20-29: Cannot use `Ratelimit` as a context manager [bad-context-manager]
 ERROR discord/http.py:661:40-60: Object of class `reify` has no attribute `get` [missing-attribute]
 ERROR discord/http.py:664:49-92: `in` is not supported between `Literal['X-Ratelimit-Remaining']` and `reify` [not-iterable]
@@ -392,8 +349,8 @@ ERROR discord/http.py:664:49-92: `in` is not supported between `Literal['X-Ratel
 ERROR discord/http.py:707:36-56: Object of class `reify` has no attribute `get` [missing-attribute]
 ERROR discord/http.py:801:44-52: Unpacked keyword argument `str` is not assignable to parameter `allow_redirects` with type `bool` in function `aiohttp.client.ClientSession.get` [bad-argument-type]
 ERROR discord/http.py:1075:31-36: Cannot set item in `dict[str, str]` [unsupported-operation]
-ERROR discord/http.py:1866:41-55: Cannot set item in `dict[str, bool | int]` [unsupported-operation]
-ERROR discord/http.py:1869:48-74: Cannot set item in `dict[str, bool | int]` [unsupported-operation]
+ERROR discord/http.py:1866:41-55: `int | str` is not assignable to TypedDict key `target_user_id` with type `bool | int` [bad-typed-dict-key]
+ERROR discord/http.py:1869:48-74: `str` is not assignable to TypedDict key `target_application_id` with type `bool | int` [bad-typed-dict-key]
 ERROR discord/http.py:2574:46-65: Cannot set item in `dict[str, list[Prompt]]` [unsupported-operation]
 ERROR discord/http.py:2577:34-41: Cannot set item in `dict[str, list[Prompt]]` [unsupported-operation]
 ERROR discord/http.py:2580:31-35: Cannot set item in `dict[str, list[Prompt]]` [unsupported-operation]
@@ -620,4 +577,4 @@ ERROR discord/welcome_screen.py:104:33-48: Object of class `_EmojiTag` has no at
 ERROR discord/welcome_screen.py:211:37-48: Cannot set item in `dict[str, list[Unknown]]` [unsupported-operation]
 ERROR discord/welcome_screen.py:214:33-40: Cannot set item in `dict[str, list[Unknown]]` [unsupported-operation]
 INFO Checking project configured at `<CWD>/pyrefly.toml`
-INFO 617 errors (519 suppressed)
+INFO 572 errors (518 suppressed)
@@ -698,9 +698,9 @@ discord/abc.py:687: error: Incompatible types in assignment (expression has type
 discord/abc.py:694: error: Incompatible return value type (got "Dict[Optional[Role], PermissionOverwrite]", expected "Dict[Union[Role, Member, Object], PermissionOverwrite]") [return-value]
 discord/abc.py:1031: error: Argument 5 to "edit_channel_permissions" of "HTTPClient" has incompatible type "int"; expected "Literal[0, 1]" [arg-type]
 discord/abc.py:1269: error: Unexpected keyword argument "parent_id" for "update" of "TypedDict" [call-arg]
-<TMPDIR>_/venv/lib/python3.8/site-packages/mypy/typeshed/stdlib/typing.pyi:960: note: "update" of "TypedDict" defined here
+venv/lib/python3.8/site-packages/mypy/typeshed/stdlib/typing.pyi:960: note: "update" of "TypedDict" defined here
 discord/abc.py:1269: error: Unexpected keyword argument "lock_permissions" for "update" of "TypedDict" [call-arg]
-<TMPDIR>_/venv/lib/python3.8/site-packages/mypy/typeshed/stdlib/typing.pyi:960: note: "update" of "TypedDict" defined here
+venv/lib/python3.8/site-packages/mypy/typeshed/stdlib/typing.pyi:960: note: "update" of "TypedDict" defined here
 discord/abc.py:1815: error: Incompatible types in assignment (expression has type "reversed[MessagePin]", variable has type "List[MessagePin]") [assignment]
 discord/abc.py:1821: error: Incompatible types in "yield" (actual type "Message", expected type "PinnedMessage") [misc]
 discord/abc.py:2035: error: Incompatible types in assignment (expression has type "Callable[[Arg(int, 'retrieve'), Arg(Optional[Snowflake], 'after'), Arg(Optional[int], 'limit')], Coroutine[Any, Any, Any]]", variable has type "Callable[[Arg(int, 'retrieve'), Arg(Optional[Snowflake], 'around'), Arg(Optional[int], 'limit')], Coroutine[Any, Any, Any]]") [assignment]
@@ -37,8 +37,8 @@ discord/app_commands/commands.py:2026:40: error[invalid-type-arguments] Too many
 discord/app_commands/commands.py:2053:49: error[invalid-type-arguments] Too many type arguments: expected 1, got 3
 discord/app_commands/commands.py:2066:51: error[unresolved-attribute] Object of type `(...) -> Coroutine[Any, Any, Unknown]` has no attribute `__name__`
 discord/app_commands/commands.py:2129:23: error[unresolved-attribute] Object of type `((Interaction[Any], Member, /) -> Coroutine[Any, Any, Any]) | ((Interaction[Any], User, /) -> Coroutine[Any, Any, Any]) | ((Interaction[Any], Message, /) -> Coroutine[Any, Any, Any])` has no attribute `__name__`
-discord/app_commands/commands.py:2477:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__discord_app_commands_checks__` on type `(((...) -> Coroutine[Any, Any, Unknown]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>) | (((Interaction[Any], Member, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>) | (((Interaction[Any], User, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>) | (((Interaction[Any], Message, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>)`
-discord/app_commands/commands.py:2479:13: error[unresolved-attribute] Object of type `(((...) -> Coroutine[Any, Any, Unknown]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu) | (((Interaction[Any], Member, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu) | (((Interaction[Any], User, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu) | (((Interaction[Any], Message, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~ContextMenu)` has no attribute `__discord_app_commands_checks__`
+discord/app_commands/commands.py:2477:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__discord_app_commands_checks__` on type `(((...) -> Coroutine[Any, Any, Unknown]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>) | (((Interaction[Any], Member, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>) | (((Interaction[Any], User, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>) | (((Interaction[Any], Message, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu & ~<Protocol with members '__discord_app_commands_checks__'>)`
+discord/app_commands/commands.py:2479:13: error[unresolved-attribute] Object of type `(((...) -> Coroutine[Any, Any, Unknown]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu) | (((Interaction[Any], Member, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu) | (((Interaction[Any], User, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu) | (((Interaction[Any], Message, /) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~ContextMenu)` has no attribute `__discord_app_commands_checks__`
 discord/app_commands/errors.py:453:95: error[unresolved-attribute] Object of type `(() -> Coroutine[object, Never, object]) | ((...) -> Coroutine[Any, Any, Unknown])` has no attribute `__qualname__`
 discord/app_commands/errors.py:460:88: error[unresolved-attribute] Object of type `((Interaction[Any], Member, /) -> Coroutine[Any, Any, Any]) | ((Interaction[Any], User, /) -> Coroutine[Any, Any, Any]) | ((Interaction[Any], Message, /) -> Coroutine[Any, Any, Any])` has no attribute `__qualname__`
 discord/app_commands/models.py:926:16: error[invalid-return-type] Return type does not match returned value: expected `Member | None`, found `(Guild & ~AlwaysTruthy) | None | Member`
@ -115,7 +115,6 @@ discord/emoji.py:287:26: warning[possibly-missing-attribute] Attribute `http` ma
|
|||
discord/emoji.py:292:54: error[invalid-argument-type] Argument to bound method `__init__` is incorrect: Expected `ConnectionState[Client]`, found `Any | None | ConnectionState[Client]`
|
||||
discord/emoji.py:294:22: warning[possibly-missing-attribute] Attribute `http` may be missing on object of type `Any | None | ConnectionState[Client]`
|
||||
discord/errors.py:30:10: error[unresolved-import] Cannot resolve imported module `requests`
|
||||
discord/ext/commands/bot.py:175:36: error[invalid-type-arguments] Too many type arguments: expected 0, got 1
|
||||
discord/ext/commands/bot.py:177:9: error[invalid-parameter-default] Default value of type `_DefaultRepr` is not assignable to annotated parameter type `HelpCommand | None`
discord/ext/commands/bot.py:296:41: error[invalid-type-arguments] Too many type arguments: expected 1, got 4
discord/ext/commands/bot.py:306:50: error[invalid-type-arguments] Too many type arguments: expected 1, got 4
@@ -123,7 +122,6 @@ discord/ext/commands/bot.py:320:41: error[invalid-type-arguments] Too many type
discord/ext/commands/bot.py:330:50: error[invalid-type-arguments] Too many type arguments: expected 1, got 4
discord/ext/commands/bot.py:655:16: error[unresolved-attribute] Object of type `(...) -> Coroutine[Any, Any, Any]` has no attribute `__name__`
discord/ext/commands/bot.py:681:16: error[unresolved-attribute] Object of type `(...) -> Coroutine[Any, Any, Any]` has no attribute `__name__`
discord/ext/commands/bot.py:1544:40: error[invalid-type-arguments] Too many type arguments: expected 0, got 1
discord/ext/commands/bot.py:1546:13: error[invalid-parameter-default] Default value of type `_DefaultRepr` is not assignable to annotated parameter type `HelpCommand | None`
discord/ext/commands/cog.py:288:36: error[invalid-type-arguments] Type `<special form 'typing.Self'>` is not assignable to upper bound `Cog | None` of type variable `CogT@Command`
discord/ext/commands/cog.py:289:79: error[invalid-type-arguments] Type `<special form 'typing.Self'>` is not assignable to upper bound `Group | Cog` of type variable `GroupT@Command`
@@ -144,29 +142,29 @@ discord/ext/commands/core.py:508:28: error[unresolved-attribute] Object of type
discord/ext/commands/core.py:544:24: error[unresolved-attribute] Object of type `(...) -> Any` has no attribute `__globals__`
discord/ext/commands/core.py:1552:22: error[no-matching-overload] No overload of function `command` matches arguments
discord/ext/commands/core.py:1609:22: error[no-matching-overload] No overload of function `group` matches arguments
discord/ext/commands/core.py:1942:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__commands_checks__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~<Protocol with members '__commands_checks__'>`
discord/ext/commands/core.py:1944:13: error[unresolved-attribute] Object of type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]` has no attribute `__commands_checks__`
discord/ext/commands/core.py:1949:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Any, @Todo, Any] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Any, @Todo, Any] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:1956:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Any, @Todo, Any] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Any, @Todo, Any] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:1942:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__commands_checks__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~<Protocol with members '__commands_checks__'>`
discord/ext/commands/core.py:1944:13: error[unresolved-attribute] Object of type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]` has no attribute `__commands_checks__`
discord/ext/commands/core.py:1949:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Any, (...), Any] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Any, (...), Any] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:1956:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Any, (...), Any] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Any, (...), Any] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2358:32: error[invalid-argument-type] Argument to bound method `append` is incorrect: Expected `(Context[object], /) -> bool | Coroutine[Never, object, bool]`, found `def predicate[BotT](ctx: Context[BotT@predicate]) -> bool`
discord/ext/commands/core.py:2365:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__commands_checks__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~<Protocol with members '__commands_checks__'>`
discord/ext/commands/core.py:2367:13: error[unresolved-attribute] Object of type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]` has no attribute `__commands_checks__`
discord/ext/commands/core.py:2368:13: error[invalid-assignment] Object of type `Literal[True]` is not assignable to attribute `__discord_app_commands_guild_only__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]`
discord/ext/commands/core.py:2373:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2380:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2365:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__commands_checks__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~<Protocol with members '__commands_checks__'>`
discord/ext/commands/core.py:2367:13: error[unresolved-attribute] Object of type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]` has no attribute `__commands_checks__`
discord/ext/commands/core.py:2368:13: error[invalid-assignment] Object of type `Literal[True]` is not assignable to attribute `__discord_app_commands_guild_only__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]`
discord/ext/commands/core.py:2373:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2380:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2433:32: error[invalid-argument-type] Argument to bound method `append` is incorrect: Expected `(Context[object], /) -> bool | Coroutine[Never, object, bool]`, found `def predicate[BotT](ctx: Context[BotT@predicate]) -> bool`
discord/ext/commands/core.py:2440:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__commands_checks__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]] & ~<Protocol with members '__commands_checks__'>`
discord/ext/commands/core.py:2442:13: error[unresolved-attribute] Object of type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]` has no attribute `__commands_checks__`
discord/ext/commands/core.py:2443:13: error[invalid-assignment] Object of type `Literal[True]` is not assignable to attribute `__discord_app_commands_is_nsfw__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]`
discord/ext/commands/core.py:2448:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2455:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, Unknown, Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2499:13: error[invalid-assignment] Object of type `CooldownMapping[Unknown]` is not assignable to attribute `__commands_cooldown__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]`
discord/ext/commands/core.py:2547:13: error[invalid-assignment] Object of type `DynamicCooldownMapping[Unknown]` is not assignable to attribute `__commands_cooldown__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]`
discord/ext/commands/core.py:2582:13: error[invalid-assignment] Object of type `MaxConcurrency` is not assignable to attribute `__commands_max_concurrency__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]`
discord/ext/commands/core.py:2440:17: error[invalid-assignment] Object of type `list[Unknown]` is not assignable to attribute `__commands_checks__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]] & ~<Protocol with members '__commands_checks__'>`
discord/ext/commands/core.py:2442:13: error[unresolved-attribute] Object of type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]` has no attribute `__commands_checks__`
discord/ext/commands/core.py:2443:13: error[invalid-assignment] Object of type `Literal[True]` is not assignable to attribute `__discord_app_commands_is_nsfw__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]`
discord/ext/commands/core.py:2448:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2455:9: error[unresolved-attribute] Unresolved attribute `predicate` on type `def decorator(func: Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])) -> Command[Unknown, (...), Unknown] | ((...) -> Coroutine[Any, Any, Any])`.
discord/ext/commands/core.py:2499:13: error[invalid-assignment] Object of type `CooldownMapping[Unknown]` is not assignable to attribute `__commands_cooldown__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]`
discord/ext/commands/core.py:2547:13: error[invalid-assignment] Object of type `DynamicCooldownMapping[Unknown]` is not assignable to attribute `__commands_cooldown__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]`
discord/ext/commands/core.py:2582:13: error[invalid-assignment] Object of type `MaxConcurrency` is not assignable to attribute `__commands_max_concurrency__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]`
discord/ext/commands/core.py:2632:32: error[invalid-argument-type] Argument to bound method `before_invoke` is incorrect: Expected `((object, Unknown, /) -> Coroutine[Never, object, Never]) | ((Unknown, /) -> Coroutine[Never, object, Never])`, found `((CogT@before_invoke, ContextT@before_invoke, /) -> Coroutine[Any, Any, Any]) | ((ContextT@before_invoke, /) -> Coroutine[Any, Any, Any])`
discord/ext/commands/core.py:2634:13: error[invalid-assignment] Object of type `((CogT@before_invoke, ContextT@before_invoke, /) -> Coroutine[Any, Any, Any]) | ((ContextT@before_invoke, /) -> Coroutine[Any, Any, Any])` is not assignable to attribute `__before_invoke__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]`
discord/ext/commands/core.py:2634:13: error[invalid-assignment] Object of type `((CogT@before_invoke, ContextT@before_invoke, /) -> Coroutine[Any, Any, Any]) | ((ContextT@before_invoke, /) -> Coroutine[Any, Any, Any])` is not assignable to attribute `__before_invoke__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]`
discord/ext/commands/core.py:2655:31: error[invalid-argument-type] Argument to bound method `after_invoke` is incorrect: Expected `((object, Unknown, /) -> Coroutine[Never, object, Never]) | ((Unknown, /) -> Coroutine[Never, object, Never])`, found `((CogT@after_invoke, ContextT@after_invoke, /) -> Coroutine[Any, Any, Any]) | ((ContextT@after_invoke, /) -> Coroutine[Any, Any, Any])`
discord/ext/commands/core.py:2657:13: error[invalid-assignment] Object of type `((CogT@after_invoke, ContextT@after_invoke, /) -> Coroutine[Any, Any, Any]) | ((ContextT@after_invoke, /) -> Coroutine[Any, Any, Any])` is not assignable to attribute `__after_invoke__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, object, Unknown]]`
discord/ext/commands/core.py:2657:13: error[invalid-assignment] Object of type `((CogT@after_invoke, ContextT@after_invoke, /) -> Coroutine[Any, Any, Any]) | ((ContextT@after_invoke, /) -> Coroutine[Any, Any, Any])` is not assignable to attribute `__after_invoke__` on type `((...) -> Coroutine[Any, Any, Any]) & ~Top[Command[Unknown, (), Unknown]]`
discord/ext/commands/help.py:309:9: error[invalid-assignment] Implicit shadowing of function `get_commands`
discord/ext/commands/help.py:310:9: error[invalid-assignment] Implicit shadowing of function `walk_commands`
discord/ext/commands/help.py:1255:9: error[invalid-method-override] Invalid override of method `get_destination`: Definition is incompatible with `HelpCommand.get_destination`
@@ -177,6 +175,7 @@ discord/ext/commands/hybrid.py:176:49: error[unresolved-attribute] Object of typ
discord/ext/commands/hybrid.py:232:13: error[unresolved-attribute] Unresolved attribute `__hybrid_command_flag__` on type `(...) -> Any`.
discord/ext/commands/hybrid.py:328:9: error[unresolved-attribute] Unresolved attribute `__signature__` on type `(...) -> Coroutine[Any, Any, T@HybridAppCommand]`.
discord/ext/commands/hybrid.py:338:17: error[unresolved-attribute] Object of type `(...) -> Coroutine[Any, Any, T@HybridAppCommand]` has no attribute `__signature__`
discord/ext/commands/hybrid.py:430:45: error[invalid-argument-type] Argument to function `maybe_coroutine` is incorrect: Expected `Interaction[ClientT@interaction_check]`, found `Interaction[Client]`
discord/ext/commands/hybrid.py:512:37: error[invalid-type-arguments] Too many type arguments: expected 1, got 4
discord/ext/commands/hybrid.py:563:9: error[invalid-method-override] Invalid override of method `_ensure_assignment_on_copy`: Definition is incompatible with `Command._ensure_assignment_on_copy`
discord/ext/commands/hybrid.py:731:9: error[invalid-method-override] Invalid override of method `_ensure_assignment_on_copy`: Definition is incompatible with `Command._ensure_assignment_on_copy`
@@ -207,6 +206,8 @@ discord/file.py:160:9: error[invalid-assignment] Implicit shadowing of function
discord/flags.py:1784:9: error[invalid-method-override] Invalid override of method `_from_value`: Definition is incompatible with `BaseFlags._from_value`
discord/flags.py:1881:9: error[invalid-method-override] Invalid override of method `to_array`: Definition is incompatible with `ArrayFlags.to_array`
discord/gateway.py:137:48: error[parameter-already-assigned] Multiple values provided for parameter `name` of bound method `__init__`
discord/gateway.py:402:13: error[invalid-assignment] Implicit shadowing of function `send`
discord/gateway.py:403:13: error[invalid-assignment] Implicit shadowing of function `log_receive`
discord/gateway.py:466:13: error[invalid-assignment] Cannot assign to a subscript on an object of type `int`
discord/gateway.py:470:13: error[invalid-assignment] Cannot assign to a subscript on an object of type `int`
discord/gateway.py:478:13: error[invalid-assignment] Cannot assign to a subscript on an object of type `int`
@@ -275,7 +276,7 @@ discord/soundboard.py:301:22: warning[possibly-missing-attribute] Attribute `htt
discord/soundboard.py:302:50: error[invalid-argument-type] Argument to bound method `__init__` is incorrect: Expected `ConnectionState[Client]`, found `Any | None | ConnectionState[Client]`
discord/soundboard.py:325:15: warning[possibly-missing-attribute] Attribute `http` may be missing on object of type `Any | None | ConnectionState[Client]`
discord/state.py:263:13: error[invalid-assignment] Implicit shadowing of function `store_user`
discord/state.py:551:16: error[invalid-return-type] Return type does not match returned value: expected `tuple[VoiceChannel | StageChannel | ForumChannel | ... omitted 5 union elements, Guild | None]`, found `tuple[(Unknown & ~AlwaysFalsy) | (Guild & ~AlwaysTruthy & ~AlwaysFalsy) | (VoiceChannel & ~AlwaysFalsy) | ... omitted 6 union elements, Guild | None]`
discord/state.py:551:16: error[invalid-return-type] Return type does not match returned value: expected `tuple[VoiceChannel | StageChannel | ForumChannel | ... omitted 5 union elements, Guild | None]`, found `tuple[(DMChannel & ~AlwaysFalsy) | (Guild & ~AlwaysTruthy & ~AlwaysFalsy) | (VoiceChannel & ~AlwaysFalsy) | ... omitted 6 union elements, Guild | None]`
discord/state.py:822:31: error[invalid-key] Unknown key "data" for TypedDict `PingInteraction`: Unknown key "data"
discord/state.py:823:36: error[invalid-key] Unknown key "custom_id" for TypedDict `ChatInputApplicationCommandInteractionData`: Unknown key "custom_id"
discord/state.py:823:36: error[invalid-key] Unknown key "custom_id" for TypedDict `UserApplicationCommandInteractionData`: Unknown key "custom_id"
@@ -403,4 +404,4 @@ discord/webhook/sync.py:522:9: error[invalid-method-override] Invalid override o
discord/webhook/sync.py:652:16: error[unresolved-import] Cannot resolve imported module `requests`
discord/webhook/sync.py:695:16: error[unresolved-import] Cannot resolve imported module `requests`
discord/welcome_screen.py:104:33: warning[possibly-missing-attribute] Attribute `name` may be missing on object of type `(Unknown & _EmojiTag) | PartialEmoji | Emoji | (str & _EmojiTag)`
Found 405 diagnostics
Found 406 diagnostics
@@ -32036,8 +32036,6 @@
Attribute "pid" is unknown (reportAttributeAccessIssue)
<CWD>/homeassistant/components/systemmonitor/coordinator.py:253:33 - error: Cannot access attribute "name" for class "Error"
Attribute "name" is unknown (reportAttributeAccessIssue)
<CWD>/homeassistant/components/systemmonitor/coordinator.py:266:38 - error: "sensors_temperatures" is not a known attribute of module "psutil" (reportAttributeAccessIssue)
<CWD>/homeassistant/components/systemmonitor/coordinator.py:274:44 - error: "sensors_fans" is not a known attribute of module "psutil" (reportAttributeAccessIssue)
<CWD>/homeassistant/components/systemmonitor/sensor.py
<CWD>/homeassistant/components/systemmonitor/sensor.py:643:14 - error: "entity_description" overrides symbol of same name in class "Entity"
Variable is mutable so its type is invariant
@@ -42035,4 +42033,4 @@
<CWD>/homeassistant/util/loop.py:164:16 - error: "integration_frame" is possibly unbound (reportPossiblyUnboundVariable)
<CWD>/homeassistant/util/loop.py:164:60 - error: "integration_frame" is possibly unbound (reportPossiblyUnboundVariable)
<CWD>/homeassistant/util/loop.py:165:17 - error: "integration_frame" is possibly unbound (reportPossiblyUnboundVariable)
19459 errors, 18 warnings, 0 informations
19457 errors, 18 warnings, 0 informations
@@ -53,6 +53,6 @@ homeassistant/runner.py:284: error: function asyncio.events.set_event_loop_polic
homeassistant/scripts/auth.py:51: error: function asyncio.events.set_event_loop_policy is deprecated: Deprecated since Python 3.14; will be removed in Python 3.16. [deprecated]
homeassistant/scripts/__init__.py:64: error: function asyncio.events.set_event_loop_policy is deprecated: Deprecated since Python 3.14; will be removed in Python 3.16. [deprecated]
Found 54 errors in 13 files (checked 8626 source files)
/Users/micha/.local/share/uv/python/cpython-3.14.0-macos-aarch64-none/lib/python3.14/importlib/__init__.py:88: UserWarning: Core Pydantic V1 functionality isn't compatible with Python 3.14 or greater.
/home/micha/.local/share/uv/python/cpython-3.14.2-linux-x86_64-gnu/lib/python3.14/importlib/__init__.py:88: UserWarning: Core Pydantic V1 functionality isn't compatible with Python 3.14 or greater.
return _bootstrap._gcd_import(name[level:], package, level)
Warning: unused section(s) in mypy.ini: [mypy-tests.*]