mirror of https://github.com/astral-sh/ruff
Update pre-commit dependencies (#17073)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [abravalheri/validate-pyproject](https://redirect.github.com/abravalheri/validate-pyproject) | repository | patch | `v0.24` -> `v0.24.1` |
| [astral-sh/ruff-pre-commit](https://redirect.github.com/astral-sh/ruff-pre-commit) | repository | patch | `v0.11.0` -> `v0.11.2` |
| [crate-ci/typos](https://redirect.github.com/crate-ci/typos) | repository | minor | `v1.30.2` -> `v1.31.0` |
| [python-jsonschema/check-jsonschema](https://redirect.github.com/python-jsonschema/check-jsonschema) | repository | minor | `0.31.3` -> `0.32.1` |
| [woodruffw/zizmor-pre-commit](https://redirect.github.com/woodruffw/zizmor-pre-commit) | repository | patch | `v1.5.1` -> `v1.5.2` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

Note: The `pre-commit` manager in Renovate is not supported by the `pre-commit` maintainers or community. Please do not report any problems there, instead [create a Discussion in the Renovate repository](https://redirect.github.com/renovatebot/renovate/discussions/new) if you have any questions.

---

### Release Notes

<details>
<summary>abravalheri/validate-pyproject (abravalheri/validate-pyproject)</summary>

### [`v0.24.1`](https://redirect.github.com/abravalheri/validate-pyproject/releases/tag/v0.24.1)

[Compare Source](https://redirect.github.com/abravalheri/validate-pyproject/compare/v0.24...v0.24.1)

#### What's Changed

- Fixed multi plugin id was read from the wrong place by [@henryiii](https://redirect.github.com/henryiii) in [https://github.com/abravalheri/validate-pyproject/pull/240](https://redirect.github.com/abravalheri/validate-pyproject/pull/240)
- Implemented alternative plugin sorting, [https://github.com/abravalheri/validate-pyproject/pull/243](https://redirect.github.com/abravalheri/validate-pyproject/pull/243)

**Full Changelog**: https://github.com/abravalheri/validate-pyproject/compare/v0.24...v0.24.1

</details>

<details>
<summary>astral-sh/ruff-pre-commit (astral-sh/ruff-pre-commit)</summary>

### [`v0.11.2`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.11.2)

[Compare Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.11.1...v0.11.2)

See: https://github.com/astral-sh/ruff/releases/tag/0.11.2

### [`v0.11.1`](https://redirect.github.com/astral-sh/ruff-pre-commit/releases/tag/v0.11.1)

[Compare Source](https://redirect.github.com/astral-sh/ruff-pre-commit/compare/v0.11.0...v0.11.1)

See: https://github.com/astral-sh/ruff/releases/tag/0.11.1

</details>

<details>
<summary>crate-ci/typos (crate-ci/typos)</summary>

### [`v1.31.0`](https://redirect.github.com/crate-ci/typos/releases/tag/v1.31.0)

[Compare Source](https://redirect.github.com/crate-ci/typos/compare/v1.30.3...v1.31.0)

#### [1.31.0] - 2025-03-28

##### Features

- Updated the dictionary with the [March 2025](https://redirect.github.com/crate-ci/typos/issues/1266) changes

### [`v1.30.3`](https://redirect.github.com/crate-ci/typos/releases/tag/v1.30.3)

[Compare Source](https://redirect.github.com/crate-ci/typos/compare/v1.30.2...v1.30.3)

#### [1.30.3] - 2025-03-24

##### Features

- Support detecting `go.work` and `go.work.sum` files

</details>

<details>
<summary>python-jsonschema/check-jsonschema (python-jsonschema/check-jsonschema)</summary>

### [`v0.32.1`](https://redirect.github.com/python-jsonschema/check-jsonschema/blob/HEAD/CHANGELOG.rst#0321)

[Compare Source](https://redirect.github.com/python-jsonschema/check-jsonschema/compare/0.32.0...0.32.1)

- Fix the `check-meltano` hook to use `types_or`. Thanks :user:`edgarrmondragon`! (:pr:`543`)

### [`v0.32.0`](https://redirect.github.com/python-jsonschema/check-jsonschema/blob/HEAD/CHANGELOG.rst#0320)

[Compare Source](https://redirect.github.com/python-jsonschema/check-jsonschema/compare/0.31.3...0.32.0)

- Update vendored schemas: circle-ci, compose-spec, dependabot, github-workflows, gitlab-ci, mergify, renovate, taskfile (2025-03-25)
- Add Meltano schema and pre-commit hook. Thanks :user:`edgarrmondragon`! (:issue:`540`)
- Add Snapcraft schema and pre-commit hook. Thanks :user:`fabolhak`! (:issue:`535`)

</details>

<details>
<summary>woodruffw/zizmor-pre-commit (woodruffw/zizmor-pre-commit)</summary>

### [`v1.5.2`](https://redirect.github.com/woodruffw/zizmor-pre-commit/releases/tag/v1.5.2)

[Compare Source](https://redirect.github.com/woodruffw/zizmor-pre-commit/compare/v1.5.1...v1.5.2)

See: https://github.com/woodruffw/zizmor/releases/tag/v1.5.2

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "before 4am on Monday" (UTC), Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

👻 **Immortal**: This PR will be recreated if closed unmerged. Get [config help](https://redirect.github.com/renovatebot/renovate/discussions) if that's undesired.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/). View the [repository job log](https://developer.mend.io/github/astral-sh/ruff).

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOS4yMDcuMSIsInVwZGF0ZWRJblZlciI6IjM5LjIwNy4xIiwidGFyZ2V0QnJhbmNoIjoibWFpbiIsImxhYmVscyI6WyJpbnRlcm5hbCJdfQ==-->

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Micha Reiser <micha@reiser.io>
This commit is contained in:

parent 5c1fab0661
commit a192d96880
@@ -19,7 +19,7 @@ exclude: |
 
 repos:
   - repo: https://github.com/abravalheri/validate-pyproject
-    rev: v0.24
+    rev: v0.24.1
     hooks:
       - id: validate-pyproject
 
@@ -60,7 +60,7 @@ repos:
           - black==25.1.0
 
   - repo: https://github.com/crate-ci/typos
-    rev: v1.30.2
+    rev: v1.31.0
     hooks:
       - id: typos
 
@@ -74,7 +74,7 @@ repos:
         pass_filenames: false # This makes it a lot faster
 
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.11.0
+    rev: v0.11.2
     hooks:
       - id: ruff-format
       - id: ruff
@@ -92,12 +92,12 @@ repos:
   # zizmor detects security vulnerabilities in GitHub Actions workflows.
   # Additional configuration for the tool is found in `.github/zizmor.yml`
   - repo: https://github.com/woodruffw/zizmor-pre-commit
-    rev: v1.5.1
+    rev: v1.5.2
     hooks:
       - id: zizmor
 
   - repo: https://github.com/python-jsonschema/check-jsonschema
-    rev: 0.31.3
+    rev: 0.32.1
     hooks:
       - id: check-github-workflows
 
@@ -43,7 +43,7 @@ use ruff_python_ast::PythonVersion;
     /// resolved extra search path of `["b", "c", "a"]`, which means `a` will be tried last.
     ///
     /// There's an argument here that the user should be able to specify the order of the paths,
-    /// because only then is the user in full control of where to insert the path when specyifing `extra-paths`
+    /// because only then is the user in full control of where to insert the path when specifying `extra-paths`
     /// in multiple sources.
     ///
     /// ## Macro
@@ -126,7 +126,7 @@ fn definition_expression_type<'db>(
     }
 }
 
-/// The descriptor protocol distiguishes two kinds of descriptors. Non-data descriptors
+/// The descriptor protocol distinguishes two kinds of descriptors. Non-data descriptors
 /// define a `__get__` method, while data descriptors additionally define a `__set__`
 /// method or a `__delete__` method. This enum is used to categorize attributes into two
 /// groups: (1) data descriptors and (2) normal attributes or non-data descriptors.
@@ -118,7 +118,7 @@ impl NotebookDocument {
         // This is required because of the way the `NotebookCell` is modelled. We include
         // the `TextDocument` within the `NotebookCell` so when it's deleted, the
         // corresponding `TextDocument` is removed as well. But, when cells are
-        // re-oredered, the change request doesn't provide the actual contents of the cell.
+        // re-ordered, the change request doesn't provide the actual contents of the cell.
         // Instead, it only provides that (a) these cell URIs were removed, and (b) these
         // cell URIs were added.
         // https://github.com/astral-sh/ruff/issues/12573
@@ -26,7 +26,7 @@ pub(crate) fn add_noqa(
     let start = Instant::now();
     let (paths, resolver) = python_files_in_path(files, pyproject_config, config_arguments)?;
     let duration = start.elapsed();
-    debug!("Identified files to lint in: {:?}", duration);
+    debug!("Identified files to lint in: {duration:?}");
 
     if paths.is_empty() {
         warn_user_once!("No Python files found under the given path(s)");
@@ -87,7 +87,7 @@ pub(crate) fn add_noqa(
         .sum();
 
     let duration = start.elapsed();
-    debug!("Added noqa to files in: {:?}", duration);
+    debug!("Added noqa to files in: {duration:?}");
 
     Ok(modifications)
 }
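The `debug!` changes in these two hunks (and in several hunks further down) replace positional `{:?}` placeholders with Rust's inline format arguments, where a local identifier is captured directly inside the format string. A minimal, self-contained sketch of the two equivalent forms, using `println!` instead of the `log`/`debug!` macro that the diff uses:

```rust
use std::time::{Duration, Instant};

fn main() {
    let start = Instant::now();
    std::thread::sleep(Duration::from_millis(5));
    let duration = start.elapsed();

    // Positional argument, as in the removed lines:
    println!("Identified files to lint in: {:?}", duration);

    // Inline (captured) argument, as in the added lines; the identifier is
    // resolved from the surrounding scope, so the trailing argument goes away.
    println!("Identified files to lint in: {duration:?}");
}
```

Both forms print the same output; the inline form is simply shorter and keeps the variable next to its placeholder.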
@@ -174,7 +174,7 @@ pub(crate) fn check(
     caches.persist()?;
 
     let duration = start.elapsed();
-    debug!("Checked {:?} files in: {:?}", checked_files, duration);
+    debug!("Checked {checked_files:?} files in: {duration:?}");
 
     Ok(all_diagnostics)
 }
@@ -290,8 +290,8 @@ pub fn check(args: CheckCommand, global_options: GlobalConfigArgs) -> Result<Exi
     // - If `--fix` or `--fix-only` is set, apply applicable fixes to the filesystem (or
     //   print them to stdout, if we're reading from stdin).
     // - If `--diff` or `--fix-only` are set, don't print any violations (only applicable fixes)
-    // - By default, applicable fixes only include [`Applicablility::Automatic`], but if
-    //   `--unsafe-fixes` is set, then [`Applicablility::Suggested`] fixes are included.
+    // - By default, applicable fixes only include [`Applicability::Automatic`], but if
+    //   `--unsafe-fixes` is set, then [`Applicability::Suggested`] fixes are included.
 
     let fix_mode = if cli.diff {
         FixMode::Diff
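The comment block in the hunk above summarizes how the CLI flags select a fix mode. The sketch below is a hypothetical, heavily simplified rendering of that decision logic; `Cli`, its fields, and this `FixMode` enum are illustrative stand-ins, not ruff's actual `CheckCommand` or `FixMode` definitions:

```rust
#[derive(Debug)]
enum FixMode {
    Diff,     // print applicable fixes as a diff instead of applying them
    Apply,    // write fixes to the filesystem (or stdout when reading from stdin)
    Generate, // compute fixes but only report violations
}

// Simplified stand-in for the parsed CLI arguments.
struct Cli {
    diff: bool,
    fix: bool,
    fix_only: bool,
}

fn fix_mode(cli: &Cli) -> FixMode {
    if cli.diff {
        FixMode::Diff
    } else if cli.fix || cli.fix_only {
        FixMode::Apply
    } else {
        FixMode::Generate
    }
}

fn main() {
    let cli = Cli { diff: false, fix: true, fix_only: false };
    println!("selected mode: {:?}", fix_mode(&cli));
}
```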
@@ -474,7 +474,7 @@ impl DisplaySet<'_> {
         // 3 │ X0 Y0 Z0
         // │ ┏━━━━━┛ │ │ < We are writing these lines
         // │ ┃┌───────┘ │ < by reverting the "depth" of
-        // │ ┃│┌─────────┘ < their multilne spans.
+        // │ ┃│┌─────────┘ < their multiline spans.
         // 4 │ ┃││ X1 Y1 Z1
         // 5 │ ┃││ X2 Y2 Z2
         // │ ┃│└────╿──│──┘ `Z` label
@@ -146,10 +146,7 @@ pub(crate) fn categorize<'a>(
         reason = Reason::DisabledSection(import_type);
         import_type = default_section;
     }
-    debug!(
-        "Categorized '{}' as {:?} ({:?})",
-        module_name, import_type, reason
-    );
+    debug!("Categorized '{module_name}' as {import_type:?} ({reason:?})");
     import_type
 }
 
@@ -505,7 +505,7 @@ impl InOrderEntry {
 
 #[derive(Clone, Debug)]
 struct OutOfOrderEntry {
-    /// Index into the [`MultiMap::out_of_order`] vector at which offset the leaading vec is stored.
+    /// Index into the [`MultiMap::out_of_order`] vector at which offset the leading vec is stored.
     leading_index: usize,
     _count: Count<OutOfOrderEntry>,
 }
@@ -71,7 +71,7 @@ script for more information or use the `--help` flag to see the available option
 #### CI
 
 The fuzzer is run as part of the CI pipeline. The purpose of running the fuzzer in the CI is to
-catch any regresssions introduced by any new changes to the parser. This is why the fuzzer is run on
+catch any regressions introduced by any new changes to the parser. This is why the fuzzer is run on
 the same set of seeds on every run.
 
 ## Benchmarks
@@ -625,7 +625,7 @@ mod tests {
 
         let result = resolve_options(file2, "file1", root, ResolverOptions::default());
 
-        debug!("result: {:?}", result);
+        debug!("result: {result:?}");
 
         assert!(!result.is_import_found);
 
@@ -392,7 +392,7 @@ fn resolve_best_absolute_import<Host: host::Host>(
 
     if allow_pyi && !module_descriptor.name_parts.is_empty() {
         // Check for a stdlib typeshed file.
-        debug!("Looking for typeshed stdlib path: {}", import_name);
+        debug!("Looking for typeshed stdlib path: {import_name}");
         if let Some(mut typeshed_stdilib_import) =
             find_typeshed_path(module_descriptor, true, config, host)
         {
@@ -401,7 +401,7 @@ fn resolve_best_absolute_import<Host: host::Host>(
         }
 
         // Check for a third-party typeshed file.
-        debug!("Looking for typeshed third-party path: {}", import_name);
+        debug!("Looking for typeshed third-party path: {import_name}");
         if let Some(mut typeshed_third_party_import) =
             find_typeshed_path(module_descriptor, false, config, host)
         {
@@ -118,7 +118,7 @@ impl NotebookDocument {
         // This is required because of the way the `NotebookCell` is modelled. We include
         // the `TextDocument` within the `NotebookCell` so when it's deleted, the
         // corresponding `TextDocument` is removed as well. But, when cells are
-        // re-oredered, the change request doesn't provide the actual contents of the cell.
+        // re-ordered, the change request doesn't provide the actual contents of the cell.
         // Instead, it only provides that (a) these cell URIs were removed, and (b) these
        // cell URIs were added.
         // https://github.com/astral-sh/ruff/issues/12573
@@ -209,7 +209,7 @@ fn get_fallback_target_version(dir: &Path) -> Option<PythonVersion> {
     let pyproject = match parsed_pyproject {
         Ok(pyproject) => pyproject,
         Err(err) => {
-            debug!("Failed to find fallback `target-version` due to: {}", err);
+            debug!("Failed to find fallback `target-version` due to: {err}");
             return None;
         }
     };
@@ -578,18 +578,18 @@ impl ParallelVisitor for PythonFilesVisitor<'_, '_> {
                     &file_basename,
                     &settings.file_resolver.exclude,
                 ) {
-                    debug!("Ignored path via `exclude`: {:?}", path);
+                    debug!("Ignored path via `exclude`: {path:?}");
                     return WalkState::Skip;
                 } else if match_candidate_exclusion(
                     &file_path,
                     &file_basename,
                     &settings.file_resolver.extend_exclude,
                 ) {
-                    debug!("Ignored path via `extend-exclude`: {:?}", path);
+                    debug!("Ignored path via `extend-exclude`: {path:?}");
                     return WalkState::Skip;
                 }
             } else {
-                debug!("Ignored path due to error in parsing: {:?}", path);
+                debug!("Ignored path due to error in parsing: {path:?}");
                 return WalkState::Skip;
             }
         }
@@ -641,10 +641,10 @@ impl ParallelVisitor for PythonFilesVisitor<'_, '_> {
             let resolver = self.global.resolver.read().unwrap();
             let settings = resolver.resolve(path);
             if settings.file_resolver.include.is_match(path) {
-                debug!("Included path via `include`: {:?}", path);
+                debug!("Included path via `include`: {path:?}");
                 Some(ResolvedFile::Nested(entry.into_path()))
             } else if settings.file_resolver.extend_include.is_match(path) {
-                debug!("Included path via `extend-include`: {:?}", path);
+                debug!("Included path via `extend-include`: {path:?}");
                 Some(ResolvedFile::Nested(entry.into_path()))
             } else {
                 None
@@ -765,14 +765,14 @@ fn is_file_excluded(path: &Path, resolver: &Resolver) -> bool {
             &file_basename,
             &settings.file_resolver.exclude,
         ) {
-            debug!("Ignored path via `exclude`: {:?}", path);
+            debug!("Ignored path via `exclude`: {path:?}");
             return true;
         } else if match_candidate_exclusion(
             &file_path,
             &file_basename,
             &settings.file_resolver.extend_exclude,
         ) {
-            debug!("Ignored path via `extend-exclude`: {:?}", path);
+            debug!("Ignored path via `extend-exclude`: {path:?}");
             return true;
         }
     } else {