
ENH: Consider warnings as errors when running pytest #40

Merged
merged 2 commits into nipreps:main from ConsiderWarningsAsErrorsInCI on Jan 15, 2025

Conversation

jhlegarreta
Contributor

Consider warnings as errors when running pytest:

  • Add the warnings-as-errors flag to pytest to consider warnings as errors.
  • Use an environment variable NIFREEZE_WERRORS in GHA workflow files to mark the corresponding workflows as treating warnings as errors.
  • Process the pytest session end to mark it as failed if warnings were present (a sketch of this step follows the list).

Implementation drawn from DIPY.
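
For illustration, here is a minimal conftest.py sketch of the approach described above, assuming the NIFREEZE_WERRORS toggle and a DIPY-style session hook; this is a hedged approximation, not the actual nifreeze code:

```python
# conftest.py -- hedged sketch, not the actual nifreeze implementation.
import os

# Escalate warnings only when the CI workflow opts in through the environment.
WERRORS = os.environ.get("NIFREEZE_WERRORS", "0").lower() in ("1", "true", "yes")


def pytest_sessionfinish(session, exitstatus):
    """Mark the session as failed if any warning was captured in werrors mode."""
    if not WERRORS:
        return
    reporter = session.config.pluginmanager.get_plugin("terminalreporter")
    if reporter is not None and reporter.stats.get("warnings"):
        # A non-zero exit status makes the pytest run (and the CI job) fail.
        session.exitstatus = 1
```

Setting session.exitstatus inside pytest_sessionfinish changes the status pytest ultimately returns, so a run that only produced warnings can still fail the CI job.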

Filter two warnings that are being raised by the current tests (a filter configuration sketch follows the list):

  • Filter warnings related to GPR parameter optimal values being close to the boundary values raised by scikit-learn.
  • Filter warnings related to the b0 threshold value being updated in gradient table slicing raised by DIPY.
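
Continuing the same hypothetical conftest.py, one possible way to register the escalation plus the two filters; the message patterns and warning categories below are illustrative guesses, not the project's actual filters:

```python
# Continuation of the conftest.py sketch above (WERRORS defined there).
def pytest_configure(config):
    if WERRORS:
        # Promote every warning to an error...
        config.addinivalue_line("filterwarnings", "error")
        # ...except the two known, benign warnings mentioned above.
        # scikit-learn: GPR hyperparameter optimum close to a search-space bound.
        config.addinivalue_line(
            "filterwarnings",
            "ignore:.*close to the specified.*bound.*:sklearn.exceptions.ConvergenceWarning",
        )
        # DIPY: b0 threshold updated when slicing a gradient table.
        config.addinivalue_line(
            "filterwarnings",
            "ignore:.*b0_threshold.*:UserWarning",
        )
```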


@jhlegarreta force-pushed the ConsiderWarningsAsErrorsInCI branch 3 times, most recently from 9ba5324 to 57398b0, on December 22, 2024 at 23:18

codecov bot commented Dec 22, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 65.74%. Comparing base (ffa64fe) to head (7c3b24e).
Report is 9 commits behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main      #40   +/-   ##
=======================================
  Coverage   65.74%   65.74%           
=======================================
  Files          19       19           
  Lines         940      940           
  Branches      120      120           
=======================================
  Hits          618      618           
  Misses        278      278           
  Partials       44       44           

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.


jhlegarreta commented Dec 22, 2024

Warnings are being filtered, but the test session is not being marked as failed:

[Screenshot: nifreeze_ci_filtered_not_failed]

Marking this as draft.

Working now: NIFREEZE_WERRORS had to be added to tox's pass_env list so that it reaches the test environment:

[Screenshot: nifreeze_warnings_as_errors]

PR #39 solves the warning at issue. Marking this as ready.

@jhlegarreta marked this pull request as draft on December 22, 2024 at 23:31
@oesteban merged commit 548a744 into nipreps:main on Jan 15, 2025
8 checks passed
@jhlegarreta deleted the ConsiderWarningsAsErrorsInCI branch on January 15, 2025 at 19:29