
Only raise warning when not enough data for bootstrapping #252

Closed

Conversation

s-scherrer
Collaborator

I changed the bootstrapping functions to warn and return NaN for the confidence intervals, instead of raising an error, when there is not enough data for bootstrapping, as discussed in #251.
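The behaviour described above can be sketched roughly as follows. This is a minimal illustration of the warn-and-return-NaN pattern, not pytesmo's actual code; the function name `bootstrap_ci` and the `minimum_data_length` parameter are hypothetical.

```python
import warnings

import numpy as np


def bootstrap_ci(data, metric, nsamples=1000, minimum_data_length=100):
    """Bootstrap a 95% confidence interval for `metric`.

    Instead of raising when there is too little data, emit a warning
    and return (NaN, NaN) so that downstream processing can continue.
    Hypothetical sketch; pytesmo's real signatures differ.
    """
    data = np.asarray(data)
    if len(data) < minimum_data_length:
        warnings.warn(
            "Not enough data for bootstrapping, "
            "returning NaN for confidence intervals."
        )
        return np.nan, np.nan
    rng = np.random.default_rng(0)
    estimates = []
    for _ in range(nsamples):
        # resample with replacement and recompute the metric
        sample = rng.choice(data, size=len(data), replace=True)
        estimates.append(metric(sample))
    return np.percentile(estimates, 2.5), np.percentile(estimates, 97.5)
```

With fewer than `minimum_data_length` observations this warns and yields `(nan, nan)` rather than aborting the whole validation run.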

@s-scherrer
Collaborator Author

This issue again:

  File "/usr/share/miniconda3/envs/pytesmo/bin/pytest", line 8 in <module>
tests/test_validation_framework/test_validation.py::test_ascat_ismn_validation_metadata_rolling 
/home/runner/work/_temp/2c06d129-8349-4c2d-a0df-7358a0b65a24.sh: line 4:  4925 Segmentation fault      (core dumped) pytest --cache-clear -m 'full_framework'

I think it's unrelated to my changes, since the tests run on 3.7, but I thought that running the full_framework tests separately had solved the issue.

@s-scherrer s-scherrer closed this Nov 29, 2021
@s-scherrer s-scherrer reopened this Nov 29, 2021
@wpreimes
Member

(quoting s-scherrer's comment above about the segmentation fault in the full_framework tests)

I think it did ... until a few weeks ago. I will take a look at this today and try to fix it.

@wpreimes wpreimes closed this Nov 30, 2021
@wpreimes wpreimes reopened this Nov 30, 2021
@wpreimes
Member

wpreimes commented Dec 7, 2021

By the way, the reason I have not merged this yet is that I should add an option to the validation framework that allows a user to skip points where an error occurred during metric calculation.
E.g. an option to continue on errors in this loop: https://github.com/TUW-GEO/pytesmo/blob/master/src/pytesmo/validation_framework/validation.py#L239
In that case it would be fine if the bootstrapping metrics are only calculated when bootstrapping is possible (which would also be more consistent; otherwise we would have more points where the metric was calculated than points where the CI was calculated), right?
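The continue-on-errors option being proposed could look something like the sketch below. The function name `calc_metrics_for_points` and the `skip_on_error` flag are hypothetical illustrations, not pytesmo's actual validation loop.

```python
import warnings


def calc_metrics_for_points(points, calc_metrics, skip_on_error=True):
    """Compute metrics per point, optionally skipping points that error.

    `points` is an iterable of (gpi, data) pairs. With skip_on_error=True,
    a failing point is skipped with a warning instead of aborting the run.
    Hypothetical sketch of the behaviour discussed in this thread.
    """
    results = {}
    for gpi, data in points:
        try:
            results[gpi] = calc_metrics(data)
        except Exception as e:
            if not skip_on_error:
                raise
            # skip this point but keep processing the rest
            warnings.warn(f"Skipping point {gpi}: {e}")
    return results
```

Points where metric calculation fails (including bootstrapping failures) would then simply be absent from the results, which keeps the set of points with metrics and the set with CIs consistent.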

@s-scherrer
Collaborator Author

Right, that's probably the better solution; I hadn't thought of that before. Let's add the error handling in that loop then.

@s-scherrer s-scherrer closed this Dec 7, 2021