
Implementing mean error as a verification metric? #826

Closed
JasonFurtado opened this issue Jun 26, 2023 · 1 comment · Fixed by #827

Comments


JasonFurtado commented Jun 26, 2023

Is your feature request related to a problem? Please describe.
First, thanks for putting together this package! It has been very useful. One metric I have not seen implemented is the plain mean error (not normalized or absolute) between an ensemble mean of forecasts (or individual ensemble members) and the obs/verification. Maybe I missed something...?

Describe the solution you'd like
xskillscore has a "me" function for mean error, but it isn't imported into metrics.py. Perhaps adding this measure to the package might be useful? Or, do you have other quick suggestions on how to do that?

Describe alternatives you've considered
One could code the error difference manually, but for large ensembles and multiple initializations it gets a little cumbersome. In the meantime, I modified metrics.py to add '__me' as a metric and imported it from xskillscore. You may have a nicer way to do it, but I am trying this workaround for now (a rough sketch of the manual calculation is below).

Thanks again for everything!
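
For reference, a minimal sketch of that manual calculation with plain xarray/xskillscore. The dummy data and the dimension names ("init", "member", "lat", "lon") are assumptions for illustration, not anything defined by the package:

```python
import numpy as np
import xarray as xr
import xskillscore as xs

# dummy data standing in for an initialized ensemble forecast and observations;
# the dimension names ("init", "member", "lat", "lon") are assumptions
forecast = xr.DataArray(
    np.random.randn(10, 5, 4, 4),
    dims=("init", "member", "lat", "lon"),
)
obs = xr.DataArray(np.random.randn(10, 4, 4), dims=("init", "lat", "lon"))

# reduce to the ensemble mean, then take the signed mean error (bias) over inits
ensemble_mean = forecast.mean("member")
bias = xs.me(ensemble_mean, obs, dim="init")

# equivalent by hand, without xskillscore
bias_manual = (ensemble_mean - obs).mean("init")
```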

@aaronspring
Collaborator

True. I forgot to implement it. Happy to receive a PR on that.

In the meantime you can wrap xs.me as a user-defined metric; see "user-defined metrics" in https://climpred.readthedocs.io/en/stable/metrics.html
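
For example, a minimal sketch of such a wrapper, modeled on the user-defined-metrics example in the docs linked above. The Metric keyword arguments used here (name, function, positive, probabilistic, unit_power) follow that example and should be checked against your climpred version; the usage line at the bottom is hypothetical:

```python
import xskillscore as xs
from climpred.metrics import Metric


def _mean_error(forecast, verif, dim=None, **metric_kwargs):
    """Signed mean error (bias); equivalent to (forecast - verif).mean(dim)."""
    return xs.me(forecast, verif, dim=dim)


__me = Metric(
    name="me",
    function=_mean_error,
    positive=False,        # a perfect score is 0; the sign convention here is an assumption
    probabilistic=False,   # deterministic metric, e.g. applied to the ensemble mean
    unit_power=1,          # keeps the units of the verified variable
)

# hypothetical usage with an existing HindcastEnsemble called `hindcast`:
# hindcast.verify(metric=__me, comparison="e2o", dim="init", alignment="same_verifs")
```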
