
datamodel: add __check_metrics__() #301

Merged
merged 9 commits into SolarArbiter:master from implement_check_metrics on Jan 7, 2020

Conversation

@dplarson (Contributor) commented Jan 3, 2020

  • Closes #267 (implement datamodel.__check_metrics__).
  • I am familiar with the contributing guidelines.
  • Tests added.
  • Updates entries to docs/source/api.rst for API changes.
  • Adds descriptions to appropriate "what's new" file in docs/source/whatsnew for all changes. Includes link to the GitHub Issue with :issue:`num` or this Pull Request with :pull:`num`. Includes contributor name and/or GitHub username (link with :ghuser:`user`).
  • New code is fully documented. Includes numpydoc compliant docstrings, examples, and comments where necessary.
  • Maintainer: Appropriate GitHub Labels and Milestone are assigned to the Pull Request and linked Issue.

Adds a helper function __check_metrics__() to determine whether the selected metrics are valid for the given forecast type (e.g., when evaluating a deterministic forecast, deterministic forecast metrics should be used, not probabilistic ones).
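For context, below is a minimal sketch of the kind of check this PR adds. The metric names, the class-name comparison, and the use of ValueError are illustrative assumptions only, not the actual solarforecastarbiter-core implementation.

```python
# Sketch of a metric-validity check, assuming hypothetical metric sets and
# class names; NOT the actual solarforecastarbiter-core implementation.
DETERMINISTIC_METRICS = {'mae', 'mbe', 'rmse'}   # assumed metric names
PROBABILISTIC_METRICS = {'crps', 'bs', 'rel'}    # assumed metric names


def check_metrics(forecast, metrics):
    """Raise ValueError if any metric is not valid for the forecast type."""
    # Pick the allowed metric set from the forecast's class name; the real
    # datamodel distinguishes forecast types with its own classes.
    if type(forecast).__name__ == 'ProbabilisticForecast':
        allowed = PROBABILISTIC_METRICS
    else:
        allowed = DETERMINISTIC_METRICS
    invalid = set(metrics) - allowed
    if invalid:
        raise ValueError(
            f'Invalid metrics for {type(forecast).__name__}: '
            f'{sorted(invalid)}')
```

Failing fast on an invalid metric selection avoids running a metrics calculation whose results would be meaningless for the forecast type.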

Add helper function to determine if the selected metrics are valid (e.g.
if evaluating a deterministic forecast, then deterministic forecast
metrics should be used, not probabilistic).
@dplarson changed the title from [WIP] datamodel: add __check_metrics__() to datamodel: add __check_metrics__() on Jan 4, 2020
dplarson and others added 4 commits January 3, 2020 22:22
Remove the probabilistic forecasts from `__check_metrics__()` since they
are not yet fully integrated into the `metrics.calculator` code.
@dplarson (Contributor, Author) commented Jan 6, 2020

@wholmgren This PR (adding the __check_metrics__() function for verifying that the selected metrics are valid for the given forecast type) is ready for review.

@wholmgren (Member) commented
Looks good, assuming we add a follow-up issue for probabilistic metrics. I think it's worth a quick note in the what's new file too.

@wholmgren added this to the 1.0 beta 4 milestone on Jan 6, 2020
@wholmgren added the enhancement (New feature or request) and metrics (Issue pertains to metrics calculation) labels on Jan 6, 2020
@dplarson (Contributor, Author) commented Jan 6, 2020

Just opened issue #302 for checking probabilistic metrics and added a line to docs/whatsnew about the new functionality.

@wholmgren (Member) commented

@alorenzo175 ok to merge?

@alorenzo175 merged commit b99b5d2 into SolarArbiter:master on Jan 7, 2020
@dplarson deleted the implement_check_metrics branch on February 10, 2020