
Feat/unc func pytest #269

Merged
merged 2 commits into chemprop:uncertainty_funcs from feat/unc_func_pytest on Apr 28, 2022

Conversation

davidegraff (Contributor)

Description

Reformatted the unit tests to use pytest. The framework is much more flexible, and I want to require that all new tests use pytest where possible.
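
For context, most of that flexibility comes from parametrization and fixtures. A generic illustration (not code from this PR):

import pytest

# Generic illustration of pytest's parametrize decorator; one test
# function expands into one test case per parameter tuple.
@pytest.mark.parametrize("value,expected", [(1, 1), (2, 4), (3, 9)])
def test_square(value, expected):
    assert value ** 2 == expected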

Checklist

  • linted with flake8?
  • (if appropriate) unit tests added?

@davidegraff davidegraff mentioned this pull request Apr 28, 2022


def test_build_regression_metric(regression_metric):
    assert build_uncertainty_evaluator(regression_metric, None, None, "regression", None, None)
cjmcgill (Contributor):

Previously, this test checked whether the output was an UncertaintyEvaluator. Now it just checks that the return value is truthy (not None or False), right?

I can't think of a corner case where the distinction would matter, so I think it's okay.
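
For reference, a stricter variant would assert the return type explicitly, as the old test did. A minimal sketch; the import path is an assumption based on the file names in the commit list below:

from chemprop.uncertainty.uncertainty_evaluator import (
    UncertaintyEvaluator,
    build_uncertainty_evaluator,
)

def test_build_regression_metric(regression_metric):
    evaluator = build_uncertainty_evaluator(
        regression_metric, None, None, "regression", None, None
    )
    # isinstance preserves the old, stricter check instead of plain truthiness.
    assert isinstance(evaluator, UncertaintyEvaluator)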

cjmcgill (Contributor):

Also, this request structure is interesting; I wasn't familiar with it previously.
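
For anyone else unfamiliar: this is pytest's built-in request fixture, which lets a parametrized fixture see the parameter it was invoked with. A minimal, self-contained sketch (the fixture name mirrors the tests here, but the parameter values are hypothetical):

import pytest

@pytest.fixture(params=["regression", "classification", "multiclass"])
def dataset_type(request):
    # `request.param` holds the value pytest is currently injecting, so
    # every test that uses this fixture runs once per parameter.
    return request.param

def test_dataset_type_is_known(dataset_type):
    assert dataset_type in {"regression", "classification", "multiclass"}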

@pytest.mark.parametrize("metric", [str(uuid.uuid4()) for _ in range(3)])
def test_build_unsupported_metrics(metric, dataset_type):
    with pytest.raises(NotImplementedError):
        build_uncertainty_evaluator(metric, None, None, "spectra", dataset_type, None)
@cjmcgill cjmcgill (Contributor) Apr 28, 2022:

Edit: Oh wait, on a second read: the metric becomes the variable name for the gibberish string, and the dataset type gets requested from the static method above. Okay, seems good.

@cjmcgill cjmcgill self-requested a review April 28, 2022 19:39
@cjmcgill cjmcgill (Contributor) left a comment:

Okay, looks good.

@cjmcgill cjmcgill merged commit bf75660 into chemprop:uncertainty_funcs Apr 28, 2022
cjmcgill added a commit that referenced this pull request Apr 29, 2022
* Squash uncertainty commits

* change test window for python from 3.6-3.8 to 3.7-3.8

* Fix processing argument error

* Fix uncal output for no uncertainty calculation

* Fix output of make_predictions for web app

* Update chemprop/train/predict.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* Update chemprop/train/loss_functions.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* Update chemprop/train/loss_functions.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* fix inequality in torch version check for dropout

* cleaner dropout layer activation

* Re-label evidential classification and multiclass as dirichlet

* Make returning unc default False for make_predictions function

* Remove redundant torch device assignments

* Remove temp function from dirichlet loss

* Change uncertainty function names from ..._builder to build_...

* Update chemprop/uncertainty/uncertainty_evaluator.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* Make batch assignment more readable

* Replace relative imports with absolute imports

* Fix imports

* Make uncertainty classes ABCs with specified abstractmethods

* Switching NLL metric to mean not sum

* Replace label attribute with abstract property for calibrator/predictor

* Add function descriptions

* Update chemprop/models/model.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* Update chemprop/train/loss_functions.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* Update chemprop/uncertainty/uncertainty_calibrator.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* Additional errors, including --ensemble_variance argument deprecation

* Disallow ensemble variance from being used in classification calibration

* Updates to readme file

* Remove pass from abstract methods

* Merge common dirichlet classification function components

* Fix number of dropout cycles

* Provide targets to evaluator not data

* Trigger evaluator argument errors

* Add unit tests for loss functions and evaluators

* Correct calibrated miscalibration_error

* Integration test for uncertainty

* Removing unstable spearman test

* Clarify uncertainty dropout in documentation

* Refactor load_models function so that older workflows will not break

* Remove accidental note line inclusion

* Feat/unc func pytest (#269)

* reformat to pytest

* formatting

* Add comments to uncertainty evaluator tests

* Convert loss function tests to pytest classes

* Fix testing problems

* Make nll an abstractmethod

* Update chemprop/uncertainty/uncertainty_calibrator.py

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>

* Docstring change

* Clarify if/else blocks in make_predictions

Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>
Co-authored-by: Kevin Greenman <kpg@mit.edu>
cjmcgill added a commit that referenced this pull request Apr 30, 2022
* add more examples and faster setup

* fix typo and add nanoHUB link

* Uncertainty Functions (#267)


* updated column name with new uncertainty

Co-authored-by: Charles McGill <44245643+cjmcgill@users.noreply.github.com>
Co-authored-by: david graff <60193893+davidegraff@users.noreply.github.com>
Co-authored-by: Chas <charlesjmcgill@gmail.com>
@davidegraff davidegraff deleted the feat/unc_func_pytest branch May 12, 2022 16:34