
change checks to init - update - compute structure #826

Merged
merged 13 commits into main from init-update-compute, Feb 9, 2022

Conversation

noamzbr (Collaborator) commented Feb 7, 2022

This PR is intended to change the structure of our vision checks to use the following methods:
- initialize_run - everything that should be done before we start running on the batches returned by the dataloader
- update - accumulate results for a specific batch
- compute - return the final CheckResult using the accumulated values
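As a sketch of the three-phase pattern described above (the class name, attributes, and batch format here are hypothetical; only the three method names come from the PR):

```python
from collections import Counter
from typing import Any


class LabelCountCheck:
    """Illustrative check following the init-update-compute structure.

    Hypothetical example; a real deepchecks check would subclass a base
    check class and return a CheckResult from compute().
    """

    def initialize_run(self) -> None:
        # Reset all accumulated state before iterating over the dataloader.
        self._label_counts = Counter()

    def update(self, batch: Any) -> None:
        # Accumulate results for a single batch (assumed (images, labels) here).
        _, labels = batch
        self._label_counts.update(int(label) for label in labels)

    def compute(self) -> dict:
        # Produce the final result from the accumulated values.
        return dict(self._label_counts)
```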

I also updated the base checks in vision to match; run_logic now handles running all of these functions for a single check call.
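A minimal driver loop in the spirit of run_logic (everything except the three check-method names is an assumption):

```python
from typing import Any, Iterable


def run_single_check(check: Any, data_loader: Iterable) -> Any:
    """Hypothetical sketch of how a base class might drive one check."""
    check.initialize_run()        # prepare/reset state before any batches
    for batch in data_loader:     # iterate the dataloader once
        check.update(batch)       # accumulate per-batch results
    return check.compute()        # produce the final result
```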

This is a draft; please let me know what you think. If it's approved, I'll expand these changes to the suite and to the label drift check.

@noamzbr noamzbr added the refactoring Making significant changes to structure of code label Feb 7, 2022
@noamzbr noamzbr added this to the computer-vision milestone Feb 7, 2022
@noamzbr noamzbr self-assigned this Feb 7, 2022
JKL98ISR (Member) commented Feb 7, 2022

so all checks are built for batches in their design like the ignite metrics?

noamzbr (Collaborator, Author) commented Feb 7, 2022

> so all checks are built for batches in their design like the ignite metrics?

Basically yes; I just renamed our "reset" to initialize_run.

@noamzbr noamzbr requested review from JKL98ISR and removed request for matanper February 9, 2022 12:37
@noamzbr noamzbr marked this pull request as ready for review February 9, 2022 14:58
@@ -152,12 +135,60 @@ def assert_task_type(self, *expected_types: TaskType):
f'Check is irrelevant for task of type {self.train.task_type}')
return True

def infer(self, batch: Any) -> Any:
Contributor commented:

what is that used for?

Collaborator (Author) replied:

To cache the predictions for a given batch. Used, e.g., in the performance report.

Refactor to proper init-update-compute structure will follow shortly
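A minimal sketch of the per-batch prediction caching idea described above (the cache class, its key scheme, and the model callable are all assumptions, not the actual deepchecks implementation):

```python
from typing import Any, Callable, Dict


class PredictionCache:
    """Hypothetical cache so several checks share one forward pass per batch."""

    def __init__(self, model: Callable[[Any], Any]):
        self._model = model
        self._cache: Dict[int, Any] = {}

    def infer(self, batch_index: int, batch: Any) -> Any:
        # Run the model once per batch and reuse the predictions afterwards.
        if batch_index not in self._cache:
            self._cache[batch_index] = self._model(batch)
        return self._cache[batch_index]
```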
checks_list = []
for check_idx, check in list(self.checks.items()):
if isinstance(check, (TrainTestCheck, ModelOnlyCheck)):
checks_list.append((check_idx, copy.deepcopy(check)))
Contributor commented:

why do you need to copy here?

Collaborator (Author) replied:

Because checks have state, it seems cleaner to me to copy them when executing a specific .run() call rather than reuse the originals passed to the constructor.
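The state-leak concern can be illustrated with a toy example (this stub class is hypothetical; real deepchecks checks differ):

```python
import copy


class StatefulCheck:
    """Toy stateful check: without copying, state would leak across runs."""

    def __init__(self):
        self.seen = 0

    def update(self, batch):
        self.seen += len(batch)


# The instance given to the suite's constructor stays pristine;
# each .run() call works on its own deep copy.
original = StatefulCheck()
run_copy = copy.deepcopy(original)
run_copy.update([1, 2, 3])
```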

results = OrderedDict({check_idx: None for check_idx, check in check_dict.items()})

# Loop over training batches
n_batches = len(context.train.get_data_loader())
Contributor commented:

so we've decided the data loader must be map-style (i.e., have a defined length)?
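The len() call in the hunk above only works when the loader reports a length, which map-style datasets provide and iterable-style ones generally don't. A stdlib-only illustration of that distinction (the helper name is an assumption; no torch involved):

```python
def batch_count(data_loader) -> int:
    """Return the number of batches, or raise if the loader is unsized."""
    try:
        return len(data_loader)
    except TypeError:
        raise TypeError(
            "data loader has no __len__; a map-style (sized) dataset is "
            "needed to know the batch count up front"
        )


sized_loader = [[1, 2], [3, 4]]        # map-style stand-in: len() is defined
unsized_loader = iter(sized_loader)    # iterable-style stand-in: no len()
```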

@noamzbr noamzbr merged commit f9b1ef6 into main Feb 9, 2022
@delete-merged-branch delete-merged-branch bot deleted the init-update-compute branch February 9, 2022 15:50
@noamzbr noamzbr linked an issue Feb 9, 2022 that may be closed by this pull request
Labels
refactoring Making significant changes to structure of code
Development

Successfully merging this pull request may close these issues.

[FEAT][CV] Run checks on batches
3 participants