
Updating torch lightning #168

Closed
RaulPPelaez opened this issue Apr 25, 2023 · 1 comment

@RaulPPelaez (Collaborator)

In a quest to use the latest versions of our dependencies (mainly for torch 2), I have run into an issue with torch-lightning.
The API has changed in some slight ways; in particular, this reset call has been deprecated:

if should_reset:
    # reset validation dataloaders before and after the test epoch, which is
    # faster than skipping test validation steps by returning None
    self.trainer.reset_val_dataloader(self)

There is a succinct message referring to this in the docs:

   * - used ``trainer.reset_*_dataloader()`` methods
     - use  ``Loop.setup_data()`` for the top-level loops
     - `PR16726`_

Pointing to this PR: Lightning-AI/pytorch-lightning#16726

However, I do not know what exactly this line is doing, and I do not understand the comment.
Do you have any pointers?
cc @PhilippThoelke

@PhilippThoelke (Collaborator)

To evaluate the test set every couple of epochs during training we add it as a val_dataloader. In the test epochs we then have two dataloaders (validation + test) instead of only the validation dataloader. The deprecated function resets the Trainer's internal state of which data to use for what, allowing us to evaluate on the test set only every couple of epochs but evaluate on the validation set every epoch. You'd have to make sure that the suggested replacement method achieves the same thing without breaking or slowing down anything else.
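The scheduling described above can be sketched in plain Python, independent of Lightning's internals. This is a hypothetical illustration (the function name, the `test_interval` parameter, and the string loader labels are all made up for the example): validation runs every epoch, and the test dataloader is appended only on the epochs where the test set should be evaluated — which is why the Trainer's internal dataloader state has to be reset around those epochs.

```python
def active_val_dataloaders(epoch: int, test_interval: int) -> list[str]:
    """Hypothetical sketch of which dataloaders run in a given epoch.

    Validation is evaluated every epoch; the test set is added as an
    extra "validation" dataloader only every `test_interval` epochs
    (test_interval <= 0 disables test evaluation during training).
    """
    loaders = ["val"]  # validation always runs
    if test_interval > 0 and (epoch + 1) % test_interval == 0:
        loaders.append("test")  # test epoch: two dataloaders instead of one
    return loaders
```

In this picture, the deprecated `trainer.reset_val_dataloader(self)` call is what forced Lightning to pick up the changed dataloader list before and after a test epoch; any replacement based on `Loop.setup_data()` would need to trigger the same refresh at the same points.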
