How to have two validation loaders in validation loop #10364
Replies: 2 comments 6 replies
-
You can return a list of two loaders from your val_dataloader hook:
PyTorch Lightning will go through both of them and the data will be available to you. Remember to accept the dataloader_idx argument in your validation_step.
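A rough sketch of the pattern, assuming the `val_dataloader`/`validation_step` hook names from the Lightning docs. Since Lightning may not be installed here, the hooks are shown as plain functions and the final loop mimics what the trainer does internally:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def val_dataloader():
    # Return a list of loaders; Lightning iterates each one in turn.
    val_ds = TensorDataset(torch.arange(4).float().unsqueeze(1))
    train_ds = TensorDataset(torch.arange(6).float().unsqueeze(1))
    return [DataLoader(val_ds, batch_size=2), DataLoader(train_ds, batch_size=2)]

def validation_step(batch, batch_idx, dataloader_idx):
    # dataloader_idx tells you which loader produced this batch (0 or 1),
    # so you can log the two losses under different names.
    (x,) = batch
    return {"loader": dataloader_idx, "mean": x.mean().item()}

# Roughly what the trainer's validation loop does for you:
results = []
for dl_idx, loader in enumerate(val_dataloader()):
    for b_idx, batch in enumerate(loader):
        results.append(validation_step(batch, b_idx, dataloader_idx=dl_idx))

print(len(results))  # 2 batches from the first loader + 3 from the second = 5
```

In a real LightningModule these would be methods, and the loop is handled by the Trainer.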
-
Do you have an example input you want to compare the difference on? You could extract your nn.Module from your LightningModule and compare its outputs in training mode vs. eval mode like so:
https://pytorch.org/docs/stable/notes/autograd.html#locally-disable-grad-doc
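For instance, with a toy module (names and shapes here are illustrative, not from the thread), `model.eval()` turns dropout into the identity, so eval-mode outputs are deterministic while train-mode outputs are not:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(8, 8), nn.Dropout(p=0.5))
x = torch.randn(2, 8)

model.eval()  # dropout becomes a no-op
with torch.no_grad():  # disable autograd for inference, per the notes linked above
    out_eval_1 = model(x)
    out_eval_2 = model(x)

model.train()  # dropout active: zeroes activations at random and rescales the rest
with torch.no_grad():
    out_train = model(x)

print(torch.equal(out_eval_1, out_eval_2))  # eval mode is deterministic
```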
-
Hello, I'm training with dropout. However, when I compare the loss in training and validation, they're not really comparable, since the former has dropout active but the latter does not. So I'm trying to run the training dataset through the validation loop as well, so I can get a training loss without dropout. But I'm not sure how to pass both the training and validation data loaders to the validation loop?
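In plain PyTorch terms, what this boils down to is evaluating both loaders with the model in eval mode, so the training loss is computed without dropout. A toy sketch (random data and model, MSE loss chosen just for illustration; in Lightning you would return both loaders from val_dataloader and branch on dataloader_idx instead of looping by hand):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(4, 1), nn.Dropout(p=0.5))

def make_loader(n):
    x = torch.randn(n, 4)
    y = torch.randn(n, 1)
    return DataLoader(TensorDataset(x, y), batch_size=4)

train_loader, val_loader = make_loader(8), make_loader(8)

# Evaluate both loaders with dropout disabled, as the validation loop would.
model.eval()
losses = {}
with torch.no_grad():
    for name, loader in [("val", val_loader), ("train", train_loader)]:
        total, count = 0.0, 0
        for x, y in loader:
            total += nn.functional.mse_loss(model(x), y, reduction="sum").item()
            count += y.numel()
        losses[name] = total / count

print(losses)  # both losses computed in eval mode, hence comparable
```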