How to log metrics during training #94
If it is only accuracy, loss, or other metrics that depend only on the predictions and the data, you can do this in
You can also pass arbitrary user-defined metrics, as long as they have the same signature. As for the callback: yes, look at
Look here to check how
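As a rough illustration of the kind of user-defined metric the reply describes, here is a sketch that assumes metrics are plain functions of predictions and targets; the function name and exact signature below are assumptions, not the library's actual API:

```python
import numpy as np

def balanced_accuracy(preds: np.ndarray, targets: np.ndarray) -> float:
    """Hypothetical user-defined metric: the mean of per-class recalls.

    Assumes the (preds, targets) signature the maintainer alludes to;
    check the library's built-in metrics for the real one.
    """
    classes = np.unique(targets)
    # For each class, the fraction of its examples that were predicted correctly.
    recalls = [np.mean(preds[targets == c] == c) for c in classes]
    return float(np.mean(recalls))

preds = np.array([0, 1, 1, 0, 1])
targets = np.array([0, 1, 0, 0, 1])
# Class 0 recall is 2/3, class 1 recall is 1.0, so the mean is 5/6.
print(balanced_accuracy(preds, targets))
```

Any function with the expected signature could then be passed alongside the built-in accuracy and loss metrics.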
Okay great, thank you! I see that the loss/accuracy are printed during training, but I'm unsure how I can go inside the training loop and, for every epoch, do something like
Any ideas?
The training and validation metrics and loss values are all saved into the
Perhaps this is what you need? I am not sure I understand the use case for going inside the trainer yet.
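A minimal sketch of consuming such a saved training history after training finishes. The dict layout below is hypothetical (the real object and its field names depend on the library version); a real setup would forward each row to TensorBoard or Weights & Biases instead of printing:

```python
# Hypothetical shape of the status object returned after training;
# treat these key names as placeholders, not the library's real ones.
status = {
    "loss_history": [1.2, 0.9, 0.7],
    "accuracy_history": [0.55, 0.68, 0.74],
}

def log_history(status: dict) -> list:
    """Format one log line per epoch from the saved training history."""
    rows = []
    for epoch, (loss, acc) in enumerate(
        zip(status["loss_history"], status["accuracy_history"]), start=1
    ):
        rows.append(f"epoch={epoch} loss={loss:.3f} accuracy={acc:.3f}")
    return rows

for row in log_history(status):
    print(row)  # replace with writer.add_scalar(...) or wandb.log(...)
```

Because everything is available after the fact, this avoids touching the training loop at all, at the cost of logging only once training has completed.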
@gianlucadetommaso yes, this is good!
Hi,
Sorry to ask so many questions! And thanks again for creating such a great library.
I'd like to log the loss/accuracy and other metrics during training, for example with TensorBoard or Weights & Biases. I've looked at Callbacks, but it appears they interact with TrainerState, which seems to contain only the parameters of the model state. Do you know if there is an easy way to construct a callback to retrieve the predictions and ground truth for a particular epoch? Then I could compute whatever metrics I'd want.
Thank you!
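For illustration, here is one generic shape such a per-epoch metric callback could take. The hook name, its arguments, and the toy trainer loop below are all assumptions made for the sketch, not the library's actual callback API:

```python
from typing import Callable, List

class Callback:
    """Minimal callback interface; hook name and signature are assumptions."""
    def on_epoch_end(self, epoch: int, predict_fn: Callable) -> None:
        pass

class MetricLogger(Callback):
    """Computes a validation metric at the end of every epoch."""
    def __init__(self, val_inputs, val_targets):
        self.val_inputs = val_inputs
        self.val_targets = val_targets
        self.history: List[float] = []

    def on_epoch_end(self, epoch: int, predict_fn: Callable) -> None:
        preds = [predict_fn(x) for x in self.val_inputs]
        acc = sum(p == t for p, t in zip(preds, self.val_targets)) / len(preds)
        # A real logger would send this to TensorBoard or wandb.log(...).
        self.history.append(acc)

# Toy stand-in for a training loop: the trainer owns the model and hands
# each callback a prediction function at the end of every epoch.
logger = MetricLogger(val_inputs=[0, 1, 2, 3], val_targets=[0, 1, 0, 1])
for epoch in range(2):
    predict = lambda x: x % 2  # stand-in for the model's predictions
    logger.on_epoch_end(epoch, predict)

print(logger.history)
```

The key design point is that the callback needs access to a prediction function (or the model itself), not just the raw parameters in TrainerState, which is exactly the gap the question identifies.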