Labels: feature (Is an improvement or enhancement), help wanted (Open to be worked on)
Description
Is your feature request related to a problem? Please describe.
TensorBoard is a great tool for visualization, but its event files grow rapidly in size when we log images.
To prevent this problem, we need to control the logging frequency.
To the best of my knowledge, in pytorch-lightning we can set it manually like this:
import pytorch_lightning as pl

class CoolModule(pl.LightningModule):
    def __init__(self, args):
        super().__init__()
        self.log_freq = args.log_freq
        self.model = ...

    def training_step(self, data_batch, batch_nb):
        input = data_batch['input']
        output = self.forward(input)
        # only write image summaries every log_freq steps
        if self.global_step % self.log_freq == 0:
            self.experiment.add_image('output_image', output, self.global_step)
This is an easy way; however, I think it is clearer to control the frequency through the Trainer.
Describe the solution you'd like
Add an option to the Trainer for controlling the logging frequency.
trainer = Trainer(model, tb_log_freq=foo)
To enable this functionality, training_step should return image tensors like:
def training_step(self, data_batch, batch_nb):
    input = data_batch['input']
    output = self.forward(input)
    loss = ...
    return {'loss': loss, 'prog': {'loss': loss}, 'image': {'output': output}}
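For illustration, here is a minimal sketch of how a Trainer-side loop could honor such an option. The class name SketchTrainer, the tb_log_freq argument, and the _run_training_batch method are hypothetical and are not actual pytorch-lightning internals; the sketch only shows how the 'image' dictionary returned from training_step could be forwarded to TensorBoard every tb_log_freq steps.

# Hypothetical sketch, not pytorch-lightning internals: a Trainer that owns the
# TensorBoard SummaryWriter and throttles image logging to every tb_log_freq steps.
from torch.utils.tensorboard import SummaryWriter


class SketchTrainer:
    def __init__(self, model, tb_log_freq=100):
        self.model = model
        self.tb_log_freq = tb_log_freq      # hypothetical Trainer argument
        self.experiment = SummaryWriter()
        self.global_step = 0

    def _run_training_batch(self, data_batch, batch_nb):
        outputs = self.model.training_step(data_batch, batch_nb)

        # Write image summaries only every tb_log_freq steps to keep
        # the TensorBoard event files small.
        if self.global_step % self.tb_log_freq == 0:
            for name, image in outputs.get('image', {}).items():
                self.experiment.add_image(name, image, self.global_step)

        self.global_step += 1
        return outputs['loss']

With this arrangement, the LightningModule only declares which tensors are images, and the frequency check lives in one place instead of being repeated in every training_step.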