
LightningModule.log(..., on_epoch=True) logs with global_step instead of current_epoch #4998

@quinor

Description


🐛 Bug

When logging with:
self.log("some_metric", value, on_step=False, on_epoch=True)
in, e.g., training_step, the metric is logged to TensorBoard with the x axis in global steps instead of epochs:

(screenshot: metric plotted with global_step on the x axis)

Expected behavior is for the x axis to be in epochs:

(screenshot: metric plotted with current_epoch on the x axis)
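Concretely, the two plots differ only in the x coordinates written per epoch. A minimal stand-alone simulation (no Lightning involved; the function and names are illustrative, not Lightning internals) of the difference:

```python
# Hypothetical simulation of the reported behavior: a metric logged with
# on_epoch=True is aggregated once per epoch, but the x coordinate sent to
# TensorBoard is the trainer's global_step rather than current_epoch.

def epoch_end_log_points(steps_per_epoch, num_epochs, use_global_step=True):
    """Return the x coordinates TensorBoard would receive for a metric
    logged once per epoch."""
    points = []
    global_step = 0
    for epoch in range(num_epochs):
        for _ in range(steps_per_epoch):
            global_step += 1
        # on_epoch=True writes one aggregated value per epoch; the bug is
        # that its x value is global_step, not the epoch index.
        points.append(global_step if use_global_step else epoch)
    return points

# Observed: x axis jumps by steps_per_epoch each epoch.
print(epoch_end_log_points(100, 3, use_global_step=True))   # [100, 200, 300]
# Expected: x axis counts epochs.
print(epoch_end_log_points(100, 3, use_global_step=False))  # [0, 1, 2]
```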

To Reproduce

(I'll try to put together a reproduction example once I find some free cycles this week.)

Environment

PyTorch 1.7 and Lightning 1.1rc

Additional context

@tchaton
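One possible workaround until this is resolved: bypass self.log() and write the epoch-aggregated value directly to the logger's underlying experiment, passing current_epoch as the step. The sketch below is hedged: FakeExperiment is a stand-in stub for a TensorBoard SummaryWriter so the snippet runs without torch/Lightning; inside a real LightningModule the equivalent call would be self.logger.experiment.add_scalar(tag, value, global_step=self.current_epoch).

```python
# Workaround sketch (stub-based, not Lightning's API): log against the epoch
# index instead of global_step by calling the experiment object directly.

class FakeExperiment:
    """Stand-in for a TensorBoard SummaryWriter; records what it is given."""

    def __init__(self):
        self.scalars = []  # (tag, value, step) triples

    def add_scalar(self, tag, value, global_step=None):
        self.scalars.append((tag, value, global_step))

def log_epoch_metric(experiment, current_epoch, value):
    # Use the epoch number, not global_step, as the x coordinate.
    experiment.add_scalar("some_metric", value, global_step=current_epoch)

exp = FakeExperiment()
for epoch, val in enumerate([0.9, 0.5, 0.3]):
    log_epoch_metric(exp, epoch, val)
print(exp.scalars)
# [('some_metric', 0.9, 0), ('some_metric', 0.5, 1), ('some_metric', 0.3, 2)]
```

The trade-off is that writing to the experiment directly skips Lightning's epoch-level aggregation, so the value has to be reduced (e.g. averaged) manually before logging.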

Metadata


Labels

    feature: Is an improvement or enhancement
    help wanted: Open to be worked on
    logging: Related to the `LoggerConnector` and `log()`
