Trainer training batches estimation for map-style datasets seems wrong #14209

@robertomest

Description

Hello,

It seems that setting max_steps without setting max_epochs for map-style datasets leads to estimated_stepping_batches returning the number of batches for a single epoch. The docstring, however, indicates that it should return the estimated stepping batches for the whole training run. I believe the line below (with max(self.max_epochs, 1)) is not appropriate in this scenario. Shouldn't it always return max_steps when max_epochs == -1?
https://github.com/Lightning-AI/lightning/blob/deaadc157b87b653f1c3aa7926a5ff092ab81863/src/pytorch_lightning/trainer/trainer.py#L2783
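To make the issue concrete, here is a minimal standalone sketch of the estimation logic being discussed (hypothetical function and parameter names, not the actual Trainer code), contrasting the reported behavior with what I would expect:

```python
def estimated_stepping_batches(batches_per_epoch, max_steps=-1, max_epochs=-1):
    # Expected behavior: when only max_steps is set (max_epochs == -1),
    # the whole-training estimate should just be max_steps.
    if max_epochs == -1 and max_steps != -1:
        return max_steps
    # Reported behavior: max(max_epochs, 1) turns an unset max_epochs (-1)
    # into 1, so the estimate collapses to a single epoch's batches.
    return batches_per_epoch * max(max_epochs, 1)

# e.g. 100 batches per epoch, train for 1000 steps, no epoch limit:
print(estimated_stepping_batches(100, max_steps=1000))  # expect 1000, not 100
```

Without the max_epochs == -1 branch, the example above would return 100 (one epoch) even though training runs for 1000 steps.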

Thanks!

cc @carmocca @justusschock @ananthsub @ninginthecloud @rohitgr7

Labels: bug (Something isn't working), loops (Related to the Loop API)
