
Setting lightningmodule.truncated_bptt_steps>0 will not enable tBPTT #8802

@anhnht3

Description

🐛 Bug

The _tbptt_split_batch function in training_batch_loop.py only reads the deprecated Trainer-level setting and ignores lightning_module.truncated_bptt_steps entirely. Setting truncated_bptt_steps via the Trainer still works, but that argument is deprecated, so the documented per-module setting silently does nothing.

def _tbptt_split_batch(self, batch: Any) -> List[Any]:
    """Splits a single batch into a list of sequence steps for tbptt.

    Args:
        batch: the current batch to split
    """
    splits = [batch]
    # BUG: only the deprecated Trainer argument is checked here;
    # lightning_module.truncated_bptt_steps is never consulted.
    if self.trainer.truncated_bptt_steps is not None:
        model_ref = self.trainer.lightning_module
        with self.trainer.profiler.profile("tbptt_split_batch"):
            splits = model_ref.tbptt_split_batch(batch, self.trainer.truncated_bptt_steps)
    return splits
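A minimal sketch of the lookup the loop could perform instead, preferring the LightningModule attribute over the deprecated Trainer argument. This assumes the module attribute defaults to 0 when unset and the Trainer argument defaults to None; the function name resolve_tbptt_steps is hypothetical, not Lightning API:

```python
def resolve_tbptt_steps(module_steps, trainer_steps):
    # Prefer the (non-deprecated) LightningModule attribute; fall back to
    # the deprecated Trainer argument; disable tBPTT when neither is set.
    if module_steps and module_steps > 0:
        return module_steps
    if trainer_steps and trainer_steps > 0:
        return trainer_steps
    return 0
```

With this resolution order, lightningmodule.truncated_bptt_steps > 0 would enable tBPTT as documented, while existing Trainer-based configurations would keep working until the argument is removed.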
Environment

• PyTorch Lightning Version (e.g., 1.3.0): 1.4.1
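For reference, this is the splitting that never runs because of the bug: Lightning's default tbptt_split_batch slices every element of the batch along the time dimension. A plain-Python sketch of that behavior, with lists standing in for the tensors the real implementation slices along dim 1:

```python
def tbptt_split_batch(batch, split_size):
    # batch: (x, y), each a list of per-sample sequences over time steps
    x, y = batch
    n_steps = len(x[0])
    splits = []
    for t in range(0, n_steps, split_size):
        # Slice every sequence down to the window [t, t + split_size)
        splits.append((
            [seq[t:t + split_size] for seq in x],
            [seq[t:t + split_size] for seq in y],
        ))
    return splits

# Two samples, six time steps each; split_size=4 -> windows of length 4 and 2
x = [[0, 1, 2, 3, 4, 5], [10, 11, 12, 13, 14, 15]]
y = [[0, 0, 0, 1, 1, 1], [0, 0, 0, 1, 1, 1]]
splits = tbptt_split_batch((x, y), 4)
```

Each window is then fed to training_step in turn, with hidden state carried across windows but gradients truncated at window boundaries.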


    Labels

    bug (Something isn't working), help wanted (Open to be worked on), priority: 0 (High priority task)
