Remove the rounding of the logged loss value to 4 digits #38032

Open
@harish6696

Description

System Info

transformers version: 4.50.2
python version: 3.13.1

Who can help?

@zach-huggingface @SunMarc

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Inside the Trainer class, why is the loss rounded to 4 decimal places? I have applications where I need to watch the loss drop below 1e-4, but all such values get rounded to 0. Please let the user set the number of digits, or display the loss in scientific notation like 1.xxxe-y.

The rounding is hardcoded inside _maybe_log_save_evaluate():

    logs["loss"] = round(tr_loss_scalar / (self.state.global_step - self._globalstep_last_logged), 4)

Expected behavior

It would be great if this hardcoded rounding of the "loss" were removed. The best behavior would be to drop the round() call and log the loss value in scientific notation, which is much clearer.
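
A minimal sketch of one way the fix could look; format_logged_loss and the logging_loss_digits field are hypothetical names used only for illustration, not part of the transformers API:

    def format_logged_loss(avg_loss, digits=None):
        """Hypothetical helper: round only when the user opts in,
        otherwise keep 4 significant digits via scientific notation."""
        if digits is not None:
            return round(avg_loss, digits)       # current behavior, now opt-in
        return float(f"{avg_loss:.4e}")          # e.g. 3.2e-05 instead of 0.0

    # Inside _maybe_log_save_evaluate() the hardcoded line could then become:
    # logs["loss"] = format_logged_loss(
    #     tr_loss_scalar / (self.state.global_step - self._globalstep_last_logged),
    #     getattr(self.args, "logging_loss_digits", None),  # assumed new TrainingArguments field
    # )

    print(format_logged_loss(3.2e-05))      # 3.2e-05
    print(format_logged_loss(3.2e-05, 4))   # 0.0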
