
Add metric_key_prefix from training_args #21286

Closed
marctorsoc opened this issue Jan 24, 2023 · 3 comments

Comments

@marctorsoc

Feature request

Today, if we create a Trainer as in

trainer = Trainer(
    model=self.model,  # the instantiated 🤗 Transformers model to be trained
    args=training_args,  # training arguments, defined above
    train_dataset=data_cls["train"],
    eval_dataset=data_cls["develop"],
    compute_metrics=partial(
        compute_metrics,
        fbeta_beta=self.config.early_stopping.fbeta_beta,
    ),
    data_collator=collate_chunks,  # type: ignore
    callbacks=callbacks,  # type: ignore
)

and do trainer.train():

  1. There's no way to change the prefix for the train dataset metrics (at least none that I'm aware of).
  2. One can change the prefix for the evaluation dataset from the default eval/ to anything else by changing the above into:
prefix = "other"
trainer = Trainer(
    ...,
    eval_dataset={prefix: data_cls["develop"]},
    ...,
)

However, doing this creates eval/other_accuracy due to the way rewrite_logs works. Ideally I'd like it to be other/accuracy.
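If I understand the W&B integration correctly, this happens because the Trainer logs the metric as eval_other_accuracy (the eval prefix plus the dataset name), and rewrite_logs then splits on the literal eval_ prefix. A minimal sketch of that behavior (my reading of the grouping logic, not the exact transformers source):

```python
def rewrite_logs(d):
    # Sketch of how the W&B integration groups metric keys:
    # keys starting with "eval_" go under "eval/", everything else under "train/".
    new_d = {}
    for k, v in d.items():
        if k.startswith("eval_"):
            new_d["eval/" + k[len("eval_"):]] = v
        else:
            new_d["train/" + k] = v
    return new_d

# With eval_dataset={"other": ...}, the Trainer logs "eval_other_accuracy",
# so the dataset name lands after the slash instead of replacing "eval":
print(rewrite_logs({"eval_other_accuracy": 0.9, "loss": 1.2}))
```

So the dataset-name trick can only ever nest the name under eval/, never replace it.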

My request is to have a clear way, via training_args, to add arbitrary prefixes to the metrics for either the train or the eval datasets.

Motivation

I want to train multiple models within the same wandb run. As things stand right now, the metrics clash.

Your contribution

I've contributed to other OSS projects, but I'm not familiar enough with this codebase to make the contribution directly. If someone gives me a high-level description to guide me, I'd be happy to do it myself.

@ArthurZucker
Collaborator

cc @sgugger

@sgugger
Collaborator

sgugger commented Jan 25, 2023

This seems like a very niche feature which can be achieved by customizing your callback (you can use your own instead of the default ones).
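For anyone landing here later, the remapping logic for a custom callback could look roughly like this. The helper below is hypothetical (not part of transformers): it takes already-grouped keys like eval/other_accuracy and moves them under other/accuracy.

```python
import re

def remap_metric_keys(logs, dataset_name="other"):
    # Hypothetical helper (not part of transformers): turn keys grouped as
    # "eval/<dataset_name>_<metric>" into "<dataset_name>/<metric>",
    # leaving all other keys untouched.
    pattern = re.compile(rf"^eval/{re.escape(dataset_name)}_(.+)$")
    remapped = {}
    for key, value in logs.items():
        m = pattern.match(key)
        remapped[f"{dataset_name}/{m.group(1)}" if m else key] = value
    return remapped

print(remap_metric_keys({"eval/other_accuracy": 0.9, "train/loss": 1.2}))
```

Wiring this into training would mean replacing the default W&B callback with a subclass that applies such a remap before the logs reach wandb; the exact hook depends on the callback internals, so treat the sketch as the logic only, not a drop-in implementation.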

@marctorsoc
Author

marctorsoc commented Jan 25, 2023

@sgugger could you elaborate a bit more on which callback(s)? I guess I have to remove one and add my own?

(oh sorry, I just found https://huggingface.co/docs/transformers/main_classes/callback)
