Adding flags to expose gradient clipping args in Trainer #361
Conversation
Coverage report

The coverage rate went from …

Diff coverage details:
src/renate/training/training.py
src/renate/defaults.py
src/renate/updaters/model_updater.py
@@ -629,6 +633,8 @@ def __init__(
     precision: str = defaults.PRECISION,
     seed: int = defaults.SEED,
     deterministic_trainer: bool = defaults.DETERMINISTIC_TRAINER,
+    gradient_clip_val: Union[int, float, None] = defaults.GRADIENT_CLIP_VAL,
Is it fair to assume this is Optional[float]?
Changed.
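After that change, the annotation presumably reads as in this minimal stand-in (the class name, second flag, and defaults here are illustrative, not copied from Renate's source):

```python
from typing import Optional


class ModelUpdater:
    """Minimal stand-in for Renate's updater class, for illustration only."""

    def __init__(
        self,
        gradient_clip_val: Optional[float] = None,  # was Union[int, float, None]
        gradient_clip_algorithm: str = "norm",
    ) -> None:
        self.gradient_clip_val = gradient_clip_val
        self.gradient_clip_algorithm = gradient_clip_algorithm
```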
This change exposes the gradient_clip_val and gradient_clip_algorithm flags of the Trainer object. Gradient clipping is used for L2P training.
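For context, these two flags map directly onto PyTorch Lightning's Trainer arguments. A minimal sketch of how they are used (the values here are examples, not Renate's defaults):

```python
from pytorch_lightning import Trainer

# Clip the gradient L2 norm to 1.0 before every optimizer step;
# gradient_clip_algorithm accepts "norm" or "value".
trainer = Trainer(
    max_epochs=1,
    gradient_clip_val=1.0,
    gradient_clip_algorithm="norm",
)
```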
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.