
[Trainer] Change num_train_epochs default value #8113

Merged: 2 commits into PaddlePaddle:develop on Mar 13, 2024

Conversation

DesmonDay (Contributor)

PR types

Others

PR changes

Others

Description

Change the default value of `num_train_epochs` in `TrainingArguments`.


paddle-bot (bot) commented Mar 13, 2024

Thanks for your contribution!

lugimzzz (Contributor) previously approved these changes Mar 13, 2024

lgtm

codecov (bot) commented Mar 13, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 56.46%. Comparing base (e145bfc) to head (d37dd3e).

Additional details and impacted files
@@           Coverage Diff            @@
##           develop    #8113   +/-   ##
========================================
  Coverage    56.46%   56.46%           
========================================
  Files          596      596           
  Lines        91583    91583           
========================================
  Hits         51711    51711           
  Misses       39872    39872           


wawltor (Collaborator) previously approved these changes Mar 13, 2024

LGTM

@@ -127,7 +127,7 @@ class TrainingArguments:
            The epsilon hyperparameter for the [`AdamW`] optimizer.
        max_grad_norm (`float`, *optional*, defaults to 1.0):
            Maximum gradient norm (for gradient clipping).
        num_train_epochs(`float`, *optional*, defaults to 3.0):
Inline comment on this line (Contributor):

Update the Chinese documentation to match this change.

DesmonDay dismissed stale reviews from wawltor and lugimzzz via d37dd3e on March 13, 2024 at 11:42
ZHUI (Contributor) left a comment:

LGTM

wawltor merged commit 0a39fbf into PaddlePaddle:develop on Mar 13, 2024
8 of 10 checks passed
DesmonDay added a commit to DesmonDay/PaddleNLP that referenced this pull request on Mar 14, 2024:
* change num_train_epochs default value

* update docs
DesmonDay added a commit that referenced this pull request on Mar 14, 2024:
* change num_train_epochs default value

* update docs