
fix bugs with trainer #24134

Merged 4 commits into main on Jun 9, 2023

Conversation

pacman100 (Contributor) commented on Jun 9, 2023

What does this PR do?

Context:

  1. Issue 1 - Currently, when the lr_scheduler is specified in the DeepSpeed config file, we pass a DummyScheduler to accelerator.prepare so that the correct scheduler is returned after the prepare call. A prior PR removed the preparation of the lr_scheduler, causing a lot of DeepSpeed tests to fail.
  2. Issue 2 - When using apex, we shouldn't prepare the optimizer; otherwise we get AttributeError: 'AcceleratedOptimizer' object has no attribute '_amp_stash'.
  3. Issue 3 - The FSDP checkpoint logic should create the checkpoint directory if it isn't present. Fixes #24130 ("seems to be a bug related to saving model").

This PR fixes the above issues; minimal, illustrative sketches of each fix follow.
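For Issue 1, here is a minimal sketch of the pattern being restored. The model, optimizer, and step counts are illustrative, and it assumes DeepSpeed is enabled through Accelerate with a scheduler section in the DeepSpeed config file:

```python
import torch
from accelerate import Accelerator
from accelerate.utils import DummyScheduler

# Assumes DeepSpeed was enabled via `accelerate config` and that the DeepSpeed
# config file defines a `scheduler` section.
accelerator = Accelerator()

model = torch.nn.Linear(8, 2)  # stand-in model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# Placeholder scheduler: DeepSpeed builds the real lr_scheduler from the
# config file's `scheduler` section during prepare().
lr_scheduler = DummyScheduler(optimizer, total_num_steps=1_000, warmup_num_steps=100)

# The regression: the prior PR stopped passing lr_scheduler to prepare(), so
# the dummy was never swapped for the real DeepSpeed scheduler. This PR
# prepares it again when the config file owns the scheduler.
model, optimizer, lr_scheduler = accelerator.prepare(model, optimizer, lr_scheduler)
```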
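For Issue 2, a paraphrased sketch of the conditional (not the exact Trainer code; `use_apex` stands in for the Trainer's own apex check):

```python
import torch
from accelerate import Accelerator

accelerator = Accelerator()
model = torch.nn.Linear(8, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
use_apex = False  # stands in for the Trainer's check, e.g. half-precision backend set to apex

if use_apex:
    # apex's amp.initialize() attaches `_amp_stash` to the raw optimizer;
    # wrapping it in accelerate's AcceleratedOptimizer hides that attribute,
    # and apex then raises AttributeError on the next step. So with apex we
    # prepare only the model.
    model = accelerator.prepare(model)
else:
    model, optimizer = accelerator.prepare(model, optimizer)
```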
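And for Issue 3, the gist of the FSDP checkpoint fix; `output_dir`, the file name, and the state dict are illustrative:

```python
import os
import torch

output_dir = "checkpoint-500"  # illustrative checkpoint path
state_dict = {"weight": torch.zeros(2, 8)}  # stand-in for the gathered FSDP state dict

# The fix: create the checkpoint directory before writing into it, rather than
# assuming an earlier step already created it.
os.makedirs(output_dir, exist_ok=True)
torch.save(state_dict, os.path.join(output_dir, "pytorch_model.bin"))
```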

HuggingFaceDocBuilderDev commented on Jun 9, 2023

The documentation is not available anymore as the PR was closed or merged.

pacman100 changed the title from "fix the deepspeed test failures" to "fix bugs with trainer" on Jun 9, 2023
sgugger (Collaborator) left a comment


Thanks for the fixes!

src/transformers/trainer.py: review suggestion (outdated, resolved)
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
pacman100 merged commit f2b9183 into main on Jun 9, 2023
22 checks passed
pacman100 deleted the smangrul/conditional-prepare-lr-scheduler branch on June 9, 2023 at 12:24
sgugger added a commit that referenced this pull request Jun 9, 2023
* fix the deepspeed test failures

* apex fix

* FSDP save ckpt fix

* Update src/transformers/trainer.py

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>

---------

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
novice03 pushed a commit to novice03/transformers that referenced this pull request on Jun 23, 2023 (same commit message as above)
zhanwenchen added a commit to zhanwenchen/transformers that referenced this pull request Nov 1, 2023
Successfully merging this pull request may close these issues:

seems to be a bug related to saving model (#24130)