
DummyScheduler errors #233

Closed
Young1993 opened this issue Apr 5, 2023 · 4 comments


Young1993 commented Apr 5, 2023

The scheduler is defined per the DeepSpeed config:

# DummyScheduler is a placeholder that DeepSpeed replaces with its own scheduler
from accelerate.utils import DummyScheduler

scheduler = DummyScheduler(
    optimizer, warmup_num_steps=config["warmup_steps"],
)

model, optimizer, train_dataloader, val_dataloader, scheduler = accelerator.prepare(
    model, optimizer, train_dataloader, val_dataloader, scheduler
)

# set up saving of training state in case of preemption
accelerator.register_for_checkpointing(scheduler)

When I run train.py in PyCharm, accelerator.register_for_checkpointing(scheduler) raises the following error:

Traceback (most recent call last):
File "/home/wh/.pycharm_helpers/pydev/_pydevd_bundle/pydevd_exec2.py", line 3, in Exec
exec(exp, global_vars, local_vars)
File "", line 1, in
File "/home/zju/anaconda3/envs/picard/lib/python3.8/site-packages/accelerate/accelerator.py", line 2563, in register_for_checkpointing
raise ValueError(err)
ValueError: All objects must include a state_dict and load_state_dict function to be stored. The following inputs are invalid:
- Item at index 0, DummyScheduler
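
The error comes from a duck-typing check: register_for_checkpointing only accepts objects exposing both state_dict and load_state_dict, and DummyScheduler does not implement them. A minimal sketch of that check in plain Python (paraphrased from the error message, not the exact accelerate source):

```python
# Sketch of the validation that produces the ValueError above.
def validate_checkpointable(*objects):
    invalid = []
    for index, obj in enumerate(objects):
        # Both methods must be present for the object to be storable.
        if not (hasattr(obj, "state_dict") and hasattr(obj, "load_state_dict")):
            invalid.append(f"- Item at index {index}, {obj.__class__.__name__}")
    if invalid:
        raise ValueError(
            "All objects must include a state_dict and load_state_dict "
            "function to be stored. The following inputs are invalid:\n"
            + "\n".join(invalid)
        )

class DummyScheduler:
    """Stand-in mirroring the issue: no state_dict/load_state_dict."""
    pass

# validate_checkpointable(DummyScheduler())  # raises the ValueError shown above
```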

How can I fix it?

zanussbaum (Collaborator) commented

For a quick fix, you can replace it with a real scheduler. The original code was meant to interface with DeepSpeed, but it doesn't currently work without DeepSpeed. There will be some updates in the coming days to fix this.
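
One way to do what this suggests is to swap DummyScheduler for a standard PyTorch scheduler that implements the same linear warmup. A hedged sketch, assuming config["warmup_steps"] and an existing optimizer (the toy SGD optimizer here is illustrative only):

```python
import torch

def linear_warmup(warmup_steps):
    """LR multiplier: ramps from 0 to 1 over warmup_steps, then holds at 1."""
    def lr_lambda(step):
        if step < warmup_steps:
            return float(step) / float(max(1, warmup_steps))
        return 1.0
    return lr_lambda

# Illustrative optimizer; in the issue's code this already exists.
optimizer = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=0.1)

# LambdaLR implements state_dict/load_state_dict, so
# accelerator.register_for_checkpointing(scheduler) no longer raises.
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, linear_warmup(100))
```

LambdaLR is a drop-in here because accelerator.prepare passes real PyTorch schedulers through and they satisfy the checkpointing interface.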


dxenos commented Apr 7, 2023

I have the same problem :( @zanussbaum what does it mean to replace it with a real scheduler?

Young1993 (Author) commented

Thanks


sepilqi commented Feb 26, 2024

I have the same problem :( @zanussbaum what does it mean to replace it with a real scheduler?

In the DeepSpeed case:

  • set up your environment with accelerate config first, and select DeepSpeed
  • pass --deepspeed in your args

This will initialize a real scheduler.
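
The two steps above look roughly like this on the command line (the --deepspeed flag is the training script's own argument as described in this comment, not an accelerate flag):

```shell
# Configure Accelerate interactively; choose DeepSpeed when prompted.
accelerate config

# Launch training through Accelerate so prepare() swaps DummyScheduler
# for a real DeepSpeed scheduler.
accelerate launch train.py --deepspeed
```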
