[REQUEST] How to merge DeepSpeed multi-GPU optimizer files into one PyTorch optim.pt file? #5460
Comments
Hi @ldh127, can you please be more specific? Share more about what you are trying to do and what errors you are hitting.
Yes, but I think that code is for ds_to_universal model parameter conversion, not for merging multiple optimizer files into one file. Can it merge the DeepSpeed multi-GPU optimizer files into a single PyTorch optim.pt file?
Quoted from the original message:
@ldh127, does the following help?
https://deepspeed.readthedocs.io/en/latest/model-checkpointing.html#zero-checkpoint-fp32-weights-recovery
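For reference, a minimal sketch of what that docs page describes, assuming a ZeRO checkpoint directory; the path below is a placeholder:

```python
# Minimal sketch of the documented fp32 weights recovery. The checkpoint
# path is a placeholder; point it at the directory DeepSpeed saved into.
from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint

# Consolidates the per-rank ZeRO shards into a single fp32 state_dict of
# *model weights*; it does not return optimizer state.
state_dict = get_fp32_state_dict_from_zero_checkpoint("path/to/checkpoint_dir")
```

The standalone `zero_to_fp32.py` script that DeepSpeed drops into the checkpoint directory does the same consolidation from the command line.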
Yes, I use the transformers Trainer to call DeepSpeed, and it saves a DeepSpeed checkpoint that contains per-GPU model and optimizer files. I want a single optim.pt file for selecting SFT data; my code can only load one global optim.pt, but the DeepSpeed checkpoint produces multiple partial optimizer and model files. How can I merge the multiple optimizer files into one global file?
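For context on why there is no single optimizer file: under ZeRO stage 1/2 each rank saves its own shard, typically named like `*_optim_states.pt`, inside the checkpoint tag directory. A small sketch to inspect what the Trainer actually wrote; the directory name and glob pattern are assumptions, adjust them to your checkpoint layout:

```python
# Hedged sketch: list and inspect the per-rank optimizer shards that
# DeepSpeed (ZeRO stage 1/2) writes inside a checkpoint tag directory.
import glob
import torch

ckpt_dir = "output/checkpoint-500/global_step500"  # placeholder path
for path in sorted(glob.glob(f"{ckpt_dir}/*_optim_states.pt")):
    shard = torch.load(path, map_location="cpu")
    # Each shard holds only this rank's partition of the (flattened)
    # optimizer state, which is why no single file is the "global" one.
    print(path, list(shard.keys()))
```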
@ldh127, why do you say the link is related to ds_to_universal? Did you try it? Can you clarify how your scenario differs from the use case below? Thanks!
Yes, I tried that code, and in the end I do get a single .pth file, but please see the details below.
As the attached screenshot shows, the final file is named demo_state_dict.pth. Does it contain the optimizer parameters, and how can I get the optimizer state_dict from it? If it were the merged optimizer file, it seems I could fetch the single optimizer dict with something like state_dict["optim_state"], but there is no optim_state key in the dict, so I do not know what is wrong in my steps.
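That result is expected if demo_state_dict.pth came from zero_to_fp32.py: the script reconstructs only the consolidated fp32 model weights, so the resulting dict maps parameter names to weight tensors and carries no optimizer entry. A quick check, assuming the file name from the thread:

```python
import torch

# Inspect what zero_to_fp32.py produced: a flat mapping from parameter names
# to consolidated fp32 weight tensors. There are no Adam moments here, which
# is why indexing with a key such as "optim_state" fails.
sd = torch.load("demo_state_dict.pth", map_location="cpu")
for name, tensor in list(sd.items())[:5]:
    print(name, tuple(tensor.shape))
```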
I also read the code at https://github.com/microsoft/DeepSpeed/blob/4c15ad9f8d51a1950842c69bbbc9d93c73afbcfc/deepspeed/utils/zero_to_fp32.py, but I do not know what code I would need to change. Can you give me more detailed help? Thanks.
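As a direction rather than a solution: the per-rank `*_optim_states.pt` shards can at least be bundled into one file, though reconstructing per-parameter Adam moments would require reusing the same unpartitioning logic that zero_to_fp32.py applies to the fp32 weight groups. A hedged sketch, where the checkpoint path and the raw-bundle format are assumptions, not a DeepSpeed-provided feature:

```python
# Hedged sketch: bundle all per-rank optimizer shards into one .pt file.
# This does NOT produce a standard PyTorch optimizer state_dict; the
# per-parameter Adam moments are stored as flattened partitions, and truly
# merging them would need the partition-reconstruction logic from
# zero_to_fp32.py.
import glob
import torch

ckpt_dir = "output/checkpoint-500/global_step500"  # placeholder path
shard_paths = sorted(glob.glob(f"{ckpt_dir}/*_optim_states.pt"))
bundle = {f"rank_{i}": torch.load(p, map_location="cpu")
          for i, p in enumerate(shard_paths)}
torch.save(bundle, "optim_all_ranks.pt")  # one file holding every rank's shard
```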