
[REQUEST] How do I merge the DeepSpeed multi-GPU optimizer files into a single PyTorch optim.pt file? #5460

Open
ldh127 opened this issue Apr 24, 2024 · 8 comments
Labels
enhancement New feature or request

Comments

@ldh127

ldh127 commented Apr 24, 2024

No description provided.

@ldh127 ldh127 added the enhancement New feature or request label Apr 24, 2024
@loadams
Contributor

loadams commented Apr 24, 2024

Hi @ldh127 - can you please be more specific, share more about what you are trying to do and what errors you are hitting?

@tjruwase
Contributor

@ldh127
Author

ldh127 commented Apr 25, 2024 via email

@ldh127
Author

ldh127 commented Apr 25, 2024

> Hi @ldh127 - can you please be more specific, share more about what you are trying to do and what errors you are hitting?

Yes, I use the transformers Trainer to call DeepSpeed, and it saves a DeepSpeed checkpoint that contains per-GPU model and optimizer files. I want a single optim.pt file for selecting SFT data, but my code can only load one global optim.pt, while the DeepSpeed checkpoint contains multiple optimizer and model shards. How can I merge the multiple optimizer files into one global file?
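(Before any merging, it may help to see what each rank actually saved. A minimal sketch of inspecting the per-rank optimizer shards, assuming a typical ZeRO checkpoint layout; the folder path and the `*optim_states.pt` filename pattern are assumptions that depend on the DeepSpeed version and ZeRO stage.)

```python
import glob
import os
import torch

# Hypothetical path to the DeepSpeed checkpoint written by the transformers
# Trainer, e.g. output/checkpoint-500/global_step500 -- adjust to your run.
ckpt_dir = "output/checkpoint-500/global_step500"

# Each rank writes its own optimizer shard. "*optim_states.pt" is the typical
# ZeRO naming, but the exact pattern can differ across DeepSpeed versions
# and ZeRO stages.
shard_paths = sorted(glob.glob(os.path.join(ckpt_dir, "*optim_states.pt")))

for path in shard_paths:
    shard = torch.load(path, map_location="cpu")
    # Inspect what each rank actually stored before deciding how to merge.
    print(path, list(shard.keys()))
```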

@tjruwase
Contributor

> Yes, but I think this code is for the ds2universal model parameters, not for merging multiple optimizer files into one file. Can it merge the DeepSpeed multi-GPU optimizer files into one PyTorch optim.pt file?

@ldh127, why do you say the link is related to ds2universal? Did you try it? Can you clarify how your scenario is different from the use case below? Thanks!

[screenshot of the referenced use case]

@ldh127
Author

ldh127 commented Apr 27, 2024


Yes, I tried this code, and I do end up with a single .pth file, but here are the details.
[screenshot: my DeepSpeed checkpoint folder]
This is my DeepSpeed checkpoint folder; I used your code to read it, and it merges everything and saves a single file. This is the code I used:
[screenshot: the conversion code]
and the file I get looks like this:
[screenshot: the resulting file]
You can see that when I print the state_dict keys, they look like
base_model.model.model.layers.38.self_attn.q_proj.lora_A.default.weight
base_model.model.model.layers.38.self_attn.q_proj.lora_B.default.weight
base_model.model.model.layers.38.self_attn.k_proj.lora_A.default.weight
base_model.model.model.layers.38.self_attn.k_proj.lora_B.default.weight
but these seem to be model parameter names, not optimizer state entries?
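(If the merge was done with the zero_to_fp32 code path, seeing only parameter names is expected: that path consolidates the fp32 model weights across ranks and does not carry the optimizer moments. A minimal sketch of that behavior, with a placeholder checkpoint path:)

```python
from deepspeed.utils.zero_to_fp32 import get_fp32_state_dict_from_zero_checkpoint

# Consolidates the fp32 *model weights* across ranks; Adam moments and other
# optimizer state are not part of the returned dict, which is why the keys
# look like ordinary parameter names.
state_dict = get_fp32_state_dict_from_zero_checkpoint("output/checkpoint-500")
print(list(state_dict.keys())[:5])
```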

@ldh127
Author

ldh127 commented Apr 27, 2024


You can see in the picture above: if the final file named demo_state_dict.pth contains the optimizer parameters, how do I get the optimizer state_dict out of it? If it were the merged optimizer file, I would expect to access it with something like state_dict["optim_state"], but there is no such optim_state key in the dict, so I don't know which step of my procedure is wrong.
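(One way to confirm what the merged file actually contains is to load it and print its top-level keys; demo_state_dict.pth is the file mentioned above, and the path is a placeholder.)

```python
import torch

# "demo_state_dict.pth" is the file produced by the merge step above; adjust
# the path to your own output location.
sd = torch.load("demo_state_dict.pth", map_location="cpu")

print(type(sd))
# If every key is a parameter name, the file holds model weights only and
# there is no "optim_state"-style entry to read.
print(list(sd.keys())[:10])
```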

@ldh127
Author

ldh127 commented Apr 27, 2024


I also read the code at this URL: https://github.com/microsoft/DeepSpeed/blob/4c15ad9f8d51a1950842c69bbbc9d93c73afbcfc/deepspeed/utils/zero_to_fp32.py, but I do not know what code I would need to change. Can you give me more detailed help? Thanks.
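(For reference, a minimal sketch of the consolidation that zero_to_fp32.py performs, using the convert_zero_checkpoint_to_fp32_state_dict helper it defines; the paths are placeholders and the exact output argument varies between DeepSpeed versions. Note that this reconstructs model weights only, so on its own it does not produce a merged optim.pt.)

```python
from deepspeed.utils.zero_to_fp32 import convert_zero_checkpoint_to_fp32_state_dict

# Reconstructs a single consolidated fp32 *model* checkpoint from the ZeRO
# shards under the checkpoint folder. Optimizer moments are not included.
convert_zero_checkpoint_to_fp32_state_dict(
    "output/checkpoint-500",    # placeholder: folder containing global_step*/
    "output/pytorch_model.bin"  # placeholder: output (a directory in newer DeepSpeed versions)
)
```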
