[Feature] Support deepspeed for HF trainer #164

Merged
pppppM merged 1 commit into InternLM:main from LZHgrla:lzh/hf_ds
Oct 23, 2023
Conversation

LZHgrla (Contributor) commented Oct 16, 2023

Command

xtuner train $HF_CONFIG --deepspeed deepspeed_zero3

Note: This PR relies on huggingface/accelerate#2060 to suppress some errors!
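The `deepspeed_zero3` argument above refers to a ZeRO stage-3 preset shipped with xtuner; the PR does not show its contents, but a typical DeepSpeed ZeRO-3 config (using key names from DeepSpeed's documented JSON schema, with `"auto"` values resolved by the HF trainer integration) looks roughly like this sketch — the exact values in xtuner's preset may differ:

```json
{
  "zero_optimization": {
    "stage": 3,
    "overlap_comm": true,
    "stage3_gather_16bit_weights_on_model_save": true
  },
  "fp16": {
    "enabled": "auto"
  },
  "gradient_accumulation_steps": "auto",
  "train_micro_batch_size_per_gpu": "auto"
}
```

Stage 3 partitions optimizer states, gradients, and model parameters across GPUs, which is what makes full-parameter fine-tuning of large models feasible in FP16 on limited memory.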

LZHgrla (Contributor, Author) commented Oct 17, 2023

huggingface/accelerate#2060 has been merged!

This PR is ready to be reviewed and merged!

pppppM merged commit c224875 into InternLM:main on Oct 23, 2023
llkn-2 pushed a commit to llkn-2/xtuner that referenced this pull request on Jul 31, 2024


Linked issue

Successfully merging this pull request may close these issues:

How can I do full parameter fine-tuning the model with FP16
