
[Feature] Support Mixtral 8x7b #263

Merged: 5 commits from moe into main, Dec 11, 2023
Conversation

pppppM (Collaborator) commented Dec 11, 2023

Because Mixtral does not provide an official dialogue template, we use InternLM's dialogue template for its SFT fine-tuning.

# QLoRA (only needs a single A100-80G)
xtuner train mixtral_8x7b_qlora_oasst1_internlm_template_e3 --deepspeed deepspeed_zero2

# Full parameters (with slurm)
srun ${SRUN_ARGS} xtuner train mixtral_8x7b_full_oasst1_internlm_template_e3 --deepspeed deepspeed_zero3 --launcher slurm
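After training, the saved checkpoint is typically converted to HuggingFace format before use. A minimal sketch, assuming xtuner's standard `xtuner convert pth_to_hf` workflow; the checkpoint and output paths below are illustrative placeholders, not part of this PR:

# Convert the saved .pth checkpoint to a HuggingFace-format adapter (QLoRA case).
# Paths are placeholders; adjust to your work_dir and desired output location.
xtuner convert pth_to_hf \
    mixtral_8x7b_qlora_oasst1_internlm_template_e3 \
    ./work_dirs/mixtral_8x7b_qlora_oasst1_internlm_template_e3/epoch_3.pth \
    ./mixtral_8x7b_qlora_adapter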

LZHgrla (Collaborator) left a comment

Thanks! I left one comment about the requirements/runtime.txt

Others are LGTM!

requirements/runtime.txt: review comment (outdated, resolved)
@LZHgrla changed the title from "[Feature]Support Mixtral 8x7b" to "[Feature] Support Mixtral 8x7b" on Dec 11, 2023
@LZHgrla merged commit 13b3508 into main on Dec 11, 2023
3 checks passed
@pppppM pppppM deleted the moe branch December 12, 2023 02:33