
Question about the training parameter setting at stage1 and stage2 #37

Open
Lanxin1011 opened this issue Aug 9, 2023 · 3 comments
@Lanxin1011

Dear authors,
Thanks for the great work!~ I am trying to reproduce the shikra-7b model by training from vicuna-7b, following the descriptions in the Shikra paper, and I'm confused about some details of the parameter settings. Could you please help me with the following questions?

  1. Should I use vicuna-7b as the initial weights for training? If so, should I use the raw vicuna-7b, or should I replace its "config.json, generation_config.json, special_tokens_map.json, tokenizer_config.json" with those of shikra-7b? These json files differ between vicuna-7b and shikra-7b.

  2. Which config files should I use at stage1 and stage2? Is it shikra_pretrain_concat8_stage1.py for stage1 and shikra_pretrain_final19_stage2.py for stage2? Also, could you tell me what shikra_pretrain_concat3_stage0.py is for? Only stage1 and stage2 are introduced in the paper.

  3. Are the two stages trained separately, i.e., we train stage1 first, save the model, and then resume from that checkpoint to train stage2? And what is num_train_epochs for stage2? Does it use the same setting of 1.5 epochs as stage1?
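To make question 3 concrete, here is a rough sketch of the sequential two-stage flow I have in mind. The launcher script, flag names, and paths below are my assumptions, not a confirmed recipe from the repo:

```shell
# Hypothetical two-stage training flow (script name, flags, and paths
# are assumptions; the repo's actual interface may differ).

# Stage 1: initialize from the vicuna-7b weights.
accelerate launch mllm/pipeline/finetune.py \
    config/shikra_pretrain_concat8_stage1.py \
    --cfg-options model_args.model_name_or_path=/path/to/vicuna-7b \
    --output_dir ./ckpt/stage1

# Stage 2: resume from the stage-1 checkpoint.
accelerate launch mllm/pipeline/finetune.py \
    config/shikra_pretrain_final19_stage2.py \
    --cfg-options model_args.model_name_or_path=./ckpt/stage1 \
    --output_dir ./ckpt/stage2
```

Is this the intended usage, or is there a single combined entry point?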

    Really looking forward to your reply~

@Anymake

Anymake commented Sep 5, 2023

+1

@harrytea

harrytea commented Sep 6, 2023

+1

@GaoXiaoshan

+1
