Did WizardCoder generate data from GPT-4 or GPT-3.5? #1
Comments
Yeah, the paper shows the instructions are generated by GPT-4 or GPT-3.5. But how the response for a newly generated instruction is produced is not explicitly shown in the paper. I guess the response should also be generated by GPT-4.
Do you mean only the ###Response is generated by GPT-4? A third choice is generating programming tasks with GPT-3.5 and then passing each task to GPT-4 to generate the solution. FYI: as the just-leaked GPT-4 details show (https://threadreaderapp.com/thread/1678545170508267522.html), GPT-4 is trained for 2 epochs on text-based data and 4 epochs on code-based data!
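A minimal sketch of that "third choice" two-stage pipeline, assuming the OpenAI Python client (v1.x). The model names, the evolving prompt (paraphrased in the spirit of Evol-Instruct), and the helper names are illustrative, not taken from the WizardCoder paper or this repo:

```python
# Hypothetical two-stage pipeline: GPT-3.5 evolves a harder programming task,
# then GPT-4 writes the reference solution (the ###Response).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative evolving prompt, loosely modeled on the Evol-Instruct idea.
EVOLVE_PROMPT = (
    "Please increase the difficulty of the given programming test question a bit. "
    "You can, for example, add new constraints or require a specific time/space complexity.\n\n"
    "#Given Question#:\n{instruction}"
)

def evolve_instruction(instruction: str) -> str:
    """Stage 1: ask GPT-3.5 to produce a harder variant of the seed task."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": EVOLVE_PROMPT.format(instruction=instruction)}],
        temperature=1.0,
    )
    return resp.choices[0].message.content

def solve_instruction(instruction: str) -> str:
    """Stage 2: ask GPT-4 to generate the solution for the evolved task."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": instruction}],
        temperature=0.2,
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    seed = "Write a function that returns the n-th Fibonacci number."
    task = evolve_instruction(seed)
    solution = solve_instruction(task)
    print({"instruction": task, "output": solution})
```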
@Symbolk my objective is just batch inferencing. In that case, what would my prompt template be? An example would help if you can explain; thanks in advance.
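A minimal batch-inference sketch, assuming the Alpaca-style "### Instruction / ### Response" template that the WizardCoder release uses at inference time; the checkpoint name and generation settings here are assumptions for illustration:

```python
# Batch inference with a fixed prompt template (sketch, not this repo's code).
from transformers import AutoModelForCausalLM, AutoTokenizer

PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:"
)

model_name = "WizardLM/WizardCoder-15B-V1.0"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

instructions = [
    "Write a Python function that checks whether a string is a palindrome.",
    "Implement binary search over a sorted list of integers.",
]
prompts = [PROMPT_TEMPLATE.format(instruction=i) for i in instructions]

# Left padding so the generated tokens follow the end of each prompt in the batch.
tokenizer.padding_side = "left"
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

inputs = tokenizer(prompts, return_tensors="pt", padding=True).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

for out in outputs:
    text = tokenizer.decode(out, skip_special_tokens=True)
    # Keep only the part generated after the "### Response:" marker.
    print(text.split("### Response:")[-1].strip())
```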
Hi, did you successfully reproduce the training data?
Thanks for this repo! I have also been reading the paper recently, but I did not notice which LLM WizardCoder used to generate their Evol-Instruct data. According to your implementation, gpt4_azure is used. Is that the same as WizardCoder (considering that Microsoft insiders could use the API for free since early this year), or did you just guess they used GPT-4?
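For reference, a gpt4_azure-style backend would typically call an Azure OpenAI deployment rather than the public OpenAI endpoint. This is only a sketch of what such a call might look like with the openai v1.x client; the endpoint, API version, and deployment name are placeholders, not values from this repo:

```python
# Placeholder Azure OpenAI call; not the repo's actual gpt4_azure configuration.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-07-01-preview",  # placeholder API version
)

resp = client.chat.completions.create(
    model="gpt-4",  # on Azure this is the *deployment* name, not the model family
    messages=[{"role": "user", "content": "Write a quicksort in Python."}],
)
print(resp.choices[0].message.content)
```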