Text Generation Parameters #12

Open
AvantiB opened this issue Apr 10, 2024 · 1 comment

Comments

AvantiB commented Apr 10, 2024

Hi,

Thanks for releasing this dataset. I just wanted to know what parameters were used for generation. Since you mentioned using HuggingFace for generation for all open-source models, could you please list the generation parameters (https://huggingface.co/docs/transformers/en/main_classes/text_generation#transformers.GenerationConfig)? Additionally, the OpenAI API also requires parameter settings for temperature/top-p. Did you specify any, or use the default settings?
Thanks

Best,
Avanti

yafuly (Owner) commented Apr 11, 2024

Hi Avanti,

Thanks for following our work.

For API-based models, we use the default decoding parameters. For the other LLMs (HuggingFace), we adopt the following parameters:

    --max_new_tokens 1024 \
    --top_p 1.0 \
    --top_k 0 \
    --temperature 0.7 \
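
For reference, here is a minimal sketch of how these flags might map onto a HuggingFace `generate()` call. The model name, prompt, and `do_sample=True` are illustrative assumptions, not details confirmed in this thread:

    # Illustrative sketch only; model name, prompt, and do_sample=True are assumptions.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "meta-llama/Llama-2-7b-chat-hf"  # placeholder model, not confirmed by the authors
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer("Write a short story about a lighthouse.", return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=1024,  # --max_new_tokens 1024
        top_p=1.0,            # --top_p 1.0 (nucleus filtering effectively disabled)
        top_k=0,              # --top_k 0 (no top-k filtering)
        temperature=0.7,      # --temperature 0.7
        do_sample=True,       # assumed: sampling must be enabled for temperature/top_p to take effect
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

For the API-based models, leaving temperature/top-p unset in the request corresponds to the provider defaults mentioned above.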
