

Default temperature #43

Closed
woctezuma opened this issue May 11, 2019 · 2 comments

Comments

@woctezuma
Contributor

woctezuma commented May 11, 2019

Is there a reason why the default temperature is 0.7?

I have not checked thoroughly yet, but this page shows different temperatures for "detectability baselines" (shown below). It seems that the chosen temperature is closer to:

  • 0.9 for the 117M and 345M models,
  • 0.75 for the 762M and 1542M models.
Model    Temperature 1    Top-K 40
117M     88.29%           96.79%
345M     88.94%           95.22%
762M     77.16%           94.43%
1542M    74.31%           92.69%
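For reference, a minimal sketch of what the temperature parameter does during sampling: the logits are divided by the temperature before the softmax, so values below 1.0 (like the 0.7 default) sharpen the distribution toward the most likely tokens, while higher values flatten it. This is a generic illustration, not the actual gpt-2-simple sampling code; the function names are made up for this example.

```python
import numpy as np

def temperature_probs(logits, temperature=0.7):
    # Scale logits by 1/temperature, then apply softmax.
    # Lower temperature -> sharper (more conservative) distribution;
    # temperature 1.0 leaves the model's distribution unchanged.
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

def sample_token(logits, temperature=0.7, rng=None):
    # Draw one token index from the temperature-adjusted distribution.
    rng = rng or np.random.default_rng()
    probs = temperature_probs(logits, temperature)
    return int(rng.choice(len(probs), p=probs))
```

With logits `[2.0, 1.0, 0.0]`, lowering the temperature from 1.0 to 0.2 shifts almost all probability mass onto the top token, which is why a lower default like 0.7 tends to produce "safer" output from a fine-tuned model.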
@minimaxir
Owner

No technical reason; in my testing, 0.7 was the safest for a fine-tuned model.

Keep in mind that fine-tuned GPT-2 is different from the raw GPT-2.

@woctezuma
Contributor Author

For info, I read the table wrong. :'(

Those numbers are accuracies, not temperatures.
