Fix warning in GPT-Neo example and enable use_cache #210

Merged
lekurile merged 1 commit into master from lekurile/fix_warning_gpt_neo on Oct 27, 2022
Conversation

@lekurile (Contributor)

This PR fixes the following warning in the `test-gpt-neo.py` text-generation example by explicitly specifying the `max_length` argument:

UserWarning: Neither `max_length` nor `max_new_tokens` have been set, `max_length` will default to 50 (`self.config.max_length`). Controlling `max_length` via the config is deprecated and `max_length` will be removed from the config in v5 of Transformers -- we recommend using `max_new_tokens` to control the maximum length of the generation. 

It also sets `use_cache=True` when calling the generator, for consistency with the other text-generation examples.
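A minimal sketch of what the fixed call looks like, assuming the example drives generation through a Hugging Face `text-generation` pipeline (the model id and prompt here are illustrative, not taken from the PR diff):

```python
from transformers import pipeline

# Illustrative model id; any causal LM from the GPT-Neo family works the same way.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125m")

# Passing max_length explicitly silences the deprecation warning, and
# use_cache=True reuses past key/value states across decoding steps.
output = generator("DeepSpeed is", max_length=50, use_cache=True, do_sample=False)
print(output[0]["generated_text"])
```

Generation kwargs such as `max_length` and `use_cache` are forwarded by the pipeline to the underlying `model.generate` call, so no other changes to the example are needed.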

@lekurile lekurile merged commit 6e30c15 into master Oct 27, 2022
hwchen2017 pushed a commit that referenced this pull request Jun 8, 2025
Co-authored-by: Lev Kurilenko <lekurile@microsoft.com>
