
[flax] unfreeze initial cache in gpt models #14535

Merged
1 commit merged into huggingface:master on Nov 26, 2021

Conversation

@patil-suraj (Contributor) commented on Nov 26, 2021

What does this PR do?

Fix flax generate for GPT models when the initial seq_len is 1.

The issue is that the init_cache method of Flax GPT2 returns the cache as a FrozenDict, while the model's forward pass returns the cache as a plain dict.

It works with seq_len > 1 because, in that case, we first call body_fn once outside the while loop; that call runs the forward pass, which returns the cache as a dict.

We then iterate over body_fn with lax.while_loop, and this works because the cache carried into the loop now has the same type structure as the one returned by the body.

It breaks for seq_len = 1 because then body_fn is called directly inside lax.while_loop: the initial cache is a FrozenDict while the forward pass inside body_fn returns a dict, so the carry's type structure does not match the body's output and lax.while_loop raises this error:

body_fun output and input must have same type structure, got PyTreeDe...
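As a rough, self-contained sketch (not the actual transformers generation code, and with an illustrative cache key name), this is the same mismatch reproduced with a toy carry, plus the fix of unfreezing the initial cache:

```python
# Minimal sketch of the failure mode: a stand-in "cache" carried through
# lax.while_loop, with a body that returns the cache as a plain dict.
import jax
import jax.numpy as jnp
from flax.core.frozen_dict import FrozenDict, unfreeze

def cond_fn(carry):
    step, _ = carry
    return step < 3

def body_fn(carry):
    # Stands in for the model forward pass, which returns the cache as a dict.
    step, cache = carry
    return step + 1, {"cached_key": cache["cached_key"] + 1}

frozen_cache = FrozenDict({"cached_key": jnp.zeros(1)})  # what init_cache returns

# Passing the FrozenDict directly fails: FrozenDict and dict are different
# pytree node types, so the initial carry and the body output have different
# type structures and lax.while_loop raises the error quoted above.
#   jax.lax.while_loop(cond_fn, body_fn, (0, frozen_cache))

# Unfreezing the initial cache (the fix in this PR) makes both sides plain dicts:
step, cache = jax.lax.while_loop(cond_fn, body_fn, (0, unfreeze(frozen_cache)))
```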

cc @Narsil

@patil-suraj changed the title from "unfreeze initial cache in gpt models" to "[flax] unfreeze initial cache in gpt models" on Nov 26, 2021
@Narsil (Contributor) left a comment


LGTM,

Thanks for looking into it!

@patil-suraj merged commit 69511cd into huggingface:master on Nov 26, 2021
@patil-suraj deleted the fix-flax-generate-gpt branch on November 26, 2021 at 12:51